00:00:00.001 Started by upstream project "autotest-per-patch" build number 130487 00:00:00.001 originally caused by: 00:00:00.002 Started by user sys_sgci 00:00:00.113 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/nvmf-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.114 The recommended git tool is: git 00:00:00.114 using credential 00000000-0000-0000-0000-000000000002 00:00:00.116 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/nvmf-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.190 Fetching changes from the remote Git repository 00:00:00.193 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.252 Using shallow fetch with depth 1 00:00:00.252 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.252 > git --version # timeout=10 00:00:00.301 > git --version # 'git version 2.39.2' 00:00:00.301 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.324 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.324 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:07.910 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:07.923 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:07.937 Checking out Revision 7510e71a2b3ec6fca98e4ec196065590f900d444 (FETCH_HEAD) 00:00:07.937 > git config core.sparsecheckout # timeout=10 00:00:07.948 > git read-tree -mu HEAD # timeout=10 00:00:07.964 > git checkout -f 7510e71a2b3ec6fca98e4ec196065590f900d444 # timeout=5 00:00:07.985 Commit message: "kid: add issue 3541" 00:00:07.985 > git rev-list --no-walk 7510e71a2b3ec6fca98e4ec196065590f900d444 # timeout=10 00:00:08.069 [Pipeline] Start of Pipeline 00:00:08.084 [Pipeline] library 00:00:08.086 Loading library shm_lib@master 00:00:08.893 Library shm_lib@master is cached. Copying from home. 00:00:08.934 [Pipeline] node 00:00:08.981 Running on WFP31 in /var/jenkins/workspace/nvmf-phy-autotest 00:00:08.984 [Pipeline] { 00:00:08.993 [Pipeline] catchError 00:00:08.994 [Pipeline] { 00:00:09.002 [Pipeline] wrap 00:00:09.008 [Pipeline] { 00:00:09.015 [Pipeline] stage 00:00:09.016 [Pipeline] { (Prologue) 00:00:09.228 [Pipeline] sh 00:00:09.510 + logger -p user.info -t JENKINS-CI 00:00:09.528 [Pipeline] echo 00:00:09.530 Node: WFP31 00:00:09.537 [Pipeline] sh 00:00:09.838 [Pipeline] setCustomBuildProperty 00:00:09.849 [Pipeline] echo 00:00:09.850 Cleanup processes 00:00:09.855 [Pipeline] sh 00:00:10.140 + sudo pgrep -af /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:00:10.140 1565818 sudo pgrep -af /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:00:10.153 [Pipeline] sh 00:00:10.440 ++ sudo pgrep -af /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:00:10.440 ++ grep -v 'sudo pgrep' 00:00:10.440 ++ awk '{print $1}' 00:00:10.440 + sudo kill -9 00:00:10.440 + true 00:00:10.453 [Pipeline] cleanWs 00:00:10.463 [WS-CLEANUP] Deleting project workspace... 00:00:10.463 [WS-CLEANUP] Deferred wipeout is used... 
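[Editor's note] The "Cleanup processes" step traced above looks for SPDK processes left over in the workspace from a previous run and kills them; in this build nothing was found, so `kill -9` received an empty PID list and the trailing `true` absorbed the resulting error. A minimal sketch of the same idiom, with the workspace path taken from the log (the exact wrapping used by the shared pipeline library is not shown, so treat this as an approximation):

    # kill any leftover SPDK processes from a previous build, tolerating "nothing to kill"
    WORKSPACE=/var/jenkins/workspace/nvmf-phy-autotest
    sudo kill -9 $(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}') || true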
00:00:10.469 [WS-CLEANUP] done 00:00:10.473 [Pipeline] setCustomBuildProperty 00:00:10.487 [Pipeline] sh 00:00:10.771 + sudo git config --global --replace-all safe.directory '*' 00:00:10.862 [Pipeline] httpRequest 00:00:12.048 [Pipeline] echo 00:00:12.050 Sorcerer 10.211.164.101 is alive 00:00:12.061 [Pipeline] retry 00:00:12.063 [Pipeline] { 00:00:12.079 [Pipeline] httpRequest 00:00:12.084 HttpMethod: GET 00:00:12.084 URL: http://10.211.164.101/packages/jbp_7510e71a2b3ec6fca98e4ec196065590f900d444.tar.gz 00:00:12.084 Sending request to url: http://10.211.164.101/packages/jbp_7510e71a2b3ec6fca98e4ec196065590f900d444.tar.gz 00:00:12.088 Response Code: HTTP/1.1 200 OK 00:00:12.088 Success: Status code 200 is in the accepted range: 200,404 00:00:12.088 Saving response body to /var/jenkins/workspace/nvmf-phy-autotest/jbp_7510e71a2b3ec6fca98e4ec196065590f900d444.tar.gz 00:00:12.269 [Pipeline] } 00:00:12.284 [Pipeline] // retry 00:00:12.293 [Pipeline] sh 00:00:12.582 + tar --no-same-owner -xf jbp_7510e71a2b3ec6fca98e4ec196065590f900d444.tar.gz 00:00:12.597 [Pipeline] httpRequest 00:00:12.966 [Pipeline] echo 00:00:12.968 Sorcerer 10.211.164.101 is alive 00:00:12.978 [Pipeline] retry 00:00:12.980 [Pipeline] { 00:00:12.995 [Pipeline] httpRequest 00:00:12.999 HttpMethod: GET 00:00:13.000 URL: http://10.211.164.101/packages/spdk_71dc0c1e9d880d7a3a970b961426bccd54c6c096.tar.gz 00:00:13.001 Sending request to url: http://10.211.164.101/packages/spdk_71dc0c1e9d880d7a3a970b961426bccd54c6c096.tar.gz 00:00:13.003 Response Code: HTTP/1.1 200 OK 00:00:13.003 Success: Status code 200 is in the accepted range: 200,404 00:00:13.004 Saving response body to /var/jenkins/workspace/nvmf-phy-autotest/spdk_71dc0c1e9d880d7a3a970b961426bccd54c6c096.tar.gz 00:00:27.629 [Pipeline] } 00:00:27.647 [Pipeline] // retry 00:00:27.656 [Pipeline] sh 00:00:27.942 + tar --no-same-owner -xf spdk_71dc0c1e9d880d7a3a970b961426bccd54c6c096.tar.gz 00:00:30.495 [Pipeline] sh 00:00:30.785 + git -C spdk log --oneline -n5 00:00:30.785 71dc0c1e9 test/nvmf: Solve ambiguity around $NVMF_SECOND_TARGET_IP 00:00:30.785 5495ea97a test/nvmf: Don't pin nvmf_bdevperf and nvmf_target_disconnect to phy 00:00:30.785 41a395c47 test/nvmf: Remove all transport conditions from the test suites 00:00:30.785 0d645b00a test/nvmf: Drop $RDMA_IP_LIST 00:00:30.785 f09fa45e8 test/nvmf: Drop $NVMF_INITIATOR_IP in favor of $NVMF_FIRST_INITIATOR_IP 00:00:30.796 [Pipeline] } 00:00:30.811 [Pipeline] // stage 00:00:30.820 [Pipeline] stage 00:00:30.822 [Pipeline] { (Prepare) 00:00:30.839 [Pipeline] writeFile 00:00:30.855 [Pipeline] sh 00:00:31.138 + logger -p user.info -t JENKINS-CI 00:00:31.151 [Pipeline] sh 00:00:31.435 + logger -p user.info -t JENKINS-CI 00:00:31.448 [Pipeline] sh 00:00:31.732 + cat autorun-spdk.conf 00:00:31.732 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:31.732 SPDK_TEST_NVMF=1 00:00:31.732 SPDK_TEST_NVME_CLI=1 00:00:31.732 SPDK_TEST_NVMF_NICS=mlx5 00:00:31.732 SPDK_RUN_UBSAN=1 00:00:31.732 NET_TYPE=phy 00:00:31.740 RUN_NIGHTLY=0 00:00:31.744 [Pipeline] readFile 00:00:31.769 [Pipeline] withEnv 00:00:31.771 [Pipeline] { 00:00:31.785 [Pipeline] sh 00:00:32.071 + set -ex 00:00:32.071 + [[ -f /var/jenkins/workspace/nvmf-phy-autotest/autorun-spdk.conf ]] 00:00:32.071 + source /var/jenkins/workspace/nvmf-phy-autotest/autorun-spdk.conf 00:00:32.071 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:32.071 ++ SPDK_TEST_NVMF=1 00:00:32.071 ++ SPDK_TEST_NVME_CLI=1 00:00:32.071 ++ SPDK_TEST_NVMF_NICS=mlx5 00:00:32.071 ++ SPDK_RUN_UBSAN=1 00:00:32.071 ++ NET_TYPE=phy 00:00:32.071 ++ 
RUN_NIGHTLY=0 00:00:32.071 + case $SPDK_TEST_NVMF_NICS in 00:00:32.071 + DRIVERS=mlx5_ib 00:00:32.071 + [[ -n mlx5_ib ]] 00:00:32.071 + sudo rmmod mlx4_ib mlx5_ib irdma i40iw iw_cxgb4 00:00:32.071 rmmod: ERROR: Module mlx4_ib is not currently loaded 00:00:38.645 rmmod: ERROR: Module irdma is not currently loaded 00:00:38.645 rmmod: ERROR: Module i40iw is not currently loaded 00:00:38.645 rmmod: ERROR: Module iw_cxgb4 is not currently loaded 00:00:38.645 + true 00:00:38.645 + for D in $DRIVERS 00:00:38.645 + sudo modprobe mlx5_ib 00:00:38.645 + exit 0 00:00:38.770 [Pipeline] } 00:00:38.786 [Pipeline] // withEnv 00:00:38.791 [Pipeline] } 00:00:38.804 [Pipeline] // stage 00:00:38.813 [Pipeline] catchError 00:00:38.815 [Pipeline] { 00:00:38.828 [Pipeline] timeout 00:00:38.828 Timeout set to expire in 1 hr 0 min 00:00:38.829 [Pipeline] { 00:00:38.842 [Pipeline] stage 00:00:38.844 [Pipeline] { (Tests) 00:00:38.858 [Pipeline] sh 00:00:39.146 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/nvmf-phy-autotest 00:00:39.146 ++ readlink -f /var/jenkins/workspace/nvmf-phy-autotest 00:00:39.146 + DIR_ROOT=/var/jenkins/workspace/nvmf-phy-autotest 00:00:39.146 + [[ -n /var/jenkins/workspace/nvmf-phy-autotest ]] 00:00:39.146 + DIR_SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk 00:00:39.146 + DIR_OUTPUT=/var/jenkins/workspace/nvmf-phy-autotest/output 00:00:39.146 + [[ -d /var/jenkins/workspace/nvmf-phy-autotest/spdk ]] 00:00:39.146 + [[ ! -d /var/jenkins/workspace/nvmf-phy-autotest/output ]] 00:00:39.146 + mkdir -p /var/jenkins/workspace/nvmf-phy-autotest/output 00:00:39.146 + [[ -d /var/jenkins/workspace/nvmf-phy-autotest/output ]] 00:00:39.146 + [[ nvmf-phy-autotest == pkgdep-* ]] 00:00:39.146 + cd /var/jenkins/workspace/nvmf-phy-autotest 00:00:39.146 + source /etc/os-release 00:00:39.146 ++ NAME='Fedora Linux' 00:00:39.146 ++ VERSION='39 (Cloud Edition)' 00:00:39.146 ++ ID=fedora 00:00:39.146 ++ VERSION_ID=39 00:00:39.146 ++ VERSION_CODENAME= 00:00:39.146 ++ PLATFORM_ID=platform:f39 00:00:39.146 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:00:39.146 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:39.146 ++ LOGO=fedora-logo-icon 00:00:39.146 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:00:39.146 ++ HOME_URL=https://fedoraproject.org/ 00:00:39.146 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:00:39.146 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:39.146 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:39.146 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:39.146 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:00:39.146 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:39.146 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:00:39.146 ++ SUPPORT_END=2024-11-12 00:00:39.146 ++ VARIANT='Cloud Edition' 00:00:39.146 ++ VARIANT_ID=cloud 00:00:39.146 + uname -a 00:00:39.146 Linux spdk-wfp-31 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:00:39.146 + sudo /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh status 00:00:42.505 Hugepages 00:00:42.505 node hugesize free / total 00:00:42.505 node0 1048576kB 0 / 0 00:00:42.505 node0 2048kB 0 / 0 00:00:42.505 node1 1048576kB 0 / 0 00:00:42.505 node1 2048kB 0 / 0 00:00:42.505 00:00:42.505 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:42.505 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:42.505 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:42.505 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:42.505 I/OAT 0000:00:04.3 8086 2021 0 
ioatdma - - 00:00:42.505 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:42.505 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:42.505 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:42.505 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:42.505 NVMe 0000:5e:00.0 144d a80a 0 nvme nvme0 nvme0n1 00:00:42.505 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:42.505 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:42.505 + rm -f /tmp/spdk-ld-path 00:00:42.505 + source autorun-spdk.conf 00:00:42.505 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:42.505 ++ SPDK_TEST_NVMF=1 00:00:42.505 ++ SPDK_TEST_NVME_CLI=1 00:00:42.505 ++ SPDK_TEST_NVMF_NICS=mlx5 00:00:42.505 ++ SPDK_RUN_UBSAN=1 00:00:42.505 ++ NET_TYPE=phy 00:00:42.505 ++ RUN_NIGHTLY=0 00:00:42.505 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:42.505 + [[ -n '' ]] 00:00:42.505 + sudo git config --global --add safe.directory /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:00:42.505 + for M in /var/spdk/build-*-manifest.txt 00:00:42.505 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:00:42.505 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/nvmf-phy-autotest/output/ 00:00:42.505 + for M in /var/spdk/build-*-manifest.txt 00:00:42.505 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:42.505 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/nvmf-phy-autotest/output/ 00:00:42.505 + for M in /var/spdk/build-*-manifest.txt 00:00:42.505 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:42.505 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/nvmf-phy-autotest/output/ 00:00:42.505 ++ uname 00:00:42.505 + [[ Linux == \L\i\n\u\x ]] 00:00:42.505 + sudo dmesg -T 00:00:42.505 + sudo dmesg --clear 00:00:42.505 + dmesg_pid=1566697 00:00:42.505 + [[ Fedora Linux == FreeBSD ]] 00:00:42.505 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:42.505 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:42.505 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:42.505 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:00:42.505 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:00:42.505 + [[ -x /usr/src/fio-static/fio ]] 00:00:42.505 + export FIO_BIN=/usr/src/fio-static/fio 00:00:42.505 + FIO_BIN=/usr/src/fio-static/fio 00:00:42.505 + sudo dmesg -Tw 00:00:42.505 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\n\v\m\f\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:42.505 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:42.505 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:42.505 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:42.505 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:42.505 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:42.505 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:42.505 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:42.505 + spdk/autorun.sh /var/jenkins/workspace/nvmf-phy-autotest/autorun-spdk.conf 00:00:42.505 Test configuration: 00:00:42.505 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:42.505 SPDK_TEST_NVMF=1 00:00:42.505 SPDK_TEST_NVME_CLI=1 00:00:42.505 SPDK_TEST_NVMF_NICS=mlx5 00:00:42.505 SPDK_RUN_UBSAN=1 00:00:42.505 NET_TYPE=phy 00:00:42.505 RUN_NIGHTLY=0 15:05:44 -- common/autotest_common.sh@1680 -- $ [[ n == y ]] 00:00:42.505 15:05:44 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:00:42.505 15:05:44 -- scripts/common.sh@15 -- $ shopt -s extglob 00:00:42.505 15:05:44 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:42.505 15:05:44 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:42.505 15:05:44 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:42.505 15:05:44 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:42.505 15:05:44 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:42.505 15:05:44 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:42.505 15:05:44 -- paths/export.sh@5 -- $ export PATH 00:00:42.505 15:05:44 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:42.505 15:05:44 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/nvmf-phy-autotest/spdk/../output 00:00:42.505 15:05:44 -- common/autobuild_common.sh@479 -- $ date +%s 00:00:42.505 15:05:44 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1727442344.XXXXXX 00:00:42.505 15:05:44 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1727442344.0PoFfe 00:00:42.505 15:05:44 -- common/autobuild_common.sh@481 -- $ 
[[ -n '' ]] 00:00:42.505 15:05:44 -- common/autobuild_common.sh@485 -- $ '[' -n '' ']' 00:00:42.505 15:05:44 -- common/autobuild_common.sh@488 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/' 00:00:42.505 15:05:44 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/nvmf-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:42.505 15:05:44 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/nvmf-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:42.505 15:05:44 -- common/autobuild_common.sh@495 -- $ get_config_params 00:00:42.505 15:05:44 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:00:42.505 15:05:44 -- common/autotest_common.sh@10 -- $ set +x 00:00:42.505 15:05:44 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk' 00:00:42.505 15:05:44 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:00:42.505 15:05:44 -- pm/common@17 -- $ local monitor 00:00:42.505 15:05:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:42.505 15:05:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:42.505 15:05:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:42.505 15:05:44 -- pm/common@21 -- $ date +%s 00:00:42.505 15:05:44 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:42.505 15:05:44 -- pm/common@21 -- $ date +%s 00:00:42.505 15:05:44 -- pm/common@25 -- $ sleep 1 00:00:42.505 15:05:44 -- pm/common@21 -- $ date +%s 00:00:42.505 15:05:44 -- pm/common@21 -- $ date +%s 00:00:42.505 15:05:44 -- pm/common@21 -- $ /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1727442344 00:00:42.505 15:05:44 -- pm/common@21 -- $ /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1727442344 00:00:42.505 15:05:44 -- pm/common@21 -- $ /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1727442344 00:00:42.505 15:05:44 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1727442344 00:00:42.505 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1727442344_collect-vmstat.pm.log 00:00:42.505 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1727442344_collect-cpu-load.pm.log 00:00:42.505 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1727442344_collect-cpu-temp.pm.log 00:00:42.505 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1727442344_collect-bmc-pm.bmc.pm.log 00:00:43.441 15:05:45 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:00:43.441 15:05:45 -- spdk/autobuild.sh@11 -- $ 
SPDK_TEST_AUTOBUILD= 00:00:43.441 15:05:45 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:43.441 15:05:45 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:00:43.441 15:05:45 -- spdk/autobuild.sh@16 -- $ date -u 00:00:43.441 Fri Sep 27 01:05:45 PM UTC 2024 00:00:43.441 15:05:45 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:43.441 v25.01-pre-24-g71dc0c1e9 00:00:43.441 15:05:45 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:43.441 15:05:45 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:43.441 15:05:45 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:43.441 15:05:45 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:00:43.441 15:05:45 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:00:43.441 15:05:45 -- common/autotest_common.sh@10 -- $ set +x 00:00:43.441 ************************************ 00:00:43.441 START TEST ubsan 00:00:43.441 ************************************ 00:00:43.441 15:05:45 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:00:43.441 using ubsan 00:00:43.441 00:00:43.441 real 0m0.001s 00:00:43.441 user 0m0.000s 00:00:43.441 sys 0m0.000s 00:00:43.441 15:05:45 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:00:43.441 15:05:45 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:00:43.441 ************************************ 00:00:43.441 END TEST ubsan 00:00:43.441 ************************************ 00:00:43.441 15:05:45 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:43.441 15:05:45 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:43.441 15:05:45 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:43.441 15:05:45 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:00:43.441 15:05:45 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:00:43.441 15:05:45 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:00:43.441 15:05:45 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:00:43.441 15:05:45 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:00:43.441 15:05:45 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/nvmf-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-shared 00:00:43.700 Using default SPDK env in /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk 00:00:43.700 Using default DPDK in /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build 00:00:43.958 Using 'verbs' RDMA provider 00:00:59.787 Configuring ISA-L (logfile: /var/jenkins/workspace/nvmf-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:12.006 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/nvmf-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:12.577 Creating mk/config.mk...done. 00:01:12.577 Creating mk/cc.flags.mk...done. 00:01:12.577 Type 'make' to build. 00:01:12.577 15:06:14 -- spdk/autobuild.sh@70 -- $ run_test make make -j72 00:01:12.577 15:06:14 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:01:12.577 15:06:14 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:12.577 15:06:14 -- common/autotest_common.sh@10 -- $ set +x 00:01:12.577 ************************************ 00:01:12.577 START TEST make 00:01:12.577 ************************************ 00:01:12.577 15:06:14 make -- common/autotest_common.sh@1125 -- $ make -j72 00:01:13.149 make[1]: Nothing to be done for 'all'. 
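[Editor's note] Everything up to this point is driven by the autorun-spdk.conf echoed twice above: spdk/autorun.sh sources it, and autobuild then configures SPDK with the flags printed under config_params before starting the parallel make. A rough sketch of reproducing the same build step outside Jenkins, with the flags and file contents copied from the log (running this on a machine without the Jenkins directory layout or the mlx5 NIC is an assumption and would need adjusting):

    # hypothetical manual reproduction of the build step recorded in this log
    cat > autorun-spdk.conf <<'EOF'
    SPDK_RUN_FUNCTIONAL_TEST=1
    SPDK_TEST_NVMF=1
    SPDK_TEST_NVME_CLI=1
    SPDK_TEST_NVMF_NICS=mlx5
    SPDK_RUN_UBSAN=1
    NET_TYPE=phy
    RUN_NIGHTLY=0
    EOF
    cd spdk
    ./configure --enable-debug --enable-werror --with-rdma --with-idxd \
        --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests \
        --enable-ubsan --enable-coverage --with-ublk --with-shared
    make -j72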
00:01:23.149 The Meson build system 00:01:23.149 Version: 1.5.0 00:01:23.149 Source dir: /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk 00:01:23.149 Build dir: /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build-tmp 00:01:23.149 Build type: native build 00:01:23.149 Program cat found: YES (/usr/bin/cat) 00:01:23.149 Project name: DPDK 00:01:23.149 Project version: 24.03.0 00:01:23.149 C compiler for the host machine: cc (gcc 13.3.1 "cc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:23.149 C linker for the host machine: cc ld.bfd 2.40-14 00:01:23.149 Host machine cpu family: x86_64 00:01:23.149 Host machine cpu: x86_64 00:01:23.149 Message: ## Building in Developer Mode ## 00:01:23.149 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:23.149 Program check-symbols.sh found: YES (/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:23.149 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:23.149 Program python3 found: YES (/usr/bin/python3) 00:01:23.149 Program cat found: YES (/usr/bin/cat) 00:01:23.149 Compiler for C supports arguments -march=native: YES 00:01:23.149 Checking for size of "void *" : 8 00:01:23.149 Checking for size of "void *" : 8 (cached) 00:01:23.149 Compiler for C supports link arguments -Wl,--undefined-version: YES 00:01:23.149 Library m found: YES 00:01:23.149 Library numa found: YES 00:01:23.149 Has header "numaif.h" : YES 00:01:23.149 Library fdt found: NO 00:01:23.149 Library execinfo found: NO 00:01:23.149 Has header "execinfo.h" : YES 00:01:23.149 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:23.149 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:23.149 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:23.149 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:23.149 Run-time dependency openssl found: YES 3.1.1 00:01:23.149 Run-time dependency libpcap found: YES 1.10.4 00:01:23.149 Has header "pcap.h" with dependency libpcap: YES 00:01:23.149 Compiler for C supports arguments -Wcast-qual: YES 00:01:23.149 Compiler for C supports arguments -Wdeprecated: YES 00:01:23.149 Compiler for C supports arguments -Wformat: YES 00:01:23.149 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:23.149 Compiler for C supports arguments -Wformat-security: NO 00:01:23.149 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:23.149 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:23.149 Compiler for C supports arguments -Wnested-externs: YES 00:01:23.149 Compiler for C supports arguments -Wold-style-definition: YES 00:01:23.149 Compiler for C supports arguments -Wpointer-arith: YES 00:01:23.149 Compiler for C supports arguments -Wsign-compare: YES 00:01:23.149 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:23.149 Compiler for C supports arguments -Wundef: YES 00:01:23.149 Compiler for C supports arguments -Wwrite-strings: YES 00:01:23.149 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:23.149 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:23.149 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:23.149 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:23.149 Program objdump found: YES (/usr/bin/objdump) 00:01:23.149 Compiler for C supports arguments -mavx512f: YES 00:01:23.149 Checking if "AVX512 checking" compiles: YES 00:01:23.149 Fetching 
value of define "__SSE4_2__" : 1 00:01:23.149 Fetching value of define "__AES__" : 1 00:01:23.149 Fetching value of define "__AVX__" : 1 00:01:23.149 Fetching value of define "__AVX2__" : 1 00:01:23.149 Fetching value of define "__AVX512BW__" : 1 00:01:23.149 Fetching value of define "__AVX512CD__" : 1 00:01:23.149 Fetching value of define "__AVX512DQ__" : 1 00:01:23.149 Fetching value of define "__AVX512F__" : 1 00:01:23.149 Fetching value of define "__AVX512VL__" : 1 00:01:23.149 Fetching value of define "__PCLMUL__" : 1 00:01:23.149 Fetching value of define "__RDRND__" : 1 00:01:23.149 Fetching value of define "__RDSEED__" : 1 00:01:23.149 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:23.149 Fetching value of define "__znver1__" : (undefined) 00:01:23.149 Fetching value of define "__znver2__" : (undefined) 00:01:23.149 Fetching value of define "__znver3__" : (undefined) 00:01:23.149 Fetching value of define "__znver4__" : (undefined) 00:01:23.149 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:23.149 Message: lib/log: Defining dependency "log" 00:01:23.149 Message: lib/kvargs: Defining dependency "kvargs" 00:01:23.149 Message: lib/telemetry: Defining dependency "telemetry" 00:01:23.149 Checking for function "getentropy" : NO 00:01:23.149 Message: lib/eal: Defining dependency "eal" 00:01:23.149 Message: lib/ring: Defining dependency "ring" 00:01:23.149 Message: lib/rcu: Defining dependency "rcu" 00:01:23.149 Message: lib/mempool: Defining dependency "mempool" 00:01:23.149 Message: lib/mbuf: Defining dependency "mbuf" 00:01:23.149 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:23.149 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:23.149 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:23.149 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:23.149 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:23.149 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:23.149 Compiler for C supports arguments -mpclmul: YES 00:01:23.149 Compiler for C supports arguments -maes: YES 00:01:23.149 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:23.149 Compiler for C supports arguments -mavx512bw: YES 00:01:23.149 Compiler for C supports arguments -mavx512dq: YES 00:01:23.149 Compiler for C supports arguments -mavx512vl: YES 00:01:23.149 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:23.149 Compiler for C supports arguments -mavx2: YES 00:01:23.149 Compiler for C supports arguments -mavx: YES 00:01:23.149 Message: lib/net: Defining dependency "net" 00:01:23.149 Message: lib/meter: Defining dependency "meter" 00:01:23.149 Message: lib/ethdev: Defining dependency "ethdev" 00:01:23.149 Message: lib/pci: Defining dependency "pci" 00:01:23.149 Message: lib/cmdline: Defining dependency "cmdline" 00:01:23.149 Message: lib/hash: Defining dependency "hash" 00:01:23.149 Message: lib/timer: Defining dependency "timer" 00:01:23.149 Message: lib/compressdev: Defining dependency "compressdev" 00:01:23.149 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:23.149 Message: lib/dmadev: Defining dependency "dmadev" 00:01:23.149 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:23.149 Message: lib/power: Defining dependency "power" 00:01:23.149 Message: lib/reorder: Defining dependency "reorder" 00:01:23.149 Message: lib/security: Defining dependency "security" 00:01:23.149 Has header "linux/userfaultfd.h" : YES 00:01:23.149 Has header "linux/vduse.h" : YES 00:01:23.149 Message: 
lib/vhost: Defining dependency "vhost" 00:01:23.149 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:23.149 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:23.149 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:23.149 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:23.149 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:23.149 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:23.149 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:23.149 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:23.149 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:23.149 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:23.149 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:23.149 Configuring doxy-api-html.conf using configuration 00:01:23.149 Configuring doxy-api-man.conf using configuration 00:01:23.149 Program mandb found: YES (/usr/bin/mandb) 00:01:23.149 Program sphinx-build found: NO 00:01:23.149 Configuring rte_build_config.h using configuration 00:01:23.149 Message: 00:01:23.149 ================= 00:01:23.149 Applications Enabled 00:01:23.149 ================= 00:01:23.149 00:01:23.149 apps: 00:01:23.150 00:01:23.150 00:01:23.150 Message: 00:01:23.150 ================= 00:01:23.150 Libraries Enabled 00:01:23.150 ================= 00:01:23.150 00:01:23.150 libs: 00:01:23.150 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:23.150 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:23.150 cryptodev, dmadev, power, reorder, security, vhost, 00:01:23.150 00:01:23.150 Message: 00:01:23.150 =============== 00:01:23.150 Drivers Enabled 00:01:23.150 =============== 00:01:23.150 00:01:23.150 common: 00:01:23.150 00:01:23.150 bus: 00:01:23.150 pci, vdev, 00:01:23.150 mempool: 00:01:23.150 ring, 00:01:23.150 dma: 00:01:23.150 00:01:23.150 net: 00:01:23.150 00:01:23.150 crypto: 00:01:23.150 00:01:23.150 compress: 00:01:23.150 00:01:23.150 vdpa: 00:01:23.150 00:01:23.150 00:01:23.150 Message: 00:01:23.150 ================= 00:01:23.150 Content Skipped 00:01:23.150 ================= 00:01:23.150 00:01:23.150 apps: 00:01:23.150 dumpcap: explicitly disabled via build config 00:01:23.150 graph: explicitly disabled via build config 00:01:23.150 pdump: explicitly disabled via build config 00:01:23.150 proc-info: explicitly disabled via build config 00:01:23.150 test-acl: explicitly disabled via build config 00:01:23.150 test-bbdev: explicitly disabled via build config 00:01:23.150 test-cmdline: explicitly disabled via build config 00:01:23.150 test-compress-perf: explicitly disabled via build config 00:01:23.150 test-crypto-perf: explicitly disabled via build config 00:01:23.150 test-dma-perf: explicitly disabled via build config 00:01:23.150 test-eventdev: explicitly disabled via build config 00:01:23.150 test-fib: explicitly disabled via build config 00:01:23.150 test-flow-perf: explicitly disabled via build config 00:01:23.150 test-gpudev: explicitly disabled via build config 00:01:23.150 test-mldev: explicitly disabled via build config 00:01:23.150 test-pipeline: explicitly disabled via build config 00:01:23.150 test-pmd: explicitly disabled via build config 00:01:23.150 test-regex: explicitly disabled via build config 00:01:23.150 test-sad: explicitly disabled via build config 00:01:23.150 test-security-perf: explicitly disabled 
via build config 00:01:23.150 00:01:23.150 libs: 00:01:23.150 argparse: explicitly disabled via build config 00:01:23.150 metrics: explicitly disabled via build config 00:01:23.150 acl: explicitly disabled via build config 00:01:23.150 bbdev: explicitly disabled via build config 00:01:23.150 bitratestats: explicitly disabled via build config 00:01:23.150 bpf: explicitly disabled via build config 00:01:23.150 cfgfile: explicitly disabled via build config 00:01:23.150 distributor: explicitly disabled via build config 00:01:23.150 efd: explicitly disabled via build config 00:01:23.150 eventdev: explicitly disabled via build config 00:01:23.150 dispatcher: explicitly disabled via build config 00:01:23.150 gpudev: explicitly disabled via build config 00:01:23.150 gro: explicitly disabled via build config 00:01:23.150 gso: explicitly disabled via build config 00:01:23.150 ip_frag: explicitly disabled via build config 00:01:23.150 jobstats: explicitly disabled via build config 00:01:23.150 latencystats: explicitly disabled via build config 00:01:23.150 lpm: explicitly disabled via build config 00:01:23.150 member: explicitly disabled via build config 00:01:23.150 pcapng: explicitly disabled via build config 00:01:23.150 rawdev: explicitly disabled via build config 00:01:23.150 regexdev: explicitly disabled via build config 00:01:23.150 mldev: explicitly disabled via build config 00:01:23.150 rib: explicitly disabled via build config 00:01:23.150 sched: explicitly disabled via build config 00:01:23.150 stack: explicitly disabled via build config 00:01:23.150 ipsec: explicitly disabled via build config 00:01:23.150 pdcp: explicitly disabled via build config 00:01:23.150 fib: explicitly disabled via build config 00:01:23.150 port: explicitly disabled via build config 00:01:23.150 pdump: explicitly disabled via build config 00:01:23.150 table: explicitly disabled via build config 00:01:23.150 pipeline: explicitly disabled via build config 00:01:23.150 graph: explicitly disabled via build config 00:01:23.150 node: explicitly disabled via build config 00:01:23.150 00:01:23.150 drivers: 00:01:23.150 common/cpt: not in enabled drivers build config 00:01:23.150 common/dpaax: not in enabled drivers build config 00:01:23.150 common/iavf: not in enabled drivers build config 00:01:23.150 common/idpf: not in enabled drivers build config 00:01:23.150 common/ionic: not in enabled drivers build config 00:01:23.150 common/mvep: not in enabled drivers build config 00:01:23.150 common/octeontx: not in enabled drivers build config 00:01:23.150 bus/auxiliary: not in enabled drivers build config 00:01:23.150 bus/cdx: not in enabled drivers build config 00:01:23.150 bus/dpaa: not in enabled drivers build config 00:01:23.150 bus/fslmc: not in enabled drivers build config 00:01:23.150 bus/ifpga: not in enabled drivers build config 00:01:23.150 bus/platform: not in enabled drivers build config 00:01:23.150 bus/uacce: not in enabled drivers build config 00:01:23.150 bus/vmbus: not in enabled drivers build config 00:01:23.150 common/cnxk: not in enabled drivers build config 00:01:23.150 common/mlx5: not in enabled drivers build config 00:01:23.150 common/nfp: not in enabled drivers build config 00:01:23.150 common/nitrox: not in enabled drivers build config 00:01:23.150 common/qat: not in enabled drivers build config 00:01:23.150 common/sfc_efx: not in enabled drivers build config 00:01:23.150 mempool/bucket: not in enabled drivers build config 00:01:23.150 mempool/cnxk: not in enabled drivers build config 00:01:23.150 
mempool/dpaa: not in enabled drivers build config 00:01:23.150 mempool/dpaa2: not in enabled drivers build config 00:01:23.150 mempool/octeontx: not in enabled drivers build config 00:01:23.150 mempool/stack: not in enabled drivers build config 00:01:23.150 dma/cnxk: not in enabled drivers build config 00:01:23.150 dma/dpaa: not in enabled drivers build config 00:01:23.150 dma/dpaa2: not in enabled drivers build config 00:01:23.150 dma/hisilicon: not in enabled drivers build config 00:01:23.150 dma/idxd: not in enabled drivers build config 00:01:23.150 dma/ioat: not in enabled drivers build config 00:01:23.150 dma/skeleton: not in enabled drivers build config 00:01:23.150 net/af_packet: not in enabled drivers build config 00:01:23.150 net/af_xdp: not in enabled drivers build config 00:01:23.150 net/ark: not in enabled drivers build config 00:01:23.150 net/atlantic: not in enabled drivers build config 00:01:23.150 net/avp: not in enabled drivers build config 00:01:23.150 net/axgbe: not in enabled drivers build config 00:01:23.150 net/bnx2x: not in enabled drivers build config 00:01:23.150 net/bnxt: not in enabled drivers build config 00:01:23.150 net/bonding: not in enabled drivers build config 00:01:23.150 net/cnxk: not in enabled drivers build config 00:01:23.150 net/cpfl: not in enabled drivers build config 00:01:23.150 net/cxgbe: not in enabled drivers build config 00:01:23.150 net/dpaa: not in enabled drivers build config 00:01:23.150 net/dpaa2: not in enabled drivers build config 00:01:23.150 net/e1000: not in enabled drivers build config 00:01:23.150 net/ena: not in enabled drivers build config 00:01:23.150 net/enetc: not in enabled drivers build config 00:01:23.150 net/enetfec: not in enabled drivers build config 00:01:23.150 net/enic: not in enabled drivers build config 00:01:23.150 net/failsafe: not in enabled drivers build config 00:01:23.150 net/fm10k: not in enabled drivers build config 00:01:23.150 net/gve: not in enabled drivers build config 00:01:23.150 net/hinic: not in enabled drivers build config 00:01:23.150 net/hns3: not in enabled drivers build config 00:01:23.150 net/i40e: not in enabled drivers build config 00:01:23.150 net/iavf: not in enabled drivers build config 00:01:23.150 net/ice: not in enabled drivers build config 00:01:23.150 net/idpf: not in enabled drivers build config 00:01:23.150 net/igc: not in enabled drivers build config 00:01:23.150 net/ionic: not in enabled drivers build config 00:01:23.150 net/ipn3ke: not in enabled drivers build config 00:01:23.150 net/ixgbe: not in enabled drivers build config 00:01:23.150 net/mana: not in enabled drivers build config 00:01:23.150 net/memif: not in enabled drivers build config 00:01:23.150 net/mlx4: not in enabled drivers build config 00:01:23.150 net/mlx5: not in enabled drivers build config 00:01:23.150 net/mvneta: not in enabled drivers build config 00:01:23.150 net/mvpp2: not in enabled drivers build config 00:01:23.150 net/netvsc: not in enabled drivers build config 00:01:23.150 net/nfb: not in enabled drivers build config 00:01:23.150 net/nfp: not in enabled drivers build config 00:01:23.150 net/ngbe: not in enabled drivers build config 00:01:23.150 net/null: not in enabled drivers build config 00:01:23.150 net/octeontx: not in enabled drivers build config 00:01:23.150 net/octeon_ep: not in enabled drivers build config 00:01:23.150 net/pcap: not in enabled drivers build config 00:01:23.150 net/pfe: not in enabled drivers build config 00:01:23.150 net/qede: not in enabled drivers build config 00:01:23.150 
net/ring: not in enabled drivers build config 00:01:23.150 net/sfc: not in enabled drivers build config 00:01:23.150 net/softnic: not in enabled drivers build config 00:01:23.150 net/tap: not in enabled drivers build config 00:01:23.150 net/thunderx: not in enabled drivers build config 00:01:23.150 net/txgbe: not in enabled drivers build config 00:01:23.150 net/vdev_netvsc: not in enabled drivers build config 00:01:23.150 net/vhost: not in enabled drivers build config 00:01:23.150 net/virtio: not in enabled drivers build config 00:01:23.150 net/vmxnet3: not in enabled drivers build config 00:01:23.150 raw/*: missing internal dependency, "rawdev" 00:01:23.150 crypto/armv8: not in enabled drivers build config 00:01:23.150 crypto/bcmfs: not in enabled drivers build config 00:01:23.150 crypto/caam_jr: not in enabled drivers build config 00:01:23.150 crypto/ccp: not in enabled drivers build config 00:01:23.150 crypto/cnxk: not in enabled drivers build config 00:01:23.150 crypto/dpaa_sec: not in enabled drivers build config 00:01:23.150 crypto/dpaa2_sec: not in enabled drivers build config 00:01:23.150 crypto/ipsec_mb: not in enabled drivers build config 00:01:23.150 crypto/mlx5: not in enabled drivers build config 00:01:23.150 crypto/mvsam: not in enabled drivers build config 00:01:23.150 crypto/nitrox: not in enabled drivers build config 00:01:23.150 crypto/null: not in enabled drivers build config 00:01:23.150 crypto/octeontx: not in enabled drivers build config 00:01:23.151 crypto/openssl: not in enabled drivers build config 00:01:23.151 crypto/scheduler: not in enabled drivers build config 00:01:23.151 crypto/uadk: not in enabled drivers build config 00:01:23.151 crypto/virtio: not in enabled drivers build config 00:01:23.151 compress/isal: not in enabled drivers build config 00:01:23.151 compress/mlx5: not in enabled drivers build config 00:01:23.151 compress/nitrox: not in enabled drivers build config 00:01:23.151 compress/octeontx: not in enabled drivers build config 00:01:23.151 compress/zlib: not in enabled drivers build config 00:01:23.151 regex/*: missing internal dependency, "regexdev" 00:01:23.151 ml/*: missing internal dependency, "mldev" 00:01:23.151 vdpa/ifc: not in enabled drivers build config 00:01:23.151 vdpa/mlx5: not in enabled drivers build config 00:01:23.151 vdpa/nfp: not in enabled drivers build config 00:01:23.151 vdpa/sfc: not in enabled drivers build config 00:01:23.151 event/*: missing internal dependency, "eventdev" 00:01:23.151 baseband/*: missing internal dependency, "bbdev" 00:01:23.151 gpu/*: missing internal dependency, "gpudev" 00:01:23.151 00:01:23.151 00:01:23.151 Build targets in project: 85 00:01:23.151 00:01:23.151 DPDK 24.03.0 00:01:23.151 00:01:23.151 User defined options 00:01:23.151 buildtype : debug 00:01:23.151 default_library : shared 00:01:23.151 libdir : lib 00:01:23.151 prefix : /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build 00:01:23.151 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -fPIC -Werror 00:01:23.151 c_link_args : 00:01:23.151 cpu_instruction_set: native 00:01:23.151 disable_apps : test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump 00:01:23.151 disable_libs : 
bbdev,argparse,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump 00:01:23.151 enable_docs : false 00:01:23.151 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:01:23.151 enable_kmods : false 00:01:23.151 max_lcores : 128 00:01:23.151 tests : false 00:01:23.151 00:01:23.151 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:23.151 ninja: Entering directory `/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build-tmp' 00:01:23.151 [1/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:23.151 [2/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:23.151 [3/268] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:23.151 [4/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:23.151 [5/268] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:23.151 [6/268] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:23.151 [7/268] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:23.151 [8/268] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:23.151 [9/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:23.151 [10/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:23.151 [11/268] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:23.151 [12/268] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:23.151 [13/268] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:23.151 [14/268] Linking static target lib/librte_kvargs.a 00:01:23.151 [15/268] Linking static target lib/librte_log.a 00:01:23.151 [16/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:23.151 [17/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:23.151 [18/268] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:23.151 [19/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:23.151 [20/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:23.151 [21/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:23.151 [22/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:23.151 [23/268] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:23.151 [24/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:23.151 [25/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:23.151 [26/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:23.151 [27/268] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:23.151 [28/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:23.151 [29/268] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:23.151 [30/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:23.151 [31/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:23.151 [32/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:23.151 [33/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:23.151 [34/268] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:23.151 [35/268] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:23.151 [36/268] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:23.151 [37/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:23.151 [38/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:23.151 [39/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:23.151 [40/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:23.151 [41/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:23.151 [42/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:23.151 [43/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:23.151 [44/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:23.151 [45/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:23.151 [46/268] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:23.151 [47/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:23.151 [48/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:23.151 [49/268] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:23.151 [50/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:23.151 [51/268] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:23.151 [52/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:23.151 [53/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:23.151 [54/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:23.151 [55/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:23.151 [56/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:23.151 [57/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:23.151 [58/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:23.151 [59/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:23.151 [60/268] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:23.151 [61/268] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:23.151 [62/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:23.151 [63/268] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:23.151 [64/268] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:23.151 [65/268] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:23.151 [66/268] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:23.151 [67/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:23.151 [68/268] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:23.151 [69/268] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:23.151 [70/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:23.151 [71/268] Linking static target lib/librte_ring.a 00:01:23.151 [72/268] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:23.151 [73/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 
00:01:23.151 [74/268] Linking static target lib/librte_telemetry.a 00:01:23.151 [75/268] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:23.151 [76/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:23.151 [77/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:23.151 [78/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:23.151 [79/268] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:23.151 [80/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:23.151 [81/268] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:23.151 [82/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:23.151 [83/268] Linking static target lib/librte_pci.a 00:01:23.151 [84/268] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:01:23.151 [85/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:23.151 [86/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:23.151 [87/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:23.151 [88/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:23.151 [89/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:23.151 [90/268] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:23.151 [91/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:23.151 [92/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:23.151 [93/268] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:23.151 [94/268] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.151 [95/268] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:23.151 [96/268] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:23.151 [97/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:23.151 [98/268] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:23.151 [99/268] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:23.151 [100/268] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:23.151 [101/268] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:23.413 [102/268] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:23.413 [103/268] Linking static target lib/librte_mempool.a 00:01:23.413 [104/268] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:23.413 [105/268] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:23.413 [106/268] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:23.413 [107/268] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:23.413 [108/268] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:23.413 [109/268] Linking static target lib/librte_rcu.a 00:01:23.414 [110/268] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:23.414 [111/268] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:23.414 [112/268] Linking static target lib/librte_eal.a 00:01:23.414 [113/268] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:23.414 [114/268] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:23.414 [115/268] Compiling C object 
lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:23.414 [116/268] Linking static target lib/librte_net.a 00:01:23.414 [117/268] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:23.414 [118/268] Linking static target lib/librte_meter.a 00:01:23.672 [119/268] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.672 [120/268] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:01:23.672 [121/268] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:23.672 [122/268] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.672 [123/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:23.672 [124/268] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:23.673 [125/268] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:23.673 [126/268] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:23.673 [127/268] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:23.673 [128/268] Linking static target lib/librte_cmdline.a 00:01:23.673 [129/268] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:23.673 [130/268] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:23.673 [131/268] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:23.673 [132/268] Linking target lib/librte_log.so.24.1 00:01:23.673 [133/268] Linking static target lib/librte_mbuf.a 00:01:23.673 [134/268] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:23.673 [135/268] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:23.673 [136/268] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:23.673 [137/268] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:23.673 [138/268] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:23.673 [139/268] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:23.673 [140/268] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.673 [141/268] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:01:23.673 [142/268] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:23.673 [143/268] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:23.673 [144/268] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:23.673 [145/268] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:23.673 [146/268] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:23.673 [147/268] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:23.673 [148/268] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:23.673 [149/268] Linking static target lib/librte_timer.a 00:01:23.673 [150/268] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:23.673 [151/268] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:23.673 [152/268] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:23.673 [153/268] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:23.673 [154/268] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:23.673 [155/268] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:23.673 [156/268] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.673 [157/268] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:23.673 [158/268] Linking static target lib/librte_compressdev.a 00:01:23.673 [159/268] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:23.673 [160/268] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:23.673 [161/268] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:23.673 [162/268] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:23.673 [163/268] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:23.673 [164/268] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:23.673 [165/268] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:23.932 [166/268] Linking static target lib/librte_dmadev.a 00:01:23.932 [167/268] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.932 [168/268] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:23.932 [169/268] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:23.932 [170/268] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:23.932 [171/268] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:23.932 [172/268] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:23.932 [173/268] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:23.932 [174/268] Linking static target lib/librte_power.a 00:01:23.932 [175/268] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.932 [176/268] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.932 [177/268] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:23.932 [178/268] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:01:23.932 [179/268] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:23.932 [180/268] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:23.932 [181/268] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:23.932 [182/268] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:23.932 [183/268] Linking static target lib/librte_reorder.a 00:01:23.932 [184/268] Linking target lib/librte_kvargs.so.24.1 00:01:23.932 [185/268] Linking target lib/librte_telemetry.so.24.1 00:01:23.932 [186/268] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:23.932 [187/268] Linking static target lib/librte_security.a 00:01:23.932 [188/268] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:23.932 [189/268] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:23.932 [190/268] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:23.932 [191/268] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:23.932 [192/268] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:23.932 [193/268] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:23.932 [194/268] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:01:23.932 [195/268] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 
00:01:23.932 [196/268] Linking static target lib/librte_hash.a 00:01:23.932 [197/268] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:23.932 [198/268] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:23.932 [199/268] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:23.932 [200/268] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:23.932 [201/268] Linking static target drivers/librte_bus_vdev.a 00:01:24.192 [202/268] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:24.192 [203/268] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:24.192 [204/268] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:24.192 [205/268] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.192 [206/268] Linking static target drivers/librte_bus_pci.a 00:01:24.192 [207/268] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:24.192 [208/268] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.192 [209/268] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:24.192 [210/268] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:24.192 [211/268] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:24.192 [212/268] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:24.192 [213/268] Linking static target drivers/librte_mempool_ring.a 00:01:24.192 [214/268] Linking static target lib/librte_cryptodev.a 00:01:24.192 [215/268] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.451 [216/268] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.451 [217/268] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.451 [218/268] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.451 [219/268] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.451 [220/268] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.711 [221/268] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:24.711 [222/268] Linking static target lib/librte_ethdev.a 00:01:24.711 [223/268] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:24.711 [224/268] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.970 [225/268] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.970 [226/268] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.970 [227/268] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.907 [228/268] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:25.907 [229/268] Linking static target lib/librte_vhost.a 00:01:26.475 [230/268] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:28.378 [231/268] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:34.971 [232/268] 
Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.424 [233/268] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.424 [234/268] Linking target lib/librte_eal.so.24.1 00:01:36.424 [235/268] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:01:36.424 [236/268] Linking target lib/librte_pci.so.24.1 00:01:36.424 [237/268] Linking target lib/librte_ring.so.24.1 00:01:36.424 [238/268] Linking target lib/librte_meter.so.24.1 00:01:36.424 [239/268] Linking target lib/librte_timer.so.24.1 00:01:36.424 [240/268] Linking target drivers/librte_bus_vdev.so.24.1 00:01:36.424 [241/268] Linking target lib/librte_dmadev.so.24.1 00:01:36.683 [242/268] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:01:36.683 [243/268] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:01:36.683 [244/268] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:01:36.683 [245/268] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:01:36.683 [246/268] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:01:36.683 [247/268] Linking target drivers/librte_bus_pci.so.24.1 00:01:36.683 [248/268] Linking target lib/librte_rcu.so.24.1 00:01:36.683 [249/268] Linking target lib/librte_mempool.so.24.1 00:01:36.942 [250/268] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:01:36.942 [251/268] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:01:36.942 [252/268] Linking target lib/librte_mbuf.so.24.1 00:01:36.942 [253/268] Linking target drivers/librte_mempool_ring.so.24.1 00:01:36.942 [254/268] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:01:36.942 [255/268] Linking target lib/librte_compressdev.so.24.1 00:01:36.942 [256/268] Linking target lib/librte_net.so.24.1 00:01:36.942 [257/268] Linking target lib/librte_reorder.so.24.1 00:01:37.200 [258/268] Linking target lib/librte_cryptodev.so.24.1 00:01:37.200 [259/268] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:01:37.200 [260/268] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:01:37.200 [261/268] Linking target lib/librte_cmdline.so.24.1 00:01:37.200 [262/268] Linking target lib/librte_hash.so.24.1 00:01:37.200 [263/268] Linking target lib/librte_ethdev.so.24.1 00:01:37.200 [264/268] Linking target lib/librte_security.so.24.1 00:01:37.459 [265/268] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:01:37.459 [266/268] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:01:37.459 [267/268] Linking target lib/librte_power.so.24.1 00:01:37.459 [268/268] Linking target lib/librte_vhost.so.24.1 00:01:37.459 INFO: autodetecting backend as ninja 00:01:37.459 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build-tmp -j 72 00:01:45.584 CC lib/ut_mock/mock.o 00:01:45.584 CC lib/ut/ut.o 00:01:45.584 CC lib/log/log.o 00:01:45.584 CC lib/log/log_flags.o 00:01:45.584 CC lib/log/log_deprecated.o 00:01:45.843 LIB libspdk_ut.a 00:01:45.843 LIB libspdk_log.a 00:01:45.843 LIB libspdk_ut_mock.a 00:01:45.843 SO libspdk_ut.so.2.0 00:01:45.843 SO libspdk_log.so.7.0 00:01:45.843 SO libspdk_ut_mock.so.6.0 00:01:46.102 SYMLINK libspdk_ut.so 
00:01:46.102 SYMLINK libspdk_log.so 00:01:46.102 SYMLINK libspdk_ut_mock.so 00:01:46.361 CXX lib/trace_parser/trace.o 00:01:46.361 CC lib/util/base64.o 00:01:46.361 CC lib/dma/dma.o 00:01:46.361 CC lib/util/bit_array.o 00:01:46.361 CC lib/util/cpuset.o 00:01:46.361 CC lib/ioat/ioat.o 00:01:46.361 CC lib/util/crc16.o 00:01:46.361 CC lib/util/crc32.o 00:01:46.361 CC lib/util/crc32c.o 00:01:46.361 CC lib/util/crc32_ieee.o 00:01:46.361 CC lib/util/crc64.o 00:01:46.361 CC lib/util/dif.o 00:01:46.361 CC lib/util/fd.o 00:01:46.361 CC lib/util/fd_group.o 00:01:46.361 CC lib/util/file.o 00:01:46.361 CC lib/util/hexlify.o 00:01:46.361 CC lib/util/iov.o 00:01:46.361 CC lib/util/math.o 00:01:46.361 CC lib/util/net.o 00:01:46.361 CC lib/util/pipe.o 00:01:46.361 CC lib/util/strerror_tls.o 00:01:46.361 CC lib/util/string.o 00:01:46.361 CC lib/util/uuid.o 00:01:46.361 CC lib/util/xor.o 00:01:46.361 CC lib/util/zipf.o 00:01:46.361 CC lib/util/md5.o 00:01:46.620 CC lib/vfio_user/host/vfio_user_pci.o 00:01:46.620 CC lib/vfio_user/host/vfio_user.o 00:01:46.620 LIB libspdk_dma.a 00:01:46.620 SO libspdk_dma.so.5.0 00:01:46.620 LIB libspdk_ioat.a 00:01:46.620 SO libspdk_ioat.so.7.0 00:01:46.620 SYMLINK libspdk_dma.so 00:01:46.620 SYMLINK libspdk_ioat.so 00:01:46.879 LIB libspdk_vfio_user.a 00:01:46.879 SO libspdk_vfio_user.so.5.0 00:01:46.879 LIB libspdk_util.a 00:01:46.879 SYMLINK libspdk_vfio_user.so 00:01:46.879 SO libspdk_util.so.10.0 00:01:47.139 SYMLINK libspdk_util.so 00:01:47.139 LIB libspdk_trace_parser.a 00:01:47.139 SO libspdk_trace_parser.so.6.0 00:01:47.139 SYMLINK libspdk_trace_parser.so 00:01:47.397 CC lib/vmd/led.o 00:01:47.397 CC lib/vmd/vmd.o 00:01:47.397 CC lib/rdma_utils/rdma_utils.o 00:01:47.397 CC lib/idxd/idxd.o 00:01:47.397 CC lib/idxd/idxd_user.o 00:01:47.397 CC lib/json/json_parse.o 00:01:47.397 CC lib/json/json_util.o 00:01:47.397 CC lib/idxd/idxd_kernel.o 00:01:47.397 CC lib/json/json_write.o 00:01:47.397 CC lib/conf/conf.o 00:01:47.397 CC lib/env_dpdk/env.o 00:01:47.397 CC lib/env_dpdk/memory.o 00:01:47.397 CC lib/env_dpdk/pci.o 00:01:47.397 CC lib/env_dpdk/init.o 00:01:47.398 CC lib/env_dpdk/threads.o 00:01:47.398 CC lib/env_dpdk/pci_ioat.o 00:01:47.398 CC lib/rdma_provider/common.o 00:01:47.398 CC lib/rdma_provider/rdma_provider_verbs.o 00:01:47.398 CC lib/env_dpdk/pci_virtio.o 00:01:47.398 CC lib/env_dpdk/pci_vmd.o 00:01:47.398 CC lib/env_dpdk/pci_idxd.o 00:01:47.398 CC lib/env_dpdk/pci_event.o 00:01:47.398 CC lib/env_dpdk/sigbus_handler.o 00:01:47.398 CC lib/env_dpdk/pci_dpdk.o 00:01:47.398 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:47.398 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:47.654 LIB libspdk_rdma_provider.a 00:01:47.654 LIB libspdk_conf.a 00:01:47.654 SO libspdk_rdma_provider.so.6.0 00:01:47.654 SO libspdk_conf.so.6.0 00:01:47.654 LIB libspdk_rdma_utils.a 00:01:47.654 SO libspdk_rdma_utils.so.1.0 00:01:47.654 LIB libspdk_json.a 00:01:47.654 SYMLINK libspdk_conf.so 00:01:47.654 SYMLINK libspdk_rdma_provider.so 00:01:47.911 SO libspdk_json.so.6.0 00:01:47.911 SYMLINK libspdk_rdma_utils.so 00:01:47.911 SYMLINK libspdk_json.so 00:01:47.911 LIB libspdk_idxd.a 00:01:47.911 SO libspdk_idxd.so.12.1 00:01:47.911 LIB libspdk_vmd.a 00:01:47.911 SO libspdk_vmd.so.6.0 00:01:47.911 SYMLINK libspdk_idxd.so 00:01:48.168 SYMLINK libspdk_vmd.so 00:01:48.168 CC lib/jsonrpc/jsonrpc_server.o 00:01:48.168 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:48.168 CC lib/jsonrpc/jsonrpc_client.o 00:01:48.168 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:48.495 LIB libspdk_jsonrpc.a 00:01:48.495 SO 
libspdk_jsonrpc.so.6.0 00:01:48.495 LIB libspdk_env_dpdk.a 00:01:48.495 SYMLINK libspdk_jsonrpc.so 00:01:48.495 SO libspdk_env_dpdk.so.15.0 00:01:48.755 SYMLINK libspdk_env_dpdk.so 00:01:49.013 CC lib/rpc/rpc.o 00:01:49.013 LIB libspdk_rpc.a 00:01:49.271 SO libspdk_rpc.so.6.0 00:01:49.271 SYMLINK libspdk_rpc.so 00:01:49.531 CC lib/notify/notify.o 00:01:49.531 CC lib/notify/notify_rpc.o 00:01:49.531 CC lib/keyring/keyring.o 00:01:49.531 CC lib/trace/trace.o 00:01:49.531 CC lib/keyring/keyring_rpc.o 00:01:49.531 CC lib/trace/trace_flags.o 00:01:49.531 CC lib/trace/trace_rpc.o 00:01:49.790 LIB libspdk_notify.a 00:01:49.790 SO libspdk_notify.so.6.0 00:01:49.790 LIB libspdk_keyring.a 00:01:49.790 LIB libspdk_trace.a 00:01:49.790 SYMLINK libspdk_notify.so 00:01:49.790 SO libspdk_keyring.so.2.0 00:01:49.790 SO libspdk_trace.so.11.0 00:01:50.049 SYMLINK libspdk_keyring.so 00:01:50.049 SYMLINK libspdk_trace.so 00:01:50.307 CC lib/sock/sock.o 00:01:50.307 CC lib/sock/sock_rpc.o 00:01:50.307 CC lib/thread/thread.o 00:01:50.307 CC lib/thread/iobuf.o 00:01:50.566 LIB libspdk_sock.a 00:01:50.566 SO libspdk_sock.so.10.0 00:01:50.826 SYMLINK libspdk_sock.so 00:01:51.085 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:51.085 CC lib/nvme/nvme_ctrlr.o 00:01:51.085 CC lib/nvme/nvme_fabric.o 00:01:51.085 CC lib/nvme/nvme_ns_cmd.o 00:01:51.085 CC lib/nvme/nvme_ns.o 00:01:51.085 CC lib/nvme/nvme_pcie_common.o 00:01:51.085 CC lib/nvme/nvme_pcie.o 00:01:51.085 CC lib/nvme/nvme_qpair.o 00:01:51.085 CC lib/nvme/nvme.o 00:01:51.085 CC lib/nvme/nvme_quirks.o 00:01:51.085 CC lib/nvme/nvme_transport.o 00:01:51.085 CC lib/nvme/nvme_discovery.o 00:01:51.085 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:51.085 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:51.085 CC lib/nvme/nvme_tcp.o 00:01:51.085 CC lib/nvme/nvme_opal.o 00:01:51.085 CC lib/nvme/nvme_io_msg.o 00:01:51.085 CC lib/nvme/nvme_poll_group.o 00:01:51.085 CC lib/nvme/nvme_zns.o 00:01:51.085 CC lib/nvme/nvme_stubs.o 00:01:51.085 CC lib/nvme/nvme_auth.o 00:01:51.085 CC lib/nvme/nvme_cuse.o 00:01:51.085 CC lib/nvme/nvme_rdma.o 00:01:51.344 LIB libspdk_thread.a 00:01:51.344 SO libspdk_thread.so.10.1 00:01:51.603 SYMLINK libspdk_thread.so 00:01:51.863 CC lib/blob/blobstore.o 00:01:51.863 CC lib/blob/request.o 00:01:51.863 CC lib/init/json_config.o 00:01:51.863 CC lib/accel/accel_rpc.o 00:01:51.863 CC lib/init/subsystem_rpc.o 00:01:51.863 CC lib/init/subsystem.o 00:01:51.863 CC lib/accel/accel.o 00:01:51.863 CC lib/blob/blob_bs_dev.o 00:01:51.863 CC lib/blob/zeroes.o 00:01:51.863 CC lib/init/rpc.o 00:01:51.863 CC lib/accel/accel_sw.o 00:01:51.863 CC lib/virtio/virtio.o 00:01:51.863 CC lib/virtio/virtio_vhost_user.o 00:01:51.863 CC lib/virtio/virtio_pci.o 00:01:51.863 CC lib/virtio/virtio_vfio_user.o 00:01:51.863 CC lib/fsdev/fsdev.o 00:01:51.863 CC lib/fsdev/fsdev_io.o 00:01:51.863 CC lib/fsdev/fsdev_rpc.o 00:01:52.121 LIB libspdk_init.a 00:01:52.121 SO libspdk_init.so.6.0 00:01:52.121 LIB libspdk_virtio.a 00:01:52.121 SYMLINK libspdk_init.so 00:01:52.121 SO libspdk_virtio.so.7.0 00:01:52.378 SYMLINK libspdk_virtio.so 00:01:52.378 LIB libspdk_fsdev.a 00:01:52.378 SO libspdk_fsdev.so.1.0 00:01:52.636 CC lib/event/app.o 00:01:52.636 CC lib/event/reactor.o 00:01:52.636 CC lib/event/log_rpc.o 00:01:52.636 CC lib/event/app_rpc.o 00:01:52.636 CC lib/event/scheduler_static.o 00:01:52.636 SYMLINK libspdk_fsdev.so 00:01:52.636 LIB libspdk_accel.a 00:01:52.894 SO libspdk_accel.so.16.0 00:01:52.894 SYMLINK libspdk_accel.so 00:01:52.894 LIB libspdk_nvme.a 00:01:52.894 LIB libspdk_event.a 00:01:52.894 CC 
lib/fuse_dispatcher/fuse_dispatcher.o 00:01:52.894 SO libspdk_event.so.14.0 00:01:52.894 SO libspdk_nvme.so.14.0 00:01:52.894 SYMLINK libspdk_event.so 00:01:53.152 CC lib/bdev/bdev.o 00:01:53.152 CC lib/bdev/bdev_rpc.o 00:01:53.152 CC lib/bdev/bdev_zone.o 00:01:53.152 CC lib/bdev/part.o 00:01:53.152 CC lib/bdev/scsi_nvme.o 00:01:53.152 SYMLINK libspdk_nvme.so 00:01:53.412 LIB libspdk_fuse_dispatcher.a 00:01:53.412 SO libspdk_fuse_dispatcher.so.1.0 00:01:53.412 SYMLINK libspdk_fuse_dispatcher.so 00:01:53.981 LIB libspdk_blob.a 00:01:54.245 SO libspdk_blob.so.11.0 00:01:54.245 SYMLINK libspdk_blob.so 00:01:54.505 CC lib/blobfs/blobfs.o 00:01:54.505 CC lib/blobfs/tree.o 00:01:54.505 CC lib/lvol/lvol.o 00:01:55.073 LIB libspdk_bdev.a 00:01:55.073 SO libspdk_bdev.so.16.0 00:01:55.073 LIB libspdk_blobfs.a 00:01:55.332 SYMLINK libspdk_bdev.so 00:01:55.332 SO libspdk_blobfs.so.10.0 00:01:55.332 LIB libspdk_lvol.a 00:01:55.332 SYMLINK libspdk_blobfs.so 00:01:55.332 SO libspdk_lvol.so.10.0 00:01:55.332 SYMLINK libspdk_lvol.so 00:01:55.601 CC lib/scsi/dev.o 00:01:55.601 CC lib/scsi/lun.o 00:01:55.601 CC lib/scsi/port.o 00:01:55.601 CC lib/scsi/scsi_bdev.o 00:01:55.601 CC lib/scsi/scsi.o 00:01:55.601 CC lib/scsi/scsi_pr.o 00:01:55.601 CC lib/scsi/scsi_rpc.o 00:01:55.601 CC lib/scsi/task.o 00:01:55.601 CC lib/ftl/ftl_init.o 00:01:55.601 CC lib/nvmf/ctrlr.o 00:01:55.601 CC lib/ftl/ftl_core.o 00:01:55.601 CC lib/ftl/ftl_layout.o 00:01:55.601 CC lib/nvmf/ctrlr_bdev.o 00:01:55.601 CC lib/ftl/ftl_debug.o 00:01:55.601 CC lib/nvmf/nvmf.o 00:01:55.601 CC lib/nvmf/subsystem.o 00:01:55.601 CC lib/ftl/ftl_io.o 00:01:55.601 CC lib/nvmf/ctrlr_discovery.o 00:01:55.601 CC lib/ftl/ftl_sb.o 00:01:55.601 CC lib/nvmf/nvmf_rpc.o 00:01:55.601 CC lib/ftl/ftl_l2p.o 00:01:55.601 CC lib/nvmf/transport.o 00:01:55.601 CC lib/ftl/ftl_nv_cache.o 00:01:55.601 CC lib/ftl/ftl_l2p_flat.o 00:01:55.601 CC lib/nvmf/tcp.o 00:01:55.601 CC lib/ftl/ftl_band.o 00:01:55.601 CC lib/nvmf/stubs.o 00:01:55.601 CC lib/ftl/ftl_band_ops.o 00:01:55.601 CC lib/nvmf/mdns_server.o 00:01:55.601 CC lib/nbd/nbd.o 00:01:55.601 CC lib/ftl/ftl_writer.o 00:01:55.601 CC lib/nvmf/rdma.o 00:01:55.601 CC lib/nbd/nbd_rpc.o 00:01:55.601 CC lib/ftl/ftl_reloc.o 00:01:55.601 CC lib/ftl/ftl_rq.o 00:01:55.601 CC lib/nvmf/auth.o 00:01:55.601 CC lib/ublk/ublk.o 00:01:55.601 CC lib/ftl/ftl_l2p_cache.o 00:01:55.601 CC lib/ublk/ublk_rpc.o 00:01:55.601 CC lib/ftl/ftl_p2l.o 00:01:55.601 CC lib/ftl/ftl_p2l_log.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_startup.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_md.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_misc.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_band.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:01:55.601 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:01:55.601 CC lib/ftl/utils/ftl_conf.o 00:01:55.601 CC lib/ftl/utils/ftl_md.o 00:01:55.601 CC lib/ftl/utils/ftl_mempool.o 00:01:55.601 CC lib/ftl/utils/ftl_bitmap.o 00:01:55.601 CC lib/ftl/utils/ftl_property.o 00:01:55.601 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:01:55.601 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:01:55.601 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:01:55.601 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:01:55.601 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:01:55.601 CC 
lib/ftl/upgrade/ftl_chunk_upgrade.o 00:01:55.601 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:01:55.601 CC lib/ftl/upgrade/ftl_sb_v5.o 00:01:55.601 CC lib/ftl/upgrade/ftl_sb_v3.o 00:01:55.601 CC lib/ftl/nvc/ftl_nvc_dev.o 00:01:55.601 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:01:55.601 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:01:55.601 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:01:55.860 CC lib/ftl/base/ftl_base_dev.o 00:01:55.860 CC lib/ftl/base/ftl_base_bdev.o 00:01:55.860 CC lib/ftl/ftl_trace.o 00:01:56.117 LIB libspdk_nbd.a 00:01:56.117 LIB libspdk_scsi.a 00:01:56.117 SO libspdk_nbd.so.7.0 00:01:56.376 SO libspdk_scsi.so.9.0 00:01:56.376 SYMLINK libspdk_nbd.so 00:01:56.376 SYMLINK libspdk_scsi.so 00:01:56.376 LIB libspdk_ublk.a 00:01:56.376 SO libspdk_ublk.so.3.0 00:01:56.376 SYMLINK libspdk_ublk.so 00:01:56.635 LIB libspdk_ftl.a 00:01:56.635 CC lib/vhost/vhost.o 00:01:56.635 CC lib/vhost/vhost_rpc.o 00:01:56.635 CC lib/vhost/vhost_scsi.o 00:01:56.635 CC lib/vhost/vhost_blk.o 00:01:56.635 CC lib/vhost/rte_vhost_user.o 00:01:56.635 CC lib/iscsi/conn.o 00:01:56.635 CC lib/iscsi/init_grp.o 00:01:56.635 CC lib/iscsi/iscsi.o 00:01:56.635 CC lib/iscsi/param.o 00:01:56.635 CC lib/iscsi/portal_grp.o 00:01:56.635 CC lib/iscsi/tgt_node.o 00:01:56.635 CC lib/iscsi/iscsi_subsystem.o 00:01:56.635 CC lib/iscsi/iscsi_rpc.o 00:01:56.635 CC lib/iscsi/task.o 00:01:56.635 SO libspdk_ftl.so.9.0 00:01:56.894 SYMLINK libspdk_ftl.so 00:01:57.461 LIB libspdk_nvmf.a 00:01:57.461 SO libspdk_nvmf.so.19.0 00:01:57.461 LIB libspdk_vhost.a 00:01:57.461 SO libspdk_vhost.so.8.0 00:01:57.720 SYMLINK libspdk_nvmf.so 00:01:57.720 SYMLINK libspdk_vhost.so 00:01:57.720 LIB libspdk_iscsi.a 00:01:57.720 SO libspdk_iscsi.so.8.0 00:01:57.979 SYMLINK libspdk_iscsi.so 00:01:58.546 CC module/env_dpdk/env_dpdk_rpc.o 00:01:58.546 CC module/sock/posix/posix.o 00:01:58.546 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:01:58.546 CC module/keyring/file/keyring.o 00:01:58.546 CC module/keyring/file/keyring_rpc.o 00:01:58.546 CC module/accel/dsa/accel_dsa_rpc.o 00:01:58.804 LIB libspdk_env_dpdk_rpc.a 00:01:58.804 CC module/accel/dsa/accel_dsa.o 00:01:58.804 CC module/fsdev/aio/fsdev_aio.o 00:01:58.804 CC module/fsdev/aio/fsdev_aio_rpc.o 00:01:58.804 CC module/accel/iaa/accel_iaa.o 00:01:58.804 CC module/fsdev/aio/linux_aio_mgr.o 00:01:58.804 CC module/accel/iaa/accel_iaa_rpc.o 00:01:58.805 CC module/blob/bdev/blob_bdev.o 00:01:58.805 CC module/scheduler/gscheduler/gscheduler.o 00:01:58.805 CC module/scheduler/dynamic/scheduler_dynamic.o 00:01:58.805 CC module/keyring/linux/keyring.o 00:01:58.805 CC module/keyring/linux/keyring_rpc.o 00:01:58.805 CC module/accel/error/accel_error.o 00:01:58.805 CC module/accel/error/accel_error_rpc.o 00:01:58.805 CC module/accel/ioat/accel_ioat_rpc.o 00:01:58.805 CC module/accel/ioat/accel_ioat.o 00:01:58.805 SO libspdk_env_dpdk_rpc.so.6.0 00:01:58.805 SYMLINK libspdk_env_dpdk_rpc.so 00:01:58.805 LIB libspdk_keyring_file.a 00:01:58.805 LIB libspdk_scheduler_dpdk_governor.a 00:01:58.805 LIB libspdk_scheduler_gscheduler.a 00:01:58.805 LIB libspdk_keyring_linux.a 00:01:58.805 SO libspdk_keyring_file.so.2.0 00:01:58.805 SO libspdk_scheduler_gscheduler.so.4.0 00:01:58.805 SO libspdk_scheduler_dpdk_governor.so.4.0 00:01:58.805 LIB libspdk_scheduler_dynamic.a 00:01:58.805 LIB libspdk_accel_ioat.a 00:01:58.805 SO libspdk_keyring_linux.so.1.0 00:01:58.805 LIB libspdk_accel_error.a 00:01:58.805 LIB libspdk_accel_iaa.a 00:01:58.805 SO libspdk_scheduler_dynamic.so.4.0 00:01:59.064 SO libspdk_accel_error.so.2.0 
00:01:59.064 SYMLINK libspdk_keyring_file.so 00:01:59.064 SO libspdk_accel_ioat.so.6.0 00:01:59.064 SO libspdk_accel_iaa.so.3.0 00:01:59.064 SYMLINK libspdk_scheduler_dpdk_governor.so 00:01:59.064 LIB libspdk_accel_dsa.a 00:01:59.064 SYMLINK libspdk_scheduler_gscheduler.so 00:01:59.064 SYMLINK libspdk_keyring_linux.so 00:01:59.064 LIB libspdk_blob_bdev.a 00:01:59.064 SO libspdk_accel_dsa.so.5.0 00:01:59.064 SYMLINK libspdk_scheduler_dynamic.so 00:01:59.064 SYMLINK libspdk_accel_ioat.so 00:01:59.064 SYMLINK libspdk_accel_error.so 00:01:59.064 SO libspdk_blob_bdev.so.11.0 00:01:59.064 SYMLINK libspdk_accel_iaa.so 00:01:59.064 SYMLINK libspdk_accel_dsa.so 00:01:59.064 SYMLINK libspdk_blob_bdev.so 00:01:59.322 LIB libspdk_fsdev_aio.a 00:01:59.322 SO libspdk_fsdev_aio.so.1.0 00:01:59.322 LIB libspdk_sock_posix.a 00:01:59.322 SO libspdk_sock_posix.so.6.0 00:01:59.322 SYMLINK libspdk_fsdev_aio.so 00:01:59.322 SYMLINK libspdk_sock_posix.so 00:01:59.580 CC module/bdev/nvme/bdev_nvme_rpc.o 00:01:59.580 CC module/bdev/nvme/nvme_rpc.o 00:01:59.580 CC module/bdev/nvme/bdev_nvme.o 00:01:59.580 CC module/bdev/aio/bdev_aio.o 00:01:59.580 CC module/bdev/nvme/vbdev_opal.o 00:01:59.580 CC module/bdev/aio/bdev_aio_rpc.o 00:01:59.580 CC module/bdev/nvme/bdev_mdns_client.o 00:01:59.580 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:01:59.580 CC module/bdev/nvme/vbdev_opal_rpc.o 00:01:59.580 CC module/bdev/malloc/bdev_malloc.o 00:01:59.580 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:01:59.580 CC module/bdev/lvol/vbdev_lvol.o 00:01:59.580 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:01:59.580 CC module/bdev/malloc/bdev_malloc_rpc.o 00:01:59.580 CC module/blobfs/bdev/blobfs_bdev.o 00:01:59.580 CC module/bdev/split/vbdev_split.o 00:01:59.580 CC module/bdev/split/vbdev_split_rpc.o 00:01:59.580 CC module/bdev/zone_block/vbdev_zone_block.o 00:01:59.580 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:01:59.580 CC module/bdev/delay/vbdev_delay.o 00:01:59.580 CC module/bdev/null/bdev_null_rpc.o 00:01:59.580 CC module/bdev/virtio/bdev_virtio_blk.o 00:01:59.580 CC module/bdev/virtio/bdev_virtio_scsi.o 00:01:59.580 CC module/bdev/null/bdev_null.o 00:01:59.580 CC module/bdev/ftl/bdev_ftl.o 00:01:59.580 CC module/bdev/delay/vbdev_delay_rpc.o 00:01:59.580 CC module/bdev/ftl/bdev_ftl_rpc.o 00:01:59.580 CC module/bdev/virtio/bdev_virtio_rpc.o 00:01:59.580 CC module/bdev/gpt/gpt.o 00:01:59.580 CC module/bdev/gpt/vbdev_gpt.o 00:01:59.580 CC module/bdev/raid/bdev_raid.o 00:01:59.580 CC module/bdev/raid/bdev_raid_rpc.o 00:01:59.580 CC module/bdev/raid/bdev_raid_sb.o 00:01:59.580 CC module/bdev/raid/raid0.o 00:01:59.580 CC module/bdev/raid/raid1.o 00:01:59.580 CC module/bdev/passthru/vbdev_passthru.o 00:01:59.580 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:01:59.580 CC module/bdev/raid/concat.o 00:01:59.580 CC module/bdev/error/vbdev_error.o 00:01:59.580 CC module/bdev/error/vbdev_error_rpc.o 00:01:59.580 CC module/bdev/iscsi/bdev_iscsi.o 00:01:59.580 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:01:59.838 LIB libspdk_blobfs_bdev.a 00:01:59.838 SO libspdk_blobfs_bdev.so.6.0 00:01:59.838 LIB libspdk_bdev_split.a 00:01:59.838 LIB libspdk_bdev_null.a 00:01:59.838 SO libspdk_bdev_split.so.6.0 00:01:59.838 SO libspdk_bdev_null.so.6.0 00:01:59.838 LIB libspdk_bdev_gpt.a 00:01:59.838 SYMLINK libspdk_blobfs_bdev.so 00:01:59.838 LIB libspdk_bdev_passthru.a 00:01:59.838 LIB libspdk_bdev_ftl.a 00:01:59.838 SYMLINK libspdk_bdev_split.so 00:01:59.838 SO libspdk_bdev_gpt.so.6.0 00:01:59.838 LIB libspdk_bdev_malloc.a 00:01:59.838 SO 
libspdk_bdev_passthru.so.6.0 00:01:59.838 SO libspdk_bdev_ftl.so.6.0 00:01:59.838 LIB libspdk_bdev_error.a 00:01:59.838 SYMLINK libspdk_bdev_null.so 00:01:59.838 LIB libspdk_bdev_delay.a 00:02:00.097 SO libspdk_bdev_malloc.so.6.0 00:02:00.097 SO libspdk_bdev_error.so.6.0 00:02:00.097 SYMLINK libspdk_bdev_gpt.so 00:02:00.097 SO libspdk_bdev_delay.so.6.0 00:02:00.097 SYMLINK libspdk_bdev_passthru.so 00:02:00.097 LIB libspdk_bdev_zone_block.a 00:02:00.097 SYMLINK libspdk_bdev_ftl.so 00:02:00.097 SYMLINK libspdk_bdev_malloc.so 00:02:00.097 SYMLINK libspdk_bdev_error.so 00:02:00.097 SO libspdk_bdev_zone_block.so.6.0 00:02:00.097 LIB libspdk_bdev_iscsi.a 00:02:00.097 LIB libspdk_bdev_lvol.a 00:02:00.097 LIB libspdk_bdev_aio.a 00:02:00.097 SYMLINK libspdk_bdev_delay.so 00:02:00.097 LIB libspdk_bdev_virtio.a 00:02:00.097 SO libspdk_bdev_iscsi.so.6.0 00:02:00.097 SO libspdk_bdev_virtio.so.6.0 00:02:00.097 SO libspdk_bdev_lvol.so.6.0 00:02:00.097 SO libspdk_bdev_aio.so.6.0 00:02:00.097 SYMLINK libspdk_bdev_zone_block.so 00:02:00.097 SYMLINK libspdk_bdev_aio.so 00:02:00.097 SYMLINK libspdk_bdev_lvol.so 00:02:00.097 SYMLINK libspdk_bdev_iscsi.so 00:02:00.097 SYMLINK libspdk_bdev_virtio.so 00:02:00.356 LIB libspdk_bdev_raid.a 00:02:00.616 SO libspdk_bdev_raid.so.6.0 00:02:00.616 SYMLINK libspdk_bdev_raid.so 00:02:01.555 LIB libspdk_bdev_nvme.a 00:02:01.555 SO libspdk_bdev_nvme.so.7.0 00:02:01.555 SYMLINK libspdk_bdev_nvme.so 00:02:02.494 CC module/event/subsystems/vmd/vmd.o 00:02:02.494 CC module/event/subsystems/iobuf/iobuf.o 00:02:02.494 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:02.494 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:02.494 CC module/event/subsystems/scheduler/scheduler.o 00:02:02.494 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:02.494 CC module/event/subsystems/fsdev/fsdev.o 00:02:02.494 CC module/event/subsystems/keyring/keyring.o 00:02:02.494 CC module/event/subsystems/sock/sock.o 00:02:02.494 LIB libspdk_event_vhost_blk.a 00:02:02.494 LIB libspdk_event_scheduler.a 00:02:02.494 LIB libspdk_event_fsdev.a 00:02:02.494 LIB libspdk_event_vmd.a 00:02:02.494 LIB libspdk_event_keyring.a 00:02:02.494 LIB libspdk_event_iobuf.a 00:02:02.494 LIB libspdk_event_sock.a 00:02:02.494 SO libspdk_event_vhost_blk.so.3.0 00:02:02.494 SO libspdk_event_scheduler.so.4.0 00:02:02.494 SO libspdk_event_vmd.so.6.0 00:02:02.494 SO libspdk_event_fsdev.so.1.0 00:02:02.494 SO libspdk_event_keyring.so.1.0 00:02:02.494 SO libspdk_event_iobuf.so.3.0 00:02:02.494 SO libspdk_event_sock.so.5.0 00:02:02.494 SYMLINK libspdk_event_vhost_blk.so 00:02:02.494 SYMLINK libspdk_event_keyring.so 00:02:02.494 SYMLINK libspdk_event_scheduler.so 00:02:02.494 SYMLINK libspdk_event_fsdev.so 00:02:02.494 SYMLINK libspdk_event_iobuf.so 00:02:02.494 SYMLINK libspdk_event_vmd.so 00:02:02.494 SYMLINK libspdk_event_sock.so 00:02:03.064 CC module/event/subsystems/accel/accel.o 00:02:03.064 LIB libspdk_event_accel.a 00:02:03.064 SO libspdk_event_accel.so.6.0 00:02:03.064 SYMLINK libspdk_event_accel.so 00:02:03.631 CC module/event/subsystems/bdev/bdev.o 00:02:03.631 LIB libspdk_event_bdev.a 00:02:03.631 SO libspdk_event_bdev.so.6.0 00:02:03.891 SYMLINK libspdk_event_bdev.so 00:02:04.150 CC module/event/subsystems/scsi/scsi.o 00:02:04.150 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:04.150 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:04.150 CC module/event/subsystems/nbd/nbd.o 00:02:04.150 CC module/event/subsystems/ublk/ublk.o 00:02:04.407 LIB libspdk_event_nbd.a 00:02:04.408 LIB libspdk_event_ublk.a 00:02:04.408 
LIB libspdk_event_scsi.a 00:02:04.408 SO libspdk_event_scsi.so.6.0 00:02:04.408 SO libspdk_event_nbd.so.6.0 00:02:04.408 SO libspdk_event_ublk.so.3.0 00:02:04.408 LIB libspdk_event_nvmf.a 00:02:04.408 SYMLINK libspdk_event_scsi.so 00:02:04.408 SYMLINK libspdk_event_nbd.so 00:02:04.408 SYMLINK libspdk_event_ublk.so 00:02:04.408 SO libspdk_event_nvmf.so.6.0 00:02:04.408 SYMLINK libspdk_event_nvmf.so 00:02:04.667 CC module/event/subsystems/iscsi/iscsi.o 00:02:04.667 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:04.996 LIB libspdk_event_iscsi.a 00:02:04.996 LIB libspdk_event_vhost_scsi.a 00:02:04.996 SO libspdk_event_vhost_scsi.so.3.0 00:02:04.996 SO libspdk_event_iscsi.so.6.0 00:02:04.996 SYMLINK libspdk_event_vhost_scsi.so 00:02:04.996 SYMLINK libspdk_event_iscsi.so 00:02:05.256 SO libspdk.so.6.0 00:02:05.256 SYMLINK libspdk.so 00:02:05.515 CC app/trace_record/trace_record.o 00:02:05.515 CXX app/trace/trace.o 00:02:05.515 CC app/spdk_nvme_identify/identify.o 00:02:05.515 CC app/spdk_top/spdk_top.o 00:02:05.515 TEST_HEADER include/spdk/accel.h 00:02:05.515 TEST_HEADER include/spdk/accel_module.h 00:02:05.515 TEST_HEADER include/spdk/barrier.h 00:02:05.515 TEST_HEADER include/spdk/assert.h 00:02:05.515 TEST_HEADER include/spdk/base64.h 00:02:05.515 TEST_HEADER include/spdk/bdev.h 00:02:05.515 TEST_HEADER include/spdk/bdev_module.h 00:02:05.515 TEST_HEADER include/spdk/bdev_zone.h 00:02:05.515 CC app/spdk_lspci/spdk_lspci.o 00:02:05.515 TEST_HEADER include/spdk/bit_array.h 00:02:05.515 TEST_HEADER include/spdk/bit_pool.h 00:02:05.515 CC test/rpc_client/rpc_client_test.o 00:02:05.515 TEST_HEADER include/spdk/blob_bdev.h 00:02:05.515 TEST_HEADER include/spdk/blobfs.h 00:02:05.515 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:05.515 TEST_HEADER include/spdk/blob.h 00:02:05.515 TEST_HEADER include/spdk/conf.h 00:02:05.515 CC app/spdk_nvme_perf/perf.o 00:02:05.515 TEST_HEADER include/spdk/config.h 00:02:05.515 TEST_HEADER include/spdk/crc16.h 00:02:05.515 TEST_HEADER include/spdk/cpuset.h 00:02:05.515 TEST_HEADER include/spdk/crc64.h 00:02:05.515 TEST_HEADER include/spdk/dif.h 00:02:05.515 TEST_HEADER include/spdk/crc32.h 00:02:05.515 CC app/spdk_nvme_discover/discovery_aer.o 00:02:05.515 TEST_HEADER include/spdk/dma.h 00:02:05.515 TEST_HEADER include/spdk/endian.h 00:02:05.515 TEST_HEADER include/spdk/env_dpdk.h 00:02:05.515 TEST_HEADER include/spdk/env.h 00:02:05.515 TEST_HEADER include/spdk/fd_group.h 00:02:05.515 TEST_HEADER include/spdk/event.h 00:02:05.515 TEST_HEADER include/spdk/fd.h 00:02:05.515 TEST_HEADER include/spdk/file.h 00:02:05.515 TEST_HEADER include/spdk/fsdev_module.h 00:02:05.515 TEST_HEADER include/spdk/fsdev.h 00:02:05.515 TEST_HEADER include/spdk/ftl.h 00:02:05.515 TEST_HEADER include/spdk/fuse_dispatcher.h 00:02:05.515 TEST_HEADER include/spdk/gpt_spec.h 00:02:05.515 TEST_HEADER include/spdk/hexlify.h 00:02:05.515 TEST_HEADER include/spdk/idxd.h 00:02:05.515 TEST_HEADER include/spdk/histogram_data.h 00:02:05.515 TEST_HEADER include/spdk/idxd_spec.h 00:02:05.515 TEST_HEADER include/spdk/init.h 00:02:05.515 TEST_HEADER include/spdk/ioat_spec.h 00:02:05.515 TEST_HEADER include/spdk/ioat.h 00:02:05.515 TEST_HEADER include/spdk/iscsi_spec.h 00:02:05.515 TEST_HEADER include/spdk/json.h 00:02:05.515 TEST_HEADER include/spdk/jsonrpc.h 00:02:05.515 CC app/nvmf_tgt/nvmf_main.o 00:02:05.515 TEST_HEADER include/spdk/keyring.h 00:02:05.515 TEST_HEADER include/spdk/keyring_module.h 00:02:05.515 TEST_HEADER include/spdk/likely.h 00:02:05.515 TEST_HEADER include/spdk/log.h 
00:02:05.515 TEST_HEADER include/spdk/lvol.h 00:02:05.783 TEST_HEADER include/spdk/memory.h 00:02:05.783 TEST_HEADER include/spdk/md5.h 00:02:05.783 TEST_HEADER include/spdk/mmio.h 00:02:05.783 TEST_HEADER include/spdk/nbd.h 00:02:05.783 TEST_HEADER include/spdk/net.h 00:02:05.783 TEST_HEADER include/spdk/notify.h 00:02:05.783 TEST_HEADER include/spdk/nvme.h 00:02:05.783 TEST_HEADER include/spdk/nvme_intel.h 00:02:05.783 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:05.783 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:05.783 TEST_HEADER include/spdk/nvme_spec.h 00:02:05.783 TEST_HEADER include/spdk/nvme_zns.h 00:02:05.783 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:05.783 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:05.783 CC app/iscsi_tgt/iscsi_tgt.o 00:02:05.783 TEST_HEADER include/spdk/nvmf_spec.h 00:02:05.783 TEST_HEADER include/spdk/nvmf.h 00:02:05.783 TEST_HEADER include/spdk/nvmf_transport.h 00:02:05.783 TEST_HEADER include/spdk/opal.h 00:02:05.783 TEST_HEADER include/spdk/opal_spec.h 00:02:05.783 TEST_HEADER include/spdk/pci_ids.h 00:02:05.783 TEST_HEADER include/spdk/pipe.h 00:02:05.783 TEST_HEADER include/spdk/queue.h 00:02:05.783 TEST_HEADER include/spdk/reduce.h 00:02:05.783 TEST_HEADER include/spdk/rpc.h 00:02:05.783 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:05.783 TEST_HEADER include/spdk/scheduler.h 00:02:05.783 TEST_HEADER include/spdk/scsi.h 00:02:05.783 TEST_HEADER include/spdk/scsi_spec.h 00:02:05.783 TEST_HEADER include/spdk/sock.h 00:02:05.783 TEST_HEADER include/spdk/stdinc.h 00:02:05.783 TEST_HEADER include/spdk/string.h 00:02:05.783 TEST_HEADER include/spdk/thread.h 00:02:05.783 CC app/spdk_dd/spdk_dd.o 00:02:05.783 TEST_HEADER include/spdk/trace.h 00:02:05.783 TEST_HEADER include/spdk/trace_parser.h 00:02:05.783 TEST_HEADER include/spdk/tree.h 00:02:05.783 TEST_HEADER include/spdk/ublk.h 00:02:05.783 TEST_HEADER include/spdk/util.h 00:02:05.783 TEST_HEADER include/spdk/uuid.h 00:02:05.783 TEST_HEADER include/spdk/version.h 00:02:05.783 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:05.783 TEST_HEADER include/spdk/vhost.h 00:02:05.783 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:05.783 TEST_HEADER include/spdk/vmd.h 00:02:05.783 TEST_HEADER include/spdk/xor.h 00:02:05.783 TEST_HEADER include/spdk/zipf.h 00:02:05.783 CXX test/cpp_headers/accel.o 00:02:05.783 CXX test/cpp_headers/accel_module.o 00:02:05.783 CXX test/cpp_headers/assert.o 00:02:05.783 CXX test/cpp_headers/barrier.o 00:02:05.783 CXX test/cpp_headers/base64.o 00:02:05.783 CXX test/cpp_headers/bdev.o 00:02:05.783 CXX test/cpp_headers/bdev_module.o 00:02:05.783 CXX test/cpp_headers/bdev_zone.o 00:02:05.783 CXX test/cpp_headers/bit_array.o 00:02:05.783 CXX test/cpp_headers/bit_pool.o 00:02:05.783 CXX test/cpp_headers/blob_bdev.o 00:02:05.783 CXX test/cpp_headers/blobfs_bdev.o 00:02:05.783 CXX test/cpp_headers/blobfs.o 00:02:05.783 CXX test/cpp_headers/blob.o 00:02:05.783 CXX test/cpp_headers/conf.o 00:02:05.783 CXX test/cpp_headers/config.o 00:02:05.783 CXX test/cpp_headers/cpuset.o 00:02:05.783 CXX test/cpp_headers/crc16.o 00:02:05.783 CXX test/cpp_headers/crc64.o 00:02:05.783 CXX test/cpp_headers/dif.o 00:02:05.783 CXX test/cpp_headers/crc32.o 00:02:05.783 CXX test/cpp_headers/dma.o 00:02:05.783 CXX test/cpp_headers/event.o 00:02:05.783 CXX test/cpp_headers/env.o 00:02:05.783 CXX test/cpp_headers/endian.o 00:02:05.783 CXX test/cpp_headers/env_dpdk.o 00:02:05.783 CXX test/cpp_headers/fd.o 00:02:05.783 CXX test/cpp_headers/fd_group.o 00:02:05.783 CXX test/cpp_headers/fsdev.o 00:02:05.783 CXX 
test/cpp_headers/file.o 00:02:05.783 CXX test/cpp_headers/fsdev_module.o 00:02:05.783 CXX test/cpp_headers/fuse_dispatcher.o 00:02:05.783 CXX test/cpp_headers/ftl.o 00:02:05.783 CXX test/cpp_headers/gpt_spec.o 00:02:05.783 CXX test/cpp_headers/histogram_data.o 00:02:05.783 CXX test/cpp_headers/idxd_spec.o 00:02:05.783 CXX test/cpp_headers/hexlify.o 00:02:05.783 CXX test/cpp_headers/init.o 00:02:05.783 CXX test/cpp_headers/idxd.o 00:02:05.783 CXX test/cpp_headers/ioat.o 00:02:05.783 CXX test/cpp_headers/ioat_spec.o 00:02:05.783 CXX test/cpp_headers/iscsi_spec.o 00:02:05.783 CC app/spdk_tgt/spdk_tgt.o 00:02:05.783 CXX test/cpp_headers/json.o 00:02:05.783 CC examples/ioat/perf/perf.o 00:02:05.783 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:05.783 CC test/env/pci/pci_ut.o 00:02:05.783 CC examples/util/zipf/zipf.o 00:02:05.783 CC examples/ioat/verify/verify.o 00:02:05.783 CC test/thread/poller_perf/poller_perf.o 00:02:05.783 CC test/env/memory/memory_ut.o 00:02:05.783 CC app/fio/nvme/fio_plugin.o 00:02:05.783 CC test/app/stub/stub.o 00:02:05.783 CC test/app/histogram_perf/histogram_perf.o 00:02:05.783 CC test/app/jsoncat/jsoncat.o 00:02:05.783 CC test/env/vtophys/vtophys.o 00:02:05.783 CC app/fio/bdev/fio_plugin.o 00:02:05.783 CC test/app/bdev_svc/bdev_svc.o 00:02:05.783 CC test/dma/test_dma/test_dma.o 00:02:06.044 LINK spdk_lspci 00:02:06.044 LINK rpc_client_test 00:02:06.044 LINK nvmf_tgt 00:02:06.044 CC test/env/mem_callbacks/mem_callbacks.o 00:02:06.044 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:06.044 LINK spdk_trace_record 00:02:06.044 LINK interrupt_tgt 00:02:06.044 LINK spdk_nvme_discover 00:02:06.044 LINK env_dpdk_post_init 00:02:06.044 LINK jsoncat 00:02:06.044 LINK zipf 00:02:06.306 CXX test/cpp_headers/jsonrpc.o 00:02:06.306 CXX test/cpp_headers/keyring.o 00:02:06.306 CXX test/cpp_headers/keyring_module.o 00:02:06.306 CXX test/cpp_headers/likely.o 00:02:06.306 CXX test/cpp_headers/log.o 00:02:06.306 CXX test/cpp_headers/lvol.o 00:02:06.306 CXX test/cpp_headers/md5.o 00:02:06.306 CXX test/cpp_headers/memory.o 00:02:06.306 LINK spdk_tgt 00:02:06.306 CXX test/cpp_headers/mmio.o 00:02:06.306 CXX test/cpp_headers/nbd.o 00:02:06.306 LINK iscsi_tgt 00:02:06.306 CXX test/cpp_headers/net.o 00:02:06.306 LINK stub 00:02:06.306 CXX test/cpp_headers/notify.o 00:02:06.306 CXX test/cpp_headers/nvme.o 00:02:06.306 CXX test/cpp_headers/nvme_intel.o 00:02:06.306 CXX test/cpp_headers/nvme_ocssd.o 00:02:06.306 LINK poller_perf 00:02:06.306 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:06.306 LINK ioat_perf 00:02:06.306 LINK histogram_perf 00:02:06.306 CXX test/cpp_headers/nvme_spec.o 00:02:06.306 CXX test/cpp_headers/nvme_zns.o 00:02:06.306 LINK vtophys 00:02:06.306 CXX test/cpp_headers/nvmf_cmd.o 00:02:06.306 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:06.306 CXX test/cpp_headers/nvmf.o 00:02:06.306 CXX test/cpp_headers/nvmf_spec.o 00:02:06.306 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:06.306 CXX test/cpp_headers/nvmf_transport.o 00:02:06.306 LINK bdev_svc 00:02:06.306 CXX test/cpp_headers/opal.o 00:02:06.306 CXX test/cpp_headers/opal_spec.o 00:02:06.306 CXX test/cpp_headers/pci_ids.o 00:02:06.306 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:06.306 CXX test/cpp_headers/pipe.o 00:02:06.306 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:06.306 CXX test/cpp_headers/queue.o 00:02:06.306 CXX test/cpp_headers/reduce.o 00:02:06.306 CXX test/cpp_headers/rpc.o 00:02:06.306 CXX test/cpp_headers/scheduler.o 00:02:06.306 CXX test/cpp_headers/scsi.o 00:02:06.306 CXX 
test/cpp_headers/scsi_spec.o 00:02:06.306 CXX test/cpp_headers/sock.o 00:02:06.306 CXX test/cpp_headers/stdinc.o 00:02:06.307 CXX test/cpp_headers/string.o 00:02:06.307 CXX test/cpp_headers/thread.o 00:02:06.307 CXX test/cpp_headers/trace.o 00:02:06.307 CXX test/cpp_headers/trace_parser.o 00:02:06.307 LINK verify 00:02:06.307 CXX test/cpp_headers/tree.o 00:02:06.307 CXX test/cpp_headers/ublk.o 00:02:06.307 CXX test/cpp_headers/util.o 00:02:06.307 CXX test/cpp_headers/uuid.o 00:02:06.307 LINK spdk_dd 00:02:06.307 CXX test/cpp_headers/version.o 00:02:06.566 CXX test/cpp_headers/vfio_user_pci.o 00:02:06.566 CXX test/cpp_headers/vfio_user_spec.o 00:02:06.566 CXX test/cpp_headers/vhost.o 00:02:06.566 CXX test/cpp_headers/vmd.o 00:02:06.566 CXX test/cpp_headers/xor.o 00:02:06.566 CXX test/cpp_headers/zipf.o 00:02:06.566 LINK pci_ut 00:02:06.566 LINK spdk_trace 00:02:06.566 LINK nvme_fuzz 00:02:06.825 LINK spdk_nvme 00:02:06.825 CC examples/vmd/lsvmd/lsvmd.o 00:02:06.825 CC examples/vmd/led/led.o 00:02:06.825 CC test/event/reactor/reactor.o 00:02:06.825 CC test/event/reactor_perf/reactor_perf.o 00:02:06.825 CC test/event/event_perf/event_perf.o 00:02:06.825 CC examples/sock/hello_world/hello_sock.o 00:02:06.825 CC examples/idxd/perf/perf.o 00:02:06.825 CC test/event/app_repeat/app_repeat.o 00:02:06.825 CC test/event/scheduler/scheduler.o 00:02:06.825 LINK spdk_bdev 00:02:06.825 CC examples/thread/thread/thread_ex.o 00:02:06.825 LINK spdk_nvme_perf 00:02:06.825 LINK test_dma 00:02:06.825 LINK spdk_nvme_identify 00:02:06.825 LINK lsvmd 00:02:06.825 LINK mem_callbacks 00:02:06.825 LINK spdk_top 00:02:06.825 LINK reactor_perf 00:02:06.825 LINK reactor 00:02:06.825 LINK led 00:02:06.825 LINK event_perf 00:02:06.825 LINK app_repeat 00:02:07.084 LINK vhost_fuzz 00:02:07.084 LINK hello_sock 00:02:07.084 LINK scheduler 00:02:07.084 CC app/vhost/vhost.o 00:02:07.084 LINK thread 00:02:07.084 LINK idxd_perf 00:02:07.342 LINK vhost 00:02:07.342 LINK memory_ut 00:02:07.342 CC test/nvme/boot_partition/boot_partition.o 00:02:07.342 CC test/nvme/err_injection/err_injection.o 00:02:07.342 CC test/nvme/sgl/sgl.o 00:02:07.342 CC test/nvme/aer/aer.o 00:02:07.342 CC test/nvme/simple_copy/simple_copy.o 00:02:07.342 CC test/nvme/fdp/fdp.o 00:02:07.342 CC test/nvme/overhead/overhead.o 00:02:07.342 CC test/nvme/cuse/cuse.o 00:02:07.342 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:07.342 CC test/nvme/reserve/reserve.o 00:02:07.342 CC test/nvme/startup/startup.o 00:02:07.342 CC test/nvme/compliance/nvme_compliance.o 00:02:07.342 CC test/nvme/fused_ordering/fused_ordering.o 00:02:07.342 CC test/nvme/connect_stress/connect_stress.o 00:02:07.342 CC test/nvme/e2edp/nvme_dp.o 00:02:07.342 CC test/nvme/reset/reset.o 00:02:07.342 CC test/accel/dif/dif.o 00:02:07.342 CC test/blobfs/mkfs/mkfs.o 00:02:07.342 CC examples/nvme/arbitration/arbitration.o 00:02:07.342 CC examples/nvme/abort/abort.o 00:02:07.342 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:07.342 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:07.342 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:07.342 CC examples/nvme/hotplug/hotplug.o 00:02:07.342 CC examples/nvme/hello_world/hello_world.o 00:02:07.342 CC examples/nvme/reconnect/reconnect.o 00:02:07.342 CC test/lvol/esnap/esnap.o 00:02:07.601 CC examples/accel/perf/accel_perf.o 00:02:07.601 CC examples/fsdev/hello_world/hello_fsdev.o 00:02:07.601 CC examples/blob/cli/blobcli.o 00:02:07.601 CC examples/blob/hello_world/hello_blob.o 00:02:07.601 LINK boot_partition 00:02:07.601 LINK doorbell_aers 00:02:07.601 
LINK err_injection 00:02:07.601 LINK connect_stress 00:02:07.601 LINK fused_ordering 00:02:07.601 LINK startup 00:02:07.601 LINK pmr_persistence 00:02:07.601 LINK reserve 00:02:07.601 LINK sgl 00:02:07.601 LINK hello_world 00:02:07.601 LINK simple_copy 00:02:07.601 LINK cmb_copy 00:02:07.601 LINK aer 00:02:07.601 LINK mkfs 00:02:07.601 LINK reset 00:02:07.601 LINK overhead 00:02:07.601 LINK nvme_dp 00:02:07.601 LINK hotplug 00:02:07.601 LINK fdp 00:02:07.601 LINK nvme_compliance 00:02:07.860 LINK arbitration 00:02:07.860 LINK reconnect 00:02:07.860 LINK abort 00:02:07.860 LINK hello_blob 00:02:07.860 LINK hello_fsdev 00:02:07.860 LINK nvme_manage 00:02:07.860 LINK iscsi_fuzz 00:02:07.860 LINK accel_perf 00:02:08.119 LINK blobcli 00:02:08.119 LINK dif 00:02:08.378 LINK cuse 00:02:08.378 CC examples/bdev/hello_world/hello_bdev.o 00:02:08.378 CC examples/bdev/bdevperf/bdevperf.o 00:02:08.637 CC test/bdev/bdevio/bdevio.o 00:02:08.637 LINK hello_bdev 00:02:08.897 LINK bdevio 00:02:09.156 LINK bdevperf 00:02:09.724 CC examples/nvmf/nvmf/nvmf.o 00:02:09.982 LINK nvmf 00:02:11.362 LINK esnap 00:02:11.362 00:02:11.362 real 0m58.763s 00:02:11.362 user 8m10.310s 00:02:11.362 sys 3m23.944s 00:02:11.362 15:07:13 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:11.362 15:07:13 make -- common/autotest_common.sh@10 -- $ set +x 00:02:11.362 ************************************ 00:02:11.362 END TEST make 00:02:11.362 ************************************ 00:02:11.362 15:07:13 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:11.362 15:07:13 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:11.362 15:07:13 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:02:11.362 15:07:13 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.362 15:07:13 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:11.362 15:07:13 -- pm/common@44 -- $ pid=1566728 00:02:11.362 15:07:13 -- pm/common@50 -- $ kill -TERM 1566728 00:02:11.362 15:07:13 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.362 15:07:13 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:11.362 15:07:13 -- pm/common@44 -- $ pid=1566730 00:02:11.362 15:07:13 -- pm/common@50 -- $ kill -TERM 1566730 00:02:11.362 15:07:13 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.362 15:07:13 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:11.362 15:07:13 -- pm/common@44 -- $ pid=1566731 00:02:11.362 15:07:13 -- pm/common@50 -- $ kill -TERM 1566731 00:02:11.362 15:07:13 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.362 15:07:13 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:11.362 15:07:13 -- pm/common@44 -- $ pid=1566755 00:02:11.362 15:07:13 -- pm/common@50 -- $ sudo -E kill -TERM 1566755 00:02:11.622 15:07:13 -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:02:11.622 15:07:13 -- common/autotest_common.sh@1681 -- # lcov --version 00:02:11.622 15:07:13 -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:02:11.622 15:07:13 -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:02:11.622 15:07:13 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:02:11.622 15:07:13 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:02:11.622 15:07:13 -- scripts/common.sh@334 -- # local 
ver2 ver2_l 00:02:11.622 15:07:13 -- scripts/common.sh@336 -- # IFS=.-: 00:02:11.622 15:07:13 -- scripts/common.sh@336 -- # read -ra ver1 00:02:11.622 15:07:13 -- scripts/common.sh@337 -- # IFS=.-: 00:02:11.622 15:07:13 -- scripts/common.sh@337 -- # read -ra ver2 00:02:11.622 15:07:13 -- scripts/common.sh@338 -- # local 'op=<' 00:02:11.622 15:07:13 -- scripts/common.sh@340 -- # ver1_l=2 00:02:11.622 15:07:13 -- scripts/common.sh@341 -- # ver2_l=1 00:02:11.622 15:07:13 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:02:11.622 15:07:13 -- scripts/common.sh@344 -- # case "$op" in 00:02:11.622 15:07:13 -- scripts/common.sh@345 -- # : 1 00:02:11.622 15:07:13 -- scripts/common.sh@364 -- # (( v = 0 )) 00:02:11.622 15:07:13 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:11.622 15:07:13 -- scripts/common.sh@365 -- # decimal 1 00:02:11.622 15:07:13 -- scripts/common.sh@353 -- # local d=1 00:02:11.622 15:07:13 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:11.622 15:07:13 -- scripts/common.sh@355 -- # echo 1 00:02:11.622 15:07:13 -- scripts/common.sh@365 -- # ver1[v]=1 00:02:11.622 15:07:13 -- scripts/common.sh@366 -- # decimal 2 00:02:11.622 15:07:13 -- scripts/common.sh@353 -- # local d=2 00:02:11.622 15:07:13 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:11.622 15:07:13 -- scripts/common.sh@355 -- # echo 2 00:02:11.622 15:07:13 -- scripts/common.sh@366 -- # ver2[v]=2 00:02:11.622 15:07:13 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:02:11.622 15:07:13 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:02:11.622 15:07:13 -- scripts/common.sh@368 -- # return 0 00:02:11.622 15:07:13 -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:11.622 15:07:13 -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:02:11.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:11.622 --rc genhtml_branch_coverage=1 00:02:11.622 --rc genhtml_function_coverage=1 00:02:11.622 --rc genhtml_legend=1 00:02:11.622 --rc geninfo_all_blocks=1 00:02:11.622 --rc geninfo_unexecuted_blocks=1 00:02:11.622 00:02:11.622 ' 00:02:11.622 15:07:13 -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:02:11.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:11.622 --rc genhtml_branch_coverage=1 00:02:11.622 --rc genhtml_function_coverage=1 00:02:11.622 --rc genhtml_legend=1 00:02:11.622 --rc geninfo_all_blocks=1 00:02:11.622 --rc geninfo_unexecuted_blocks=1 00:02:11.622 00:02:11.622 ' 00:02:11.622 15:07:13 -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:02:11.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:11.622 --rc genhtml_branch_coverage=1 00:02:11.622 --rc genhtml_function_coverage=1 00:02:11.622 --rc genhtml_legend=1 00:02:11.622 --rc geninfo_all_blocks=1 00:02:11.622 --rc geninfo_unexecuted_blocks=1 00:02:11.622 00:02:11.622 ' 00:02:11.622 15:07:13 -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:02:11.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:11.622 --rc genhtml_branch_coverage=1 00:02:11.622 --rc genhtml_function_coverage=1 00:02:11.622 --rc genhtml_legend=1 00:02:11.622 --rc geninfo_all_blocks=1 00:02:11.622 --rc geninfo_unexecuted_blocks=1 00:02:11.622 00:02:11.622 ' 00:02:11.622 15:07:13 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:02:11.622 15:07:13 -- nvmf/common.sh@7 -- # uname -s 00:02:11.622 15:07:13 -- nvmf/common.sh@7 -- 
# [[ Linux == FreeBSD ]] 00:02:11.622 15:07:13 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:11.622 15:07:13 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:11.622 15:07:13 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:11.622 15:07:13 -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:11.622 15:07:13 -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:02:11.623 15:07:13 -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:11.623 15:07:13 -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:02:11.623 15:07:13 -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:02:11.623 15:07:13 -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:02:11.623 15:07:13 -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:11.623 15:07:13 -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:02:11.623 15:07:13 -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:02:11.623 15:07:13 -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:11.623 15:07:13 -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:02:11.623 15:07:13 -- scripts/common.sh@15 -- # shopt -s extglob 00:02:11.623 15:07:13 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:11.623 15:07:13 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:11.623 15:07:13 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:11.623 15:07:13 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.623 15:07:13 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.623 15:07:13 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.623 15:07:13 -- paths/export.sh@5 -- # export PATH 00:02:11.623 15:07:13 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:11.623 15:07:13 -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:02:11.623 15:07:13 -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:02:11.623 15:07:13 -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:02:11.623 15:07:13 -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:02:11.623 15:07:13 -- nvmf/common.sh@50 -- # : 0 00:02:11.623 15:07:13 -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:02:11.623 15:07:13 -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:02:11.623 15:07:13 -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:02:11.623 15:07:13 -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 
00:02:11.623 15:07:13 -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:11.623 15:07:13 -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:02:11.623 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:02:11.623 15:07:13 -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:02:11.623 15:07:13 -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:02:11.623 15:07:13 -- nvmf/common.sh@54 -- # have_pci_nics=0 00:02:11.623 15:07:13 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:11.623 15:07:13 -- spdk/autotest.sh@32 -- # uname -s 00:02:11.623 15:07:13 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:11.623 15:07:13 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:11.623 15:07:13 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/coredumps 00:02:11.623 15:07:13 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:11.623 15:07:13 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/coredumps 00:02:11.623 15:07:13 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:11.623 15:07:13 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:11.623 15:07:13 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:11.623 15:07:13 -- spdk/autotest.sh@48 -- # udevadm_pid=1627077 00:02:11.623 15:07:13 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:11.623 15:07:13 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:11.623 15:07:13 -- pm/common@17 -- # local monitor 00:02:11.623 15:07:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.623 15:07:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.623 15:07:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.623 15:07:13 -- pm/common@21 -- # date +%s 00:02:11.623 15:07:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.623 15:07:13 -- pm/common@21 -- # date +%s 00:02:11.623 15:07:13 -- pm/common@25 -- # sleep 1 00:02:11.623 15:07:13 -- pm/common@21 -- # date +%s 00:02:11.623 15:07:13 -- pm/common@21 -- # date +%s 00:02:11.623 15:07:13 -- pm/common@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1727442433 00:02:11.623 15:07:13 -- pm/common@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1727442433 00:02:11.623 15:07:13 -- pm/common@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1727442433 00:02:11.623 15:07:13 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1727442433 00:02:11.882 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autotest.sh.1727442433_collect-vmstat.pm.log 00:02:11.882 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autotest.sh.1727442433_collect-cpu-load.pm.log 00:02:11.882 Redirecting to 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autotest.sh.1727442433_collect-cpu-temp.pm.log 00:02:11.882 Redirecting to /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power/monitor.autotest.sh.1727442433_collect-bmc-pm.bmc.pm.log 00:02:12.820 15:07:14 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:12.820 15:07:14 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:12.820 15:07:14 -- common/autotest_common.sh@724 -- # xtrace_disable 00:02:12.820 15:07:14 -- common/autotest_common.sh@10 -- # set +x 00:02:12.820 15:07:14 -- spdk/autotest.sh@59 -- # create_test_list 00:02:12.820 15:07:14 -- common/autotest_common.sh@748 -- # xtrace_disable 00:02:12.820 15:07:14 -- common/autotest_common.sh@10 -- # set +x 00:02:12.820 15:07:14 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/nvmf-phy-autotest/spdk/autotest.sh 00:02:12.820 15:07:14 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:02:12.820 15:07:14 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/nvmf-phy-autotest/spdk 00:02:12.820 15:07:14 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/nvmf-phy-autotest/spdk/../output 00:02:12.820 15:07:14 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:02:12.820 15:07:14 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:12.820 15:07:14 -- common/autotest_common.sh@1455 -- # uname 00:02:12.820 15:07:14 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:02:12.820 15:07:14 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:12.820 15:07:14 -- common/autotest_common.sh@1475 -- # uname 00:02:12.820 15:07:14 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:02:12.820 15:07:14 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:02:12.820 15:07:14 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --version 00:02:12.820 lcov: LCOV version 1.15 00:02:12.820 15:07:14 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/nvmf-phy-autotest/spdk -o /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/cov_base.info 00:02:25.030 /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:25.031 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:02:39.911 15:07:39 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:02:39.911 15:07:39 -- common/autotest_common.sh@724 -- # xtrace_disable 00:02:39.911 15:07:39 -- common/autotest_common.sh@10 -- # set +x 00:02:39.911 15:07:39 -- spdk/autotest.sh@78 -- # rm -f 00:02:39.911 15:07:39 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh reset 00:02:41.291 0000:5e:00.0 (144d a80a): Already using the nvme driver 00:02:41.551 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:00:04.4 (8086 2021): Already using the ioatdma driver 
00:02:41.551 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:41.551 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:41.811 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:42.071 15:07:43 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:02:42.071 15:07:43 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:02:42.071 15:07:43 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:02:42.071 15:07:43 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:02:42.071 15:07:43 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:02:42.071 15:07:43 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:02:42.071 15:07:43 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:02:42.071 15:07:43 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:42.071 15:07:43 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:02:42.071 15:07:43 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:02:42.071 15:07:43 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:02:42.071 15:07:43 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:02:42.071 15:07:43 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:02:42.071 15:07:43 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:02:42.071 15:07:43 -- scripts/common.sh@390 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:42.071 No valid GPT data, bailing 00:02:42.071 15:07:43 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:42.071 15:07:43 -- scripts/common.sh@394 -- # pt= 00:02:42.071 15:07:43 -- scripts/common.sh@395 -- # return 1 00:02:42.071 15:07:43 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:42.071 1+0 records in 00:02:42.071 1+0 records out 00:02:42.071 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00198231 s, 529 MB/s 00:02:42.071 15:07:43 -- spdk/autotest.sh@105 -- # sync 00:02:42.071 15:07:43 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:42.071 15:07:43 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:42.071 15:07:43 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:47.347 15:07:49 -- spdk/autotest.sh@111 -- # uname -s 00:02:47.347 15:07:49 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:02:47.347 15:07:49 -- spdk/autotest.sh@111 -- # [[ 0 -eq 1 ]] 00:02:47.347 15:07:49 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh status 00:02:50.638 Hugepages 00:02:50.638 node hugesize free / total 00:02:50.638 node0 1048576kB 0 / 0 00:02:50.638 node0 2048kB 0 / 0 00:02:50.638 node1 1048576kB 0 / 0 00:02:50.638 node1 2048kB 0 / 0 00:02:50.638 00:02:50.638 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:50.638 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 
00:02:50.638 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:02:50.638 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:02:50.638 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:02:50.638 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:02:50.638 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:02:50.638 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:02:50.638 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:02:50.898 NVMe 0000:5e:00.0 144d a80a 0 nvme nvme0 nvme0n1 00:02:50.898 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:02:50.898 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:02:50.898 15:07:52 -- spdk/autotest.sh@117 -- # uname -s 00:02:50.898 15:07:52 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:02:50.898 15:07:52 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:02:50.898 15:07:52 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh 00:02:54.194 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:02:54.194 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:02:55.746 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:02:56.013 15:07:57 -- common/autotest_common.sh@1515 -- # sleep 1 00:02:56.948 15:07:58 -- common/autotest_common.sh@1516 -- # bdfs=() 00:02:56.948 15:07:58 -- common/autotest_common.sh@1516 -- # local bdfs 00:02:56.948 15:07:58 -- common/autotest_common.sh@1518 -- # bdfs=($(get_nvme_bdfs)) 00:02:56.948 15:07:58 -- common/autotest_common.sh@1518 -- # get_nvme_bdfs 00:02:56.948 15:07:58 -- common/autotest_common.sh@1496 -- # bdfs=() 00:02:56.948 15:07:58 -- common/autotest_common.sh@1496 -- # local bdfs 00:02:56.948 15:07:58 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:02:56.948 15:07:58 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/gen_nvme.sh 00:02:56.948 15:07:58 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:02:57.206 15:07:58 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:02:57.206 15:07:58 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:5e:00.0 00:02:57.206 15:07:58 -- common/autotest_common.sh@1520 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh reset 00:03:00.496 Waiting for block devices as requested 00:03:00.496 0000:5e:00.0 (144d a80a): vfio-pci -> nvme 00:03:00.496 0000:00:04.7 (8086 2021): vfio-pci -> 
ioatdma 00:03:00.496 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:03:00.496 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:03:00.755 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:03:00.755 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:03:00.755 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:03:01.015 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:03:01.015 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:03:01.015 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:03:01.274 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:03:01.274 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:03:01.274 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:03:01.533 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:03:01.533 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:03:01.533 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:03:01.792 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:03:01.792 15:08:03 -- common/autotest_common.sh@1522 -- # for bdf in "${bdfs[@]}" 00:03:01.792 15:08:03 -- common/autotest_common.sh@1523 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1485 -- # readlink -f /sys/class/nvme/nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1485 -- # grep 0000:5e:00.0/nvme/nvme 00:03:01.792 15:08:03 -- common/autotest_common.sh@1485 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1486 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:03:01.792 15:08:03 -- common/autotest_common.sh@1490 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1490 -- # printf '%s\n' nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1523 -- # nvme_ctrlr=/dev/nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1524 -- # [[ -z /dev/nvme0 ]] 00:03:01.792 15:08:03 -- common/autotest_common.sh@1529 -- # nvme id-ctrl /dev/nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1529 -- # grep oacs 00:03:01.792 15:08:03 -- common/autotest_common.sh@1529 -- # cut -d: -f2 00:03:01.792 15:08:03 -- common/autotest_common.sh@1529 -- # oacs=' 0x5f' 00:03:01.792 15:08:03 -- common/autotest_common.sh@1530 -- # oacs_ns_manage=8 00:03:01.792 15:08:03 -- common/autotest_common.sh@1532 -- # [[ 8 -ne 0 ]] 00:03:01.792 15:08:03 -- common/autotest_common.sh@1538 -- # nvme id-ctrl /dev/nvme0 00:03:01.792 15:08:03 -- common/autotest_common.sh@1538 -- # grep unvmcap 00:03:01.792 15:08:03 -- common/autotest_common.sh@1538 -- # cut -d: -f2 00:03:01.792 15:08:03 -- common/autotest_common.sh@1538 -- # unvmcap=' 0' 00:03:01.792 15:08:03 -- common/autotest_common.sh@1539 -- # [[ 0 -eq 0 ]] 00:03:01.792 15:08:03 -- common/autotest_common.sh@1541 -- # continue 00:03:01.792 15:08:03 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:03:01.792 15:08:03 -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:01.792 15:08:03 -- common/autotest_common.sh@10 -- # set +x 00:03:02.051 15:08:03 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:03:02.051 15:08:03 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:02.051 15:08:03 -- common/autotest_common.sh@10 -- # set +x 00:03:02.051 15:08:03 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh 00:03:05.346 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 
0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:03:05.346 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:05.346 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:05.606 15:08:07 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:03:05.606 15:08:07 -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:05.606 15:08:07 -- common/autotest_common.sh@10 -- # set +x 00:03:05.606 15:08:07 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:03:05.606 15:08:07 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:03:05.606 15:08:07 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:03:05.606 15:08:07 -- common/autotest_common.sh@1561 -- # bdfs=() 00:03:05.606 15:08:07 -- common/autotest_common.sh@1561 -- # _bdfs=() 00:03:05.606 15:08:07 -- common/autotest_common.sh@1561 -- # local bdfs _bdfs 00:03:05.606 15:08:07 -- common/autotest_common.sh@1562 -- # _bdfs=($(get_nvme_bdfs)) 00:03:05.606 15:08:07 -- common/autotest_common.sh@1562 -- # get_nvme_bdfs 00:03:05.606 15:08:07 -- common/autotest_common.sh@1496 -- # bdfs=() 00:03:05.606 15:08:07 -- common/autotest_common.sh@1496 -- # local bdfs 00:03:05.606 15:08:07 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:03:05.606 15:08:07 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/gen_nvme.sh 00:03:05.606 15:08:07 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:03:05.606 15:08:07 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:03:05.606 15:08:07 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:5e:00.0 00:03:05.606 15:08:07 -- common/autotest_common.sh@1563 -- # for bdf in "${_bdfs[@]}" 00:03:05.606 15:08:07 -- common/autotest_common.sh@1564 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:03:05.606 15:08:07 -- common/autotest_common.sh@1564 -- # device=0xa80a 00:03:05.606 15:08:07 -- common/autotest_common.sh@1565 -- # [[ 0xa80a == \0\x\0\a\5\4 ]] 00:03:05.606 15:08:07 -- common/autotest_common.sh@1570 -- # (( 0 > 0 )) 00:03:05.606 15:08:07 -- common/autotest_common.sh@1570 -- # return 0 00:03:05.606 15:08:07 -- common/autotest_common.sh@1577 -- # [[ -z '' ]] 00:03:05.606 15:08:07 -- common/autotest_common.sh@1578 -- # return 0 00:03:05.606 15:08:07 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:03:05.606 15:08:07 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:03:05.606 15:08:07 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:03:05.606 15:08:07 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:03:05.606 15:08:07 -- spdk/autotest.sh@149 -- # timing_enter lib 00:03:05.606 15:08:07 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:05.606 15:08:07 -- common/autotest_common.sh@10 -- # set +x 00:03:05.606 15:08:07 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:03:05.606 15:08:07 -- spdk/autotest.sh@155 -- # run_test env 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/env.sh 00:03:05.606 15:08:07 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:05.606 15:08:07 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:05.606 15:08:07 -- common/autotest_common.sh@10 -- # set +x 00:03:05.606 ************************************ 00:03:05.606 START TEST env 00:03:05.606 ************************************ 00:03:05.606 15:08:07 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/env.sh 00:03:05.866 * Looking for test storage... 00:03:05.866 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env 00:03:05.866 15:08:07 env -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:05.866 15:08:07 env -- common/autotest_common.sh@1681 -- # lcov --version 00:03:05.866 15:08:07 env -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:05.866 15:08:07 env -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:05.866 15:08:07 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:05.866 15:08:07 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:05.867 15:08:07 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:05.867 15:08:07 env -- scripts/common.sh@336 -- # IFS=.-: 00:03:05.867 15:08:07 env -- scripts/common.sh@336 -- # read -ra ver1 00:03:05.867 15:08:07 env -- scripts/common.sh@337 -- # IFS=.-: 00:03:05.867 15:08:07 env -- scripts/common.sh@337 -- # read -ra ver2 00:03:05.867 15:08:07 env -- scripts/common.sh@338 -- # local 'op=<' 00:03:05.867 15:08:07 env -- scripts/common.sh@340 -- # ver1_l=2 00:03:05.867 15:08:07 env -- scripts/common.sh@341 -- # ver2_l=1 00:03:05.867 15:08:07 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:05.867 15:08:07 env -- scripts/common.sh@344 -- # case "$op" in 00:03:05.867 15:08:07 env -- scripts/common.sh@345 -- # : 1 00:03:05.867 15:08:07 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:05.867 15:08:07 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:05.867 15:08:07 env -- scripts/common.sh@365 -- # decimal 1 00:03:05.867 15:08:07 env -- scripts/common.sh@353 -- # local d=1 00:03:05.867 15:08:07 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:05.867 15:08:07 env -- scripts/common.sh@355 -- # echo 1 00:03:05.867 15:08:07 env -- scripts/common.sh@365 -- # ver1[v]=1 00:03:05.867 15:08:07 env -- scripts/common.sh@366 -- # decimal 2 00:03:05.867 15:08:07 env -- scripts/common.sh@353 -- # local d=2 00:03:05.867 15:08:07 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:05.867 15:08:07 env -- scripts/common.sh@355 -- # echo 2 00:03:05.867 15:08:07 env -- scripts/common.sh@366 -- # ver2[v]=2 00:03:05.867 15:08:07 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:05.867 15:08:07 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:05.867 15:08:07 env -- scripts/common.sh@368 -- # return 0 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:05.867 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:05.867 --rc genhtml_branch_coverage=1 00:03:05.867 --rc genhtml_function_coverage=1 00:03:05.867 --rc genhtml_legend=1 00:03:05.867 --rc geninfo_all_blocks=1 00:03:05.867 --rc geninfo_unexecuted_blocks=1 00:03:05.867 00:03:05.867 ' 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:05.867 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:05.867 --rc genhtml_branch_coverage=1 00:03:05.867 --rc genhtml_function_coverage=1 00:03:05.867 --rc genhtml_legend=1 00:03:05.867 --rc geninfo_all_blocks=1 00:03:05.867 --rc geninfo_unexecuted_blocks=1 00:03:05.867 00:03:05.867 ' 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:05.867 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:05.867 --rc genhtml_branch_coverage=1 00:03:05.867 --rc genhtml_function_coverage=1 00:03:05.867 --rc genhtml_legend=1 00:03:05.867 --rc geninfo_all_blocks=1 00:03:05.867 --rc geninfo_unexecuted_blocks=1 00:03:05.867 00:03:05.867 ' 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:05.867 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:05.867 --rc genhtml_branch_coverage=1 00:03:05.867 --rc genhtml_function_coverage=1 00:03:05.867 --rc genhtml_legend=1 00:03:05.867 --rc geninfo_all_blocks=1 00:03:05.867 --rc geninfo_unexecuted_blocks=1 00:03:05.867 00:03:05.867 ' 00:03:05.867 15:08:07 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/memory/memory_ut 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:05.867 15:08:07 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:05.867 15:08:07 env -- common/autotest_common.sh@10 -- # set +x 00:03:05.867 ************************************ 00:03:05.867 START TEST env_memory 00:03:05.867 ************************************ 00:03:05.867 15:08:07 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/memory/memory_ut 00:03:05.867 00:03:05.867 00:03:05.867 CUnit - A unit testing framework for C - Version 2.1-3 00:03:05.867 http://cunit.sourceforge.net/ 00:03:05.867 00:03:05.867 00:03:05.867 Suite: memory 00:03:06.126 Test: alloc and free memory map ...[2024-09-27 15:08:07.724386] 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:03:06.126 passed 00:03:06.126 Test: mem map translation ...[2024-09-27 15:08:07.742901] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 595:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:03:06.126 [2024-09-27 15:08:07.742918] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 595:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:03:06.126 [2024-09-27 15:08:07.742954] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:03:06.126 [2024-09-27 15:08:07.742962] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:03:06.126 passed 00:03:06.126 Test: mem map registration ...[2024-09-27 15:08:07.781079] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:03:06.126 [2024-09-27 15:08:07.781095] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:03:06.126 passed 00:03:06.126 Test: mem map adjacent registrations ...passed 00:03:06.126 00:03:06.126 Run Summary: Type Total Ran Passed Failed Inactive 00:03:06.126 suites 1 1 n/a 0 0 00:03:06.126 tests 4 4 4 0 0 00:03:06.126 asserts 152 152 152 0 n/a 00:03:06.126 00:03:06.126 Elapsed time = 0.137 seconds 00:03:06.126 00:03:06.126 real 0m0.152s 00:03:06.126 user 0m0.141s 00:03:06.126 sys 0m0.010s 00:03:06.126 15:08:07 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:06.126 15:08:07 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:03:06.126 ************************************ 00:03:06.126 END TEST env_memory 00:03:06.126 ************************************ 00:03:06.126 15:08:07 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/vtophys/vtophys 00:03:06.126 15:08:07 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:06.126 15:08:07 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:06.126 15:08:07 env -- common/autotest_common.sh@10 -- # set +x 00:03:06.126 ************************************ 00:03:06.126 START TEST env_vtophys 00:03:06.126 ************************************ 00:03:06.126 15:08:07 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/vtophys/vtophys 00:03:06.126 EAL: lib.eal log level changed from notice to debug 00:03:06.126 EAL: Detected lcore 0 as core 0 on socket 0 00:03:06.126 EAL: Detected lcore 1 as core 1 on socket 0 00:03:06.126 EAL: Detected lcore 2 as core 2 on socket 0 00:03:06.126 EAL: Detected lcore 3 as core 3 on socket 0 00:03:06.126 EAL: Detected lcore 4 as core 4 on socket 0 00:03:06.126 EAL: Detected lcore 5 as core 8 on socket 0 00:03:06.126 EAL: Detected lcore 6 as core 9 on socket 0 00:03:06.126 EAL: Detected lcore 7 as core 10 on socket 0 00:03:06.126 EAL: Detected lcore 8 as core 11 on socket 0 00:03:06.127 EAL: Detected lcore 9 as core 16 on socket 0 00:03:06.127 EAL: Detected lcore 10 as core 17 on socket 0 00:03:06.127 
EAL: Detected lcore 11 as core 18 on socket 0 00:03:06.127 EAL: Detected lcore 12 as core 19 on socket 0 00:03:06.127 EAL: Detected lcore 13 as core 20 on socket 0 00:03:06.127 EAL: Detected lcore 14 as core 24 on socket 0 00:03:06.127 EAL: Detected lcore 15 as core 25 on socket 0 00:03:06.127 EAL: Detected lcore 16 as core 26 on socket 0 00:03:06.127 EAL: Detected lcore 17 as core 27 on socket 0 00:03:06.127 EAL: Detected lcore 18 as core 0 on socket 1 00:03:06.127 EAL: Detected lcore 19 as core 1 on socket 1 00:03:06.127 EAL: Detected lcore 20 as core 2 on socket 1 00:03:06.127 EAL: Detected lcore 21 as core 3 on socket 1 00:03:06.127 EAL: Detected lcore 22 as core 4 on socket 1 00:03:06.127 EAL: Detected lcore 23 as core 8 on socket 1 00:03:06.127 EAL: Detected lcore 24 as core 9 on socket 1 00:03:06.127 EAL: Detected lcore 25 as core 10 on socket 1 00:03:06.127 EAL: Detected lcore 26 as core 11 on socket 1 00:03:06.127 EAL: Detected lcore 27 as core 16 on socket 1 00:03:06.127 EAL: Detected lcore 28 as core 17 on socket 1 00:03:06.127 EAL: Detected lcore 29 as core 18 on socket 1 00:03:06.127 EAL: Detected lcore 30 as core 19 on socket 1 00:03:06.127 EAL: Detected lcore 31 as core 20 on socket 1 00:03:06.127 EAL: Detected lcore 32 as core 24 on socket 1 00:03:06.127 EAL: Detected lcore 33 as core 25 on socket 1 00:03:06.127 EAL: Detected lcore 34 as core 26 on socket 1 00:03:06.127 EAL: Detected lcore 35 as core 27 on socket 1 00:03:06.127 EAL: Detected lcore 36 as core 0 on socket 0 00:03:06.127 EAL: Detected lcore 37 as core 1 on socket 0 00:03:06.127 EAL: Detected lcore 38 as core 2 on socket 0 00:03:06.127 EAL: Detected lcore 39 as core 3 on socket 0 00:03:06.127 EAL: Detected lcore 40 as core 4 on socket 0 00:03:06.127 EAL: Detected lcore 41 as core 8 on socket 0 00:03:06.127 EAL: Detected lcore 42 as core 9 on socket 0 00:03:06.127 EAL: Detected lcore 43 as core 10 on socket 0 00:03:06.127 EAL: Detected lcore 44 as core 11 on socket 0 00:03:06.127 EAL: Detected lcore 45 as core 16 on socket 0 00:03:06.127 EAL: Detected lcore 46 as core 17 on socket 0 00:03:06.127 EAL: Detected lcore 47 as core 18 on socket 0 00:03:06.127 EAL: Detected lcore 48 as core 19 on socket 0 00:03:06.127 EAL: Detected lcore 49 as core 20 on socket 0 00:03:06.127 EAL: Detected lcore 50 as core 24 on socket 0 00:03:06.127 EAL: Detected lcore 51 as core 25 on socket 0 00:03:06.127 EAL: Detected lcore 52 as core 26 on socket 0 00:03:06.127 EAL: Detected lcore 53 as core 27 on socket 0 00:03:06.127 EAL: Detected lcore 54 as core 0 on socket 1 00:03:06.127 EAL: Detected lcore 55 as core 1 on socket 1 00:03:06.127 EAL: Detected lcore 56 as core 2 on socket 1 00:03:06.127 EAL: Detected lcore 57 as core 3 on socket 1 00:03:06.127 EAL: Detected lcore 58 as core 4 on socket 1 00:03:06.127 EAL: Detected lcore 59 as core 8 on socket 1 00:03:06.127 EAL: Detected lcore 60 as core 9 on socket 1 00:03:06.127 EAL: Detected lcore 61 as core 10 on socket 1 00:03:06.127 EAL: Detected lcore 62 as core 11 on socket 1 00:03:06.127 EAL: Detected lcore 63 as core 16 on socket 1 00:03:06.127 EAL: Detected lcore 64 as core 17 on socket 1 00:03:06.127 EAL: Detected lcore 65 as core 18 on socket 1 00:03:06.127 EAL: Detected lcore 66 as core 19 on socket 1 00:03:06.127 EAL: Detected lcore 67 as core 20 on socket 1 00:03:06.127 EAL: Detected lcore 68 as core 24 on socket 1 00:03:06.127 EAL: Detected lcore 69 as core 25 on socket 1 00:03:06.127 EAL: Detected lcore 70 as core 26 on socket 1 00:03:06.127 EAL: Detected lcore 71 as core 27 
on socket 1 00:03:06.127 EAL: Maximum logical cores by configuration: 128 00:03:06.127 EAL: Detected CPU lcores: 72 00:03:06.127 EAL: Detected NUMA nodes: 2 00:03:06.127 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:03:06.127 EAL: Detected shared linkage of DPDK 00:03:06.127 EAL: No shared files mode enabled, IPC will be disabled 00:03:06.127 EAL: Bus pci wants IOVA as 'DC' 00:03:06.127 EAL: Buses did not request a specific IOVA mode. 00:03:06.127 EAL: IOMMU is available, selecting IOVA as VA mode. 00:03:06.127 EAL: Selected IOVA mode 'VA' 00:03:06.127 EAL: Probing VFIO support... 00:03:06.127 EAL: IOMMU type 1 (Type 1) is supported 00:03:06.127 EAL: IOMMU type 7 (sPAPR) is not supported 00:03:06.127 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:03:06.127 EAL: VFIO support initialized 00:03:06.127 EAL: Ask a virtual area of 0x2e000 bytes 00:03:06.127 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:03:06.127 EAL: Setting up physically contiguous memory... 00:03:06.127 EAL: Setting maximum number of open files to 524288 00:03:06.127 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:03:06.127 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:03:06.127 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:03:06.127 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x201400a00000 (size = 
0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:03:06.127 EAL: Ask a virtual area of 0x61000 bytes 00:03:06.127 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:03:06.127 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:03:06.127 EAL: Ask a virtual area of 0x400000000 bytes 00:03:06.127 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:03:06.127 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:03:06.127 EAL: Hugepages will be freed exactly as allocated. 00:03:06.127 EAL: No shared files mode enabled, IPC is disabled 00:03:06.127 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: TSC frequency is ~2300000 KHz 00:03:06.387 EAL: Main lcore 0 is ready (tid=7f3166183a00;cpuset=[0]) 00:03:06.387 EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 0 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 2MB 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: No PCI address specified using 'addr=' in: bus=pci 00:03:06.387 EAL: Mem event callback 'spdk:(nil)' registered 00:03:06.387 00:03:06.387 00:03:06.387 CUnit - A unit testing framework for C - Version 2.1-3 00:03:06.387 http://cunit.sourceforge.net/ 00:03:06.387 00:03:06.387 00:03:06.387 Suite: components_suite 00:03:06.387 Test: vtophys_malloc_test ...passed 00:03:06.387 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 4MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 4MB 00:03:06.387 EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 6MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 6MB 00:03:06.387 EAL: Trying to obtain current memory policy. 
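For reference, the per-node hugepage pools that back these EAL reservations can be inspected directly through sysfs. This is a generic Linux sketch rather than a command taken from this log; it mirrors the "node<N> <hugesize> free / total" rows printed by setup.sh status earlier:

  for node in /sys/devices/system/node/node[0-9]*; do
    for sz in 2048kB 1048576kB; do
      dir="$node/hugepages/hugepages-$sz"
      [ -d "$dir" ] || continue
      # free / total, matching the layout of the setup.sh status output above
      echo "$(basename "$node") $sz $(cat "$dir/free_hugepages") / $(cat "$dir/nr_hugepages")"
    done
  done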
00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 10MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 10MB 00:03:06.387 EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 18MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 18MB 00:03:06.387 EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 34MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 34MB 00:03:06.387 EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 66MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 66MB 00:03:06.387 EAL: Trying to obtain current memory policy. 00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 130MB 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was shrunk by 130MB 00:03:06.387 EAL: Trying to obtain current memory policy. 
00:03:06.387 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.387 EAL: Restoring previous memory policy: 4 00:03:06.387 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.387 EAL: request: mp_malloc_sync 00:03:06.387 EAL: No shared files mode enabled, IPC is disabled 00:03:06.387 EAL: Heap on socket 0 was expanded by 258MB 00:03:06.648 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.648 EAL: request: mp_malloc_sync 00:03:06.648 EAL: No shared files mode enabled, IPC is disabled 00:03:06.648 EAL: Heap on socket 0 was shrunk by 258MB 00:03:06.648 EAL: Trying to obtain current memory policy. 00:03:06.648 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:06.648 EAL: Restoring previous memory policy: 4 00:03:06.648 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.648 EAL: request: mp_malloc_sync 00:03:06.648 EAL: No shared files mode enabled, IPC is disabled 00:03:06.648 EAL: Heap on socket 0 was expanded by 514MB 00:03:06.908 EAL: Calling mem event callback 'spdk:(nil)' 00:03:06.908 EAL: request: mp_malloc_sync 00:03:06.908 EAL: No shared files mode enabled, IPC is disabled 00:03:06.908 EAL: Heap on socket 0 was shrunk by 514MB 00:03:06.908 EAL: Trying to obtain current memory policy. 00:03:06.908 EAL: Setting policy MPOL_PREFERRED for socket 0 00:03:07.167 EAL: Restoring previous memory policy: 4 00:03:07.167 EAL: Calling mem event callback 'spdk:(nil)' 00:03:07.167 EAL: request: mp_malloc_sync 00:03:07.167 EAL: No shared files mode enabled, IPC is disabled 00:03:07.167 EAL: Heap on socket 0 was expanded by 1026MB 00:03:07.167 EAL: Calling mem event callback 'spdk:(nil)' 00:03:07.427 EAL: request: mp_malloc_sync 00:03:07.427 EAL: No shared files mode enabled, IPC is disabled 00:03:07.427 EAL: Heap on socket 0 was shrunk by 1026MB 00:03:07.427 passed 00:03:07.427 00:03:07.427 Run Summary: Type Total Ran Passed Failed Inactive 00:03:07.427 suites 1 1 n/a 0 0 00:03:07.427 tests 2 2 2 0 0 00:03:07.427 asserts 497 497 497 0 n/a 00:03:07.427 00:03:07.427 Elapsed time = 1.142 seconds 00:03:07.427 EAL: Calling mem event callback 'spdk:(nil)' 00:03:07.427 EAL: request: mp_malloc_sync 00:03:07.427 EAL: No shared files mode enabled, IPC is disabled 00:03:07.427 EAL: Heap on socket 0 was shrunk by 2MB 00:03:07.427 EAL: No shared files mode enabled, IPC is disabled 00:03:07.427 EAL: No shared files mode enabled, IPC is disabled 00:03:07.427 EAL: No shared files mode enabled, IPC is disabled 00:03:07.427 00:03:07.427 real 0m1.290s 00:03:07.427 user 0m0.741s 00:03:07.427 sys 0m0.513s 00:03:07.427 15:08:09 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:07.427 15:08:09 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:03:07.427 ************************************ 00:03:07.427 END TEST env_vtophys 00:03:07.427 ************************************ 00:03:07.427 15:08:09 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/pci/pci_ut 00:03:07.427 15:08:09 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:07.427 15:08:09 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:07.427 15:08:09 env -- common/autotest_common.sh@10 -- # set +x 00:03:07.687 ************************************ 00:03:07.687 START TEST env_pci 00:03:07.687 ************************************ 00:03:07.687 15:08:09 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/pci/pci_ut 00:03:07.687 00:03:07.687 00:03:07.687 CUnit - A unit testing framework for 
C - Version 2.1-3 00:03:07.687 http://cunit.sourceforge.net/ 00:03:07.687 00:03:07.687 00:03:07.687 Suite: pci 00:03:07.687 Test: pci_hook ...[2024-09-27 15:08:09.312830] /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk/pci.c:1049:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1639935 has claimed it 00:03:07.687 EAL: Cannot find device (10000:00:01.0) 00:03:07.687 EAL: Failed to attach device on primary process 00:03:07.687 passed 00:03:07.687 00:03:07.687 Run Summary: Type Total Ran Passed Failed Inactive 00:03:07.687 suites 1 1 n/a 0 0 00:03:07.687 tests 1 1 1 0 0 00:03:07.687 asserts 25 25 25 0 n/a 00:03:07.687 00:03:07.687 Elapsed time = 0.032 seconds 00:03:07.687 00:03:07.687 real 0m0.055s 00:03:07.687 user 0m0.015s 00:03:07.687 sys 0m0.039s 00:03:07.687 15:08:09 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:07.687 15:08:09 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:03:07.687 ************************************ 00:03:07.687 END TEST env_pci 00:03:07.687 ************************************ 00:03:07.687 15:08:09 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:03:07.687 15:08:09 env -- env/env.sh@15 -- # uname 00:03:07.687 15:08:09 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:03:07.687 15:08:09 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:03:07.687 15:08:09 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:03:07.687 15:08:09 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:03:07.687 15:08:09 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:07.687 15:08:09 env -- common/autotest_common.sh@10 -- # set +x 00:03:07.687 ************************************ 00:03:07.687 START TEST env_dpdk_post_init 00:03:07.687 ************************************ 00:03:07.687 15:08:09 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:03:07.687 EAL: Detected CPU lcores: 72 00:03:07.687 EAL: Detected NUMA nodes: 2 00:03:07.687 EAL: Detected shared linkage of DPDK 00:03:07.687 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:03:07.687 EAL: Selected IOVA mode 'VA' 00:03:07.687 EAL: VFIO support initialized 00:03:07.687 TELEMETRY: No legacy callbacks, legacy socket not created 00:03:07.947 EAL: Using IOMMU type 1 (Type 1) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 
0000:00:04.6 (socket 0) 00:03:07.947 EAL: Ignore mapping IO port bar(1) 00:03:07.947 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:03:08.206 EAL: Probe PCI driver: spdk_nvme (144d:a80a) device: 0000:5e:00.0 (socket 0) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:03:08.206 EAL: Ignore mapping IO port bar(1) 00:03:08.206 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:03:08.206 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:03:08.206 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001020000 00:03:08.466 Starting DPDK initialization... 00:03:08.466 Starting SPDK post initialization... 00:03:08.466 SPDK NVMe probe 00:03:08.466 Attaching to 0000:5e:00.0 00:03:08.466 Attached to 0000:5e:00.0 00:03:08.466 Cleaning up... 00:03:08.466 00:03:08.466 real 0m0.701s 00:03:08.466 user 0m0.101s 00:03:08.466 sys 0m0.190s 00:03:08.466 15:08:10 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:08.466 15:08:10 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:03:08.466 ************************************ 00:03:08.466 END TEST env_dpdk_post_init 00:03:08.466 ************************************ 00:03:08.466 15:08:10 env -- env/env.sh@26 -- # uname 00:03:08.466 15:08:10 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:03:08.466 15:08:10 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:03:08.466 15:08:10 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:08.466 15:08:10 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:08.466 15:08:10 env -- common/autotest_common.sh@10 -- # set +x 00:03:08.466 ************************************ 00:03:08.466 START TEST env_mem_callbacks 00:03:08.466 ************************************ 00:03:08.466 15:08:10 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:03:08.466 EAL: Detected CPU lcores: 72 00:03:08.466 EAL: Detected NUMA nodes: 2 00:03:08.466 EAL: Detected shared linkage of DPDK 00:03:08.466 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:03:08.466 EAL: Selected IOVA mode 'VA' 00:03:08.466 EAL: VFIO support initialized 00:03:08.466 TELEMETRY: No legacy callbacks, legacy socket not created 00:03:08.466 00:03:08.466 00:03:08.466 CUnit - A unit testing framework for C - Version 2.1-3 00:03:08.466 http://cunit.sourceforge.net/ 00:03:08.466 00:03:08.466 00:03:08.466 Suite: memory 00:03:08.466 Test: test ... 
00:03:08.466 register 0x200000200000 2097152 00:03:08.466 malloc 3145728 00:03:08.466 register 0x200000400000 4194304 00:03:08.466 buf 0x200000500000 len 3145728 PASSED 00:03:08.466 malloc 64 00:03:08.466 buf 0x2000004fff40 len 64 PASSED 00:03:08.466 malloc 4194304 00:03:08.466 register 0x200000800000 6291456 00:03:08.466 buf 0x200000a00000 len 4194304 PASSED 00:03:08.466 free 0x200000500000 3145728 00:03:08.466 free 0x2000004fff40 64 00:03:08.466 unregister 0x200000400000 4194304 PASSED 00:03:08.466 free 0x200000a00000 4194304 00:03:08.466 unregister 0x200000800000 6291456 PASSED 00:03:08.466 malloc 8388608 00:03:08.466 register 0x200000400000 10485760 00:03:08.466 buf 0x200000600000 len 8388608 PASSED 00:03:08.466 free 0x200000600000 8388608 00:03:08.466 unregister 0x200000400000 10485760 PASSED 00:03:08.725 passed 00:03:08.725 00:03:08.725 Run Summary: Type Total Ran Passed Failed Inactive 00:03:08.725 suites 1 1 n/a 0 0 00:03:08.725 tests 1 1 1 0 0 00:03:08.725 asserts 15 15 15 0 n/a 00:03:08.725 00:03:08.725 Elapsed time = 0.010 seconds 00:03:08.725 00:03:08.725 real 0m0.075s 00:03:08.725 user 0m0.022s 00:03:08.725 sys 0m0.053s 00:03:08.725 15:08:10 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:08.725 15:08:10 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:03:08.725 ************************************ 00:03:08.725 END TEST env_mem_callbacks 00:03:08.725 ************************************ 00:03:08.725 00:03:08.725 real 0m2.929s 00:03:08.725 user 0m1.306s 00:03:08.725 sys 0m1.226s 00:03:08.725 15:08:10 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:08.725 15:08:10 env -- common/autotest_common.sh@10 -- # set +x 00:03:08.725 ************************************ 00:03:08.725 END TEST env 00:03:08.725 ************************************ 00:03:08.725 15:08:10 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/rpc.sh 00:03:08.725 15:08:10 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:08.725 15:08:10 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:08.725 15:08:10 -- common/autotest_common.sh@10 -- # set +x 00:03:08.725 ************************************ 00:03:08.725 START TEST rpc 00:03:08.725 ************************************ 00:03:08.725 15:08:10 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/rpc.sh 00:03:08.725 * Looking for test storage... 
00:03:08.725 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc 00:03:08.725 15:08:10 rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:08.725 15:08:10 rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:03:08.725 15:08:10 rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:08.984 15:08:10 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:08.984 15:08:10 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:03:08.984 15:08:10 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:03:08.984 15:08:10 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:03:08.984 15:08:10 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:08.984 15:08:10 rpc -- scripts/common.sh@344 -- # case "$op" in 00:03:08.984 15:08:10 rpc -- scripts/common.sh@345 -- # : 1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:08.984 15:08:10 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:08.984 15:08:10 rpc -- scripts/common.sh@365 -- # decimal 1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@353 -- # local d=1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:08.984 15:08:10 rpc -- scripts/common.sh@355 -- # echo 1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:03:08.984 15:08:10 rpc -- scripts/common.sh@366 -- # decimal 2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@353 -- # local d=2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:08.984 15:08:10 rpc -- scripts/common.sh@355 -- # echo 2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:03:08.984 15:08:10 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:08.984 15:08:10 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:08.984 15:08:10 rpc -- scripts/common.sh@368 -- # return 0 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:08.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:08.984 --rc genhtml_branch_coverage=1 00:03:08.984 --rc genhtml_function_coverage=1 00:03:08.984 --rc genhtml_legend=1 00:03:08.984 --rc geninfo_all_blocks=1 00:03:08.984 --rc geninfo_unexecuted_blocks=1 00:03:08.984 00:03:08.984 ' 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:08.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:08.984 --rc genhtml_branch_coverage=1 00:03:08.984 --rc genhtml_function_coverage=1 00:03:08.984 --rc genhtml_legend=1 00:03:08.984 --rc geninfo_all_blocks=1 00:03:08.984 --rc geninfo_unexecuted_blocks=1 00:03:08.984 00:03:08.984 ' 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:08.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:08.984 --rc genhtml_branch_coverage=1 00:03:08.984 --rc genhtml_function_coverage=1 00:03:08.984 
--rc genhtml_legend=1 00:03:08.984 --rc geninfo_all_blocks=1 00:03:08.984 --rc geninfo_unexecuted_blocks=1 00:03:08.984 00:03:08.984 ' 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:08.984 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:08.984 --rc genhtml_branch_coverage=1 00:03:08.984 --rc genhtml_function_coverage=1 00:03:08.984 --rc genhtml_legend=1 00:03:08.984 --rc geninfo_all_blocks=1 00:03:08.984 --rc geninfo_unexecuted_blocks=1 00:03:08.984 00:03:08.984 ' 00:03:08.984 15:08:10 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1640248 00:03:08.984 15:08:10 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:03:08.984 15:08:10 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:03:08.984 15:08:10 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1640248 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@831 -- # '[' -z 1640248 ']' 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:08.984 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:08.984 15:08:10 rpc -- common/autotest_common.sh@10 -- # set +x 00:03:08.984 [2024-09-27 15:08:10.710969] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:03:08.984 [2024-09-27 15:08:10.711024] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640248 ] 00:03:08.984 [2024-09-27 15:08:10.775419] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:09.264 [2024-09-27 15:08:10.861850] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:03:09.264 [2024-09-27 15:08:10.861899] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1640248' to capture a snapshot of events at runtime. 00:03:09.264 [2024-09-27 15:08:10.861911] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:03:09.264 [2024-09-27 15:08:10.861920] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:03:09.264 [2024-09-27 15:08:10.861929] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1640248 for offline analysis/debug. 
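The notices just above show how rpc.sh brings up its target for this suite: spdk_tgt is launched with the bdev tracepoint group enabled (rpc.sh@64, '-e bdev'), the shell waits for the UNIX-domain RPC socket, and the target prints how to snapshot its tracepoints at runtime. A minimal sketch of that startup and capture, using only the commands and flags that appear in this log (waitforlisten and rpc_cmd are the suite's own helpers from autotest_common.sh; the pid is simply whatever the launch returns):

  ./build/bin/spdk_tgt -e bdev &    # enable the bdev tracepoint group, as rpc.sh@64 does
  waitforlisten $!                  # suite helper: block until /var/tmp/spdk.sock answers RPCs
  spdk_trace -s spdk_tgt -p $!      # snapshot events from /dev/shm/spdk_tgt_trace.pid<pid>, per the notice above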
00:03:09.264 [2024-09-27 15:08:10.861956] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:09.834 15:08:11 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:09.834 15:08:11 rpc -- common/autotest_common.sh@864 -- # return 0 00:03:09.834 15:08:11 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc 00:03:09.834 15:08:11 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc 00:03:09.834 15:08:11 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:03:09.834 15:08:11 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:03:09.834 15:08:11 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:09.834 15:08:11 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:09.834 15:08:11 rpc -- common/autotest_common.sh@10 -- # set +x 00:03:09.834 ************************************ 00:03:09.834 START TEST rpc_integrity 00:03:09.834 ************************************ 00:03:09.834 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:03:09.834 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:03:09.834 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:09.834 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:09.834 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:09.834 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:03:09.834 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:03:09.834 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:03:09.834 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:03:09.834 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:09.834 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.094 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.094 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:03:10.094 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:03:10.094 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.094 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.094 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.094 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:03:10.094 { 00:03:10.094 "name": "Malloc0", 00:03:10.094 "aliases": [ 00:03:10.094 "ebefc6e0-6454-4b7e-b9e5-275a33dd3c56" 00:03:10.094 ], 00:03:10.094 "product_name": "Malloc disk", 00:03:10.094 "block_size": 512, 00:03:10.094 "num_blocks": 16384, 00:03:10.094 "uuid": "ebefc6e0-6454-4b7e-b9e5-275a33dd3c56", 00:03:10.094 "assigned_rate_limits": { 00:03:10.094 "rw_ios_per_sec": 0, 00:03:10.094 "rw_mbytes_per_sec": 0, 00:03:10.094 "r_mbytes_per_sec": 0, 00:03:10.094 "w_mbytes_per_sec": 0 00:03:10.094 }, 00:03:10.094 "claimed": false, 
00:03:10.094 "zoned": false, 00:03:10.094 "supported_io_types": { 00:03:10.094 "read": true, 00:03:10.094 "write": true, 00:03:10.094 "unmap": true, 00:03:10.094 "flush": true, 00:03:10.094 "reset": true, 00:03:10.094 "nvme_admin": false, 00:03:10.094 "nvme_io": false, 00:03:10.094 "nvme_io_md": false, 00:03:10.094 "write_zeroes": true, 00:03:10.094 "zcopy": true, 00:03:10.094 "get_zone_info": false, 00:03:10.094 "zone_management": false, 00:03:10.094 "zone_append": false, 00:03:10.094 "compare": false, 00:03:10.094 "compare_and_write": false, 00:03:10.094 "abort": true, 00:03:10.094 "seek_hole": false, 00:03:10.094 "seek_data": false, 00:03:10.094 "copy": true, 00:03:10.094 "nvme_iov_md": false 00:03:10.094 }, 00:03:10.094 "memory_domains": [ 00:03:10.094 { 00:03:10.094 "dma_device_id": "system", 00:03:10.094 "dma_device_type": 1 00:03:10.094 }, 00:03:10.094 { 00:03:10.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:10.094 "dma_device_type": 2 00:03:10.094 } 00:03:10.094 ], 00:03:10.094 "driver_specific": {} 00:03:10.094 } 00:03:10.094 ]' 00:03:10.094 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:03:10.094 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:03:10.094 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.095 [2024-09-27 15:08:11.762813] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:03:10.095 [2024-09-27 15:08:11.762850] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:03:10.095 [2024-09-27 15:08:11.762864] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1322c90 00:03:10.095 [2024-09-27 15:08:11.762873] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:03:10.095 [2024-09-27 15:08:11.764037] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:03:10.095 [2024-09-27 15:08:11.764064] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:03:10.095 Passthru0 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:03:10.095 { 00:03:10.095 "name": "Malloc0", 00:03:10.095 "aliases": [ 00:03:10.095 "ebefc6e0-6454-4b7e-b9e5-275a33dd3c56" 00:03:10.095 ], 00:03:10.095 "product_name": "Malloc disk", 00:03:10.095 "block_size": 512, 00:03:10.095 "num_blocks": 16384, 00:03:10.095 "uuid": "ebefc6e0-6454-4b7e-b9e5-275a33dd3c56", 00:03:10.095 "assigned_rate_limits": { 00:03:10.095 "rw_ios_per_sec": 0, 00:03:10.095 "rw_mbytes_per_sec": 0, 00:03:10.095 "r_mbytes_per_sec": 0, 00:03:10.095 "w_mbytes_per_sec": 0 00:03:10.095 }, 00:03:10.095 "claimed": true, 00:03:10.095 "claim_type": "exclusive_write", 00:03:10.095 "zoned": false, 00:03:10.095 "supported_io_types": { 00:03:10.095 "read": true, 00:03:10.095 "write": true, 00:03:10.095 "unmap": true, 00:03:10.095 "flush": true, 00:03:10.095 "reset": true, 
00:03:10.095 "nvme_admin": false, 00:03:10.095 "nvme_io": false, 00:03:10.095 "nvme_io_md": false, 00:03:10.095 "write_zeroes": true, 00:03:10.095 "zcopy": true, 00:03:10.095 "get_zone_info": false, 00:03:10.095 "zone_management": false, 00:03:10.095 "zone_append": false, 00:03:10.095 "compare": false, 00:03:10.095 "compare_and_write": false, 00:03:10.095 "abort": true, 00:03:10.095 "seek_hole": false, 00:03:10.095 "seek_data": false, 00:03:10.095 "copy": true, 00:03:10.095 "nvme_iov_md": false 00:03:10.095 }, 00:03:10.095 "memory_domains": [ 00:03:10.095 { 00:03:10.095 "dma_device_id": "system", 00:03:10.095 "dma_device_type": 1 00:03:10.095 }, 00:03:10.095 { 00:03:10.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:10.095 "dma_device_type": 2 00:03:10.095 } 00:03:10.095 ], 00:03:10.095 "driver_specific": {} 00:03:10.095 }, 00:03:10.095 { 00:03:10.095 "name": "Passthru0", 00:03:10.095 "aliases": [ 00:03:10.095 "9293d3a8-7c26-5cd1-b5fd-eff0b81dee57" 00:03:10.095 ], 00:03:10.095 "product_name": "passthru", 00:03:10.095 "block_size": 512, 00:03:10.095 "num_blocks": 16384, 00:03:10.095 "uuid": "9293d3a8-7c26-5cd1-b5fd-eff0b81dee57", 00:03:10.095 "assigned_rate_limits": { 00:03:10.095 "rw_ios_per_sec": 0, 00:03:10.095 "rw_mbytes_per_sec": 0, 00:03:10.095 "r_mbytes_per_sec": 0, 00:03:10.095 "w_mbytes_per_sec": 0 00:03:10.095 }, 00:03:10.095 "claimed": false, 00:03:10.095 "zoned": false, 00:03:10.095 "supported_io_types": { 00:03:10.095 "read": true, 00:03:10.095 "write": true, 00:03:10.095 "unmap": true, 00:03:10.095 "flush": true, 00:03:10.095 "reset": true, 00:03:10.095 "nvme_admin": false, 00:03:10.095 "nvme_io": false, 00:03:10.095 "nvme_io_md": false, 00:03:10.095 "write_zeroes": true, 00:03:10.095 "zcopy": true, 00:03:10.095 "get_zone_info": false, 00:03:10.095 "zone_management": false, 00:03:10.095 "zone_append": false, 00:03:10.095 "compare": false, 00:03:10.095 "compare_and_write": false, 00:03:10.095 "abort": true, 00:03:10.095 "seek_hole": false, 00:03:10.095 "seek_data": false, 00:03:10.095 "copy": true, 00:03:10.095 "nvme_iov_md": false 00:03:10.095 }, 00:03:10.095 "memory_domains": [ 00:03:10.095 { 00:03:10.095 "dma_device_id": "system", 00:03:10.095 "dma_device_type": 1 00:03:10.095 }, 00:03:10.095 { 00:03:10.095 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:10.095 "dma_device_type": 2 00:03:10.095 } 00:03:10.095 ], 00:03:10.095 "driver_specific": { 00:03:10.095 "passthru": { 00:03:10.095 "name": "Passthru0", 00:03:10.095 "base_bdev_name": "Malloc0" 00:03:10.095 } 00:03:10.095 } 00:03:10.095 } 00:03:10.095 ]' 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:03:10.095 
15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:03:10.095 15:08:11 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:03:10.095 00:03:10.095 real 0m0.298s 00:03:10.095 user 0m0.178s 00:03:10.095 sys 0m0.054s 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:10.095 15:08:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.095 ************************************ 00:03:10.095 END TEST rpc_integrity 00:03:10.095 ************************************ 00:03:10.354 15:08:11 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:03:10.354 15:08:11 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:10.355 15:08:11 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:10.355 15:08:11 rpc -- common/autotest_common.sh@10 -- # set +x 00:03:10.355 ************************************ 00:03:10.355 START TEST rpc_plugins 00:03:10.355 ************************************ 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:03:10.355 { 00:03:10.355 "name": "Malloc1", 00:03:10.355 "aliases": [ 00:03:10.355 "670c0fa9-a831-46d8-9b63-7fd705e34211" 00:03:10.355 ], 00:03:10.355 "product_name": "Malloc disk", 00:03:10.355 "block_size": 4096, 00:03:10.355 "num_blocks": 256, 00:03:10.355 "uuid": "670c0fa9-a831-46d8-9b63-7fd705e34211", 00:03:10.355 "assigned_rate_limits": { 00:03:10.355 "rw_ios_per_sec": 0, 00:03:10.355 "rw_mbytes_per_sec": 0, 00:03:10.355 "r_mbytes_per_sec": 0, 00:03:10.355 "w_mbytes_per_sec": 0 00:03:10.355 }, 00:03:10.355 "claimed": false, 00:03:10.355 "zoned": false, 00:03:10.355 "supported_io_types": { 00:03:10.355 "read": true, 00:03:10.355 "write": true, 00:03:10.355 "unmap": true, 00:03:10.355 "flush": true, 00:03:10.355 "reset": true, 00:03:10.355 "nvme_admin": false, 00:03:10.355 "nvme_io": false, 00:03:10.355 "nvme_io_md": false, 00:03:10.355 "write_zeroes": true, 00:03:10.355 "zcopy": true, 00:03:10.355 "get_zone_info": false, 00:03:10.355 "zone_management": false, 00:03:10.355 "zone_append": false, 00:03:10.355 "compare": false, 00:03:10.355 "compare_and_write": false, 00:03:10.355 "abort": true, 00:03:10.355 "seek_hole": false, 00:03:10.355 "seek_data": false, 00:03:10.355 "copy": true, 00:03:10.355 "nvme_iov_md": false 00:03:10.355 }, 00:03:10.355 
"memory_domains": [ 00:03:10.355 { 00:03:10.355 "dma_device_id": "system", 00:03:10.355 "dma_device_type": 1 00:03:10.355 }, 00:03:10.355 { 00:03:10.355 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:10.355 "dma_device_type": 2 00:03:10.355 } 00:03:10.355 ], 00:03:10.355 "driver_specific": {} 00:03:10.355 } 00:03:10.355 ]' 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:03:10.355 15:08:12 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:03:10.355 00:03:10.355 real 0m0.145s 00:03:10.355 user 0m0.091s 00:03:10.355 sys 0m0.021s 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:10.355 15:08:12 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:03:10.355 ************************************ 00:03:10.355 END TEST rpc_plugins 00:03:10.355 ************************************ 00:03:10.355 15:08:12 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:03:10.355 15:08:12 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:10.355 15:08:12 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:10.355 15:08:12 rpc -- common/autotest_common.sh@10 -- # set +x 00:03:10.613 ************************************ 00:03:10.613 START TEST rpc_trace_cmd_test 00:03:10.613 ************************************ 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.613 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:03:10.613 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1640248", 00:03:10.613 "tpoint_group_mask": "0x8", 00:03:10.614 "iscsi_conn": { 00:03:10.614 "mask": "0x2", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "scsi": { 00:03:10.614 "mask": "0x4", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "bdev": { 00:03:10.614 "mask": "0x8", 00:03:10.614 "tpoint_mask": "0xffffffffffffffff" 00:03:10.614 }, 00:03:10.614 "nvmf_rdma": { 00:03:10.614 "mask": "0x10", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "nvmf_tcp": { 00:03:10.614 "mask": "0x20", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 
00:03:10.614 "ftl": { 00:03:10.614 "mask": "0x40", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "blobfs": { 00:03:10.614 "mask": "0x80", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "dsa": { 00:03:10.614 "mask": "0x200", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "thread": { 00:03:10.614 "mask": "0x400", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "nvme_pcie": { 00:03:10.614 "mask": "0x800", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "iaa": { 00:03:10.614 "mask": "0x1000", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "nvme_tcp": { 00:03:10.614 "mask": "0x2000", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "bdev_nvme": { 00:03:10.614 "mask": "0x4000", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "sock": { 00:03:10.614 "mask": "0x8000", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "blob": { 00:03:10.614 "mask": "0x10000", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 }, 00:03:10.614 "bdev_raid": { 00:03:10.614 "mask": "0x20000", 00:03:10.614 "tpoint_mask": "0x0" 00:03:10.614 } 00:03:10.614 }' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 18 -gt 2 ']' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:03:10.614 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:03:10.872 15:08:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:03:10.872 00:03:10.872 real 0m0.240s 00:03:10.872 user 0m0.196s 00:03:10.872 sys 0m0.035s 00:03:10.872 15:08:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:10.872 15:08:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:03:10.872 ************************************ 00:03:10.872 END TEST rpc_trace_cmd_test 00:03:10.872 ************************************ 00:03:10.872 15:08:12 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:03:10.872 15:08:12 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:03:10.872 15:08:12 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:03:10.872 15:08:12 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:10.872 15:08:12 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:10.872 15:08:12 rpc -- common/autotest_common.sh@10 -- # set +x 00:03:10.872 ************************************ 00:03:10.872 START TEST rpc_daemon_integrity 00:03:10.872 ************************************ 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 
0 ]] 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:03:10.872 { 00:03:10.872 "name": "Malloc2", 00:03:10.872 "aliases": [ 00:03:10.872 "c807ecd2-fc08-4097-a6a6-bc13ad9fc9af" 00:03:10.872 ], 00:03:10.872 "product_name": "Malloc disk", 00:03:10.872 "block_size": 512, 00:03:10.872 "num_blocks": 16384, 00:03:10.872 "uuid": "c807ecd2-fc08-4097-a6a6-bc13ad9fc9af", 00:03:10.872 "assigned_rate_limits": { 00:03:10.872 "rw_ios_per_sec": 0, 00:03:10.872 "rw_mbytes_per_sec": 0, 00:03:10.872 "r_mbytes_per_sec": 0, 00:03:10.872 "w_mbytes_per_sec": 0 00:03:10.872 }, 00:03:10.872 "claimed": false, 00:03:10.872 "zoned": false, 00:03:10.872 "supported_io_types": { 00:03:10.872 "read": true, 00:03:10.872 "write": true, 00:03:10.872 "unmap": true, 00:03:10.872 "flush": true, 00:03:10.872 "reset": true, 00:03:10.872 "nvme_admin": false, 00:03:10.872 "nvme_io": false, 00:03:10.872 "nvme_io_md": false, 00:03:10.872 "write_zeroes": true, 00:03:10.872 "zcopy": true, 00:03:10.872 "get_zone_info": false, 00:03:10.872 "zone_management": false, 00:03:10.872 "zone_append": false, 00:03:10.872 "compare": false, 00:03:10.872 "compare_and_write": false, 00:03:10.872 "abort": true, 00:03:10.872 "seek_hole": false, 00:03:10.872 "seek_data": false, 00:03:10.872 "copy": true, 00:03:10.872 "nvme_iov_md": false 00:03:10.872 }, 00:03:10.872 "memory_domains": [ 00:03:10.872 { 00:03:10.872 "dma_device_id": "system", 00:03:10.872 "dma_device_type": 1 00:03:10.872 }, 00:03:10.872 { 00:03:10.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:10.872 "dma_device_type": 2 00:03:10.872 } 00:03:10.872 ], 00:03:10.872 "driver_specific": {} 00:03:10.872 } 00:03:10.872 ]' 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:10.872 [2024-09-27 15:08:12.697409] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:03:10.872 [2024-09-27 15:08:12.697446] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:03:10.872 [2024-09-27 15:08:12.697464] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13224f0 00:03:10.872 [2024-09-27 15:08:12.697474] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:03:10.872 [2024-09-27 15:08:12.698485] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:03:10.872 [2024-09-27 15:08:12.698510] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:03:10.872 Passthru0 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:10.872 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:03:10.873 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:10.873 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:03:11.132 { 00:03:11.132 "name": "Malloc2", 00:03:11.132 "aliases": [ 00:03:11.132 "c807ecd2-fc08-4097-a6a6-bc13ad9fc9af" 00:03:11.132 ], 00:03:11.132 "product_name": "Malloc disk", 00:03:11.132 "block_size": 512, 00:03:11.132 "num_blocks": 16384, 00:03:11.132 "uuid": "c807ecd2-fc08-4097-a6a6-bc13ad9fc9af", 00:03:11.132 "assigned_rate_limits": { 00:03:11.132 "rw_ios_per_sec": 0, 00:03:11.132 "rw_mbytes_per_sec": 0, 00:03:11.132 "r_mbytes_per_sec": 0, 00:03:11.132 "w_mbytes_per_sec": 0 00:03:11.132 }, 00:03:11.132 "claimed": true, 00:03:11.132 "claim_type": "exclusive_write", 00:03:11.132 "zoned": false, 00:03:11.132 "supported_io_types": { 00:03:11.132 "read": true, 00:03:11.132 "write": true, 00:03:11.132 "unmap": true, 00:03:11.132 "flush": true, 00:03:11.132 "reset": true, 00:03:11.132 "nvme_admin": false, 00:03:11.132 "nvme_io": false, 00:03:11.132 "nvme_io_md": false, 00:03:11.132 "write_zeroes": true, 00:03:11.132 "zcopy": true, 00:03:11.132 "get_zone_info": false, 00:03:11.132 "zone_management": false, 00:03:11.132 "zone_append": false, 00:03:11.132 "compare": false, 00:03:11.132 "compare_and_write": false, 00:03:11.132 "abort": true, 00:03:11.132 "seek_hole": false, 00:03:11.132 "seek_data": false, 00:03:11.132 "copy": true, 00:03:11.132 "nvme_iov_md": false 00:03:11.132 }, 00:03:11.132 "memory_domains": [ 00:03:11.132 { 00:03:11.132 "dma_device_id": "system", 00:03:11.132 "dma_device_type": 1 00:03:11.132 }, 00:03:11.132 { 00:03:11.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:11.132 "dma_device_type": 2 00:03:11.132 } 00:03:11.132 ], 00:03:11.132 "driver_specific": {} 00:03:11.132 }, 00:03:11.132 { 00:03:11.132 "name": "Passthru0", 00:03:11.132 "aliases": [ 00:03:11.132 "3198ce45-e3bb-5bd4-bc74-78b106baefe7" 00:03:11.132 ], 00:03:11.132 "product_name": "passthru", 00:03:11.132 "block_size": 512, 00:03:11.132 "num_blocks": 16384, 00:03:11.132 "uuid": "3198ce45-e3bb-5bd4-bc74-78b106baefe7", 00:03:11.132 "assigned_rate_limits": { 00:03:11.132 "rw_ios_per_sec": 0, 00:03:11.132 "rw_mbytes_per_sec": 0, 00:03:11.132 "r_mbytes_per_sec": 0, 00:03:11.132 "w_mbytes_per_sec": 0 00:03:11.132 }, 00:03:11.132 "claimed": false, 00:03:11.132 "zoned": false, 00:03:11.132 "supported_io_types": { 00:03:11.132 "read": true, 00:03:11.132 "write": true, 00:03:11.132 "unmap": true, 00:03:11.132 "flush": true, 00:03:11.132 "reset": true, 00:03:11.132 "nvme_admin": false, 00:03:11.132 "nvme_io": false, 00:03:11.132 "nvme_io_md": false, 00:03:11.132 "write_zeroes": true, 00:03:11.132 "zcopy": true, 
00:03:11.132 "get_zone_info": false, 00:03:11.132 "zone_management": false, 00:03:11.132 "zone_append": false, 00:03:11.132 "compare": false, 00:03:11.132 "compare_and_write": false, 00:03:11.132 "abort": true, 00:03:11.132 "seek_hole": false, 00:03:11.132 "seek_data": false, 00:03:11.132 "copy": true, 00:03:11.132 "nvme_iov_md": false 00:03:11.132 }, 00:03:11.132 "memory_domains": [ 00:03:11.132 { 00:03:11.132 "dma_device_id": "system", 00:03:11.132 "dma_device_type": 1 00:03:11.132 }, 00:03:11.132 { 00:03:11.132 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:03:11.132 "dma_device_type": 2 00:03:11.132 } 00:03:11.132 ], 00:03:11.132 "driver_specific": { 00:03:11.132 "passthru": { 00:03:11.132 "name": "Passthru0", 00:03:11.132 "base_bdev_name": "Malloc2" 00:03:11.132 } 00:03:11.132 } 00:03:11.132 } 00:03:11.132 ]' 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:03:11.132 00:03:11.132 real 0m0.298s 00:03:11.132 user 0m0.187s 00:03:11.132 sys 0m0.053s 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:11.132 15:08:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:03:11.132 ************************************ 00:03:11.132 END TEST rpc_daemon_integrity 00:03:11.132 ************************************ 00:03:11.132 15:08:12 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:03:11.132 15:08:12 rpc -- rpc/rpc.sh@84 -- # killprocess 1640248 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@950 -- # '[' -z 1640248 ']' 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@954 -- # kill -0 1640248 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@955 -- # uname 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1640248 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1640248' 00:03:11.132 killing process with pid 1640248 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@969 -- # kill 1640248 00:03:11.132 15:08:12 rpc -- common/autotest_common.sh@974 -- # wait 1640248 00:03:11.702 00:03:11.702 real 0m2.871s 00:03:11.702 user 0m3.592s 00:03:11.702 sys 0m0.921s 00:03:11.702 15:08:13 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:11.702 15:08:13 rpc -- common/autotest_common.sh@10 -- # set +x 00:03:11.702 ************************************ 00:03:11.702 END TEST rpc 00:03:11.702 ************************************ 00:03:11.702 15:08:13 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:03:11.702 15:08:13 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:11.702 15:08:13 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:11.702 15:08:13 -- common/autotest_common.sh@10 -- # set +x 00:03:11.702 ************************************ 00:03:11.702 START TEST skip_rpc 00:03:11.702 ************************************ 00:03:11.702 15:08:13 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:03:11.702 * Looking for test storage... 00:03:11.702 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc 00:03:11.702 15:08:13 skip_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:11.702 15:08:13 skip_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:03:11.702 15:08:13 skip_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@345 -- # : 1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:11.962 15:08:13 skip_rpc -- scripts/common.sh@368 -- # return 0 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:11.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:11.962 --rc genhtml_branch_coverage=1 00:03:11.962 --rc genhtml_function_coverage=1 00:03:11.962 --rc genhtml_legend=1 00:03:11.962 --rc geninfo_all_blocks=1 00:03:11.962 --rc geninfo_unexecuted_blocks=1 00:03:11.962 00:03:11.962 ' 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:11.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:11.962 --rc genhtml_branch_coverage=1 00:03:11.962 --rc genhtml_function_coverage=1 00:03:11.962 --rc genhtml_legend=1 00:03:11.962 --rc geninfo_all_blocks=1 00:03:11.962 --rc geninfo_unexecuted_blocks=1 00:03:11.962 00:03:11.962 ' 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:11.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:11.962 --rc genhtml_branch_coverage=1 00:03:11.962 --rc genhtml_function_coverage=1 00:03:11.962 --rc genhtml_legend=1 00:03:11.962 --rc geninfo_all_blocks=1 00:03:11.962 --rc geninfo_unexecuted_blocks=1 00:03:11.962 00:03:11.962 ' 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:11.962 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:11.962 --rc genhtml_branch_coverage=1 00:03:11.962 --rc genhtml_function_coverage=1 00:03:11.962 --rc genhtml_legend=1 00:03:11.962 --rc geninfo_all_blocks=1 00:03:11.962 --rc geninfo_unexecuted_blocks=1 00:03:11.962 00:03:11.962 ' 00:03:11.962 15:08:13 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/config.json 00:03:11.962 15:08:13 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/log.txt 00:03:11.962 15:08:13 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:11.962 15:08:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:11.962 ************************************ 00:03:11.962 START TEST skip_rpc 00:03:11.962 ************************************ 00:03:11.962 15:08:13 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:03:11.962 15:08:13 
skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1640804 00:03:11.962 15:08:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:03:11.962 15:08:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:03:11.962 15:08:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:03:11.962 [2024-09-27 15:08:13.703987] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:03:11.962 [2024-09-27 15:08:13.704034] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640804 ] 00:03:11.962 [2024-09-27 15:08:13.786963] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:12.221 [2024-09-27 15:08:13.873669] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1640804 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 1640804 ']' 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 1640804 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1640804 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1640804' 00:03:17.504 killing process with pid 1640804 00:03:17.504 15:08:18 
skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 1640804 00:03:17.504 15:08:18 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 1640804 00:03:17.504 00:03:17.504 real 0m5.440s 00:03:17.504 user 0m5.143s 00:03:17.504 sys 0m0.339s 00:03:17.504 15:08:19 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:17.504 15:08:19 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:17.504 ************************************ 00:03:17.504 END TEST skip_rpc 00:03:17.504 ************************************ 00:03:17.504 15:08:19 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:03:17.504 15:08:19 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:17.504 15:08:19 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:17.504 15:08:19 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:17.504 ************************************ 00:03:17.504 START TEST skip_rpc_with_json 00:03:17.504 ************************************ 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1641558 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1641558 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 1641558 ']' 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:17.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:17.504 15:08:19 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:03:17.504 [2024-09-27 15:08:19.233092] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
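The target started here (spdk_pid=1641558) serves skip_rpc_with_json: the test first confirms no TCP transport exists, creates one over RPC, saves the running configuration, and then proves the saved JSON replays on a fresh target that runs without an RPC server. A condensed sketch of that flow under the CONFIG_PATH/LOG_PATH paths defined at skip_rpc.sh@11-12 (the exact redirections in the script may differ; this only mirrors the commands visible in this log):

  rpc_cmd nvmf_get_transports --trtype tcp      # expected to fail: transport 'tcp' does not exist yet
  rpc_cmd nvmf_create_transport -t tcp          # the target then logs 'TCP Transport Init'
  rpc_cmd save_config > "$CONFIG_PATH"          # sketch: capture the live config as test/rpc/config.json
  spdk_tgt --no-rpc-server -m 0x1 --json "$CONFIG_PATH" > "$LOG_PATH" &   # replay the saved config with no RPC server
  grep -q 'TCP Transport Init' "$LOG_PATH"      # skip_rpc.sh@51: the transport must come back from the JSON alone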
00:03:17.504 [2024-09-27 15:08:19.233150] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1641558 ] 00:03:17.504 [2024-09-27 15:08:19.317952] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:17.763 [2024-09-27 15:08:19.405937] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:03:18.331 [2024-09-27 15:08:20.109857] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:03:18.331 request: 00:03:18.331 { 00:03:18.331 "trtype": "tcp", 00:03:18.331 "method": "nvmf_get_transports", 00:03:18.331 "req_id": 1 00:03:18.331 } 00:03:18.331 Got JSON-RPC error response 00:03:18.331 response: 00:03:18.331 { 00:03:18.331 "code": -19, 00:03:18.331 "message": "No such device" 00:03:18.331 } 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:03:18.331 [2024-09-27 15:08:20.121959] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:18.331 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:03:18.591 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:18.591 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/config.json 00:03:18.591 { 00:03:18.591 "subsystems": [ 00:03:18.591 { 00:03:18.591 "subsystem": "fsdev", 00:03:18.591 "config": [ 00:03:18.591 { 00:03:18.591 "method": "fsdev_set_opts", 00:03:18.591 "params": { 00:03:18.591 "fsdev_io_pool_size": 65535, 00:03:18.591 "fsdev_io_cache_size": 256 00:03:18.591 } 00:03:18.591 } 00:03:18.591 ] 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "subsystem": "keyring", 00:03:18.591 "config": [] 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "subsystem": "iobuf", 00:03:18.591 "config": [ 00:03:18.591 { 00:03:18.591 "method": "iobuf_set_options", 00:03:18.591 "params": { 00:03:18.591 "small_pool_count": 8192, 00:03:18.591 "large_pool_count": 1024, 00:03:18.591 "small_bufsize": 8192, 00:03:18.591 "large_bufsize": 135168 00:03:18.591 } 00:03:18.591 } 00:03:18.591 ] 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "subsystem": "sock", 00:03:18.591 "config": [ 00:03:18.591 { 00:03:18.591 "method": 
"sock_set_default_impl", 00:03:18.591 "params": { 00:03:18.591 "impl_name": "posix" 00:03:18.591 } 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "method": "sock_impl_set_options", 00:03:18.591 "params": { 00:03:18.591 "impl_name": "ssl", 00:03:18.591 "recv_buf_size": 4096, 00:03:18.591 "send_buf_size": 4096, 00:03:18.591 "enable_recv_pipe": true, 00:03:18.591 "enable_quickack": false, 00:03:18.591 "enable_placement_id": 0, 00:03:18.591 "enable_zerocopy_send_server": true, 00:03:18.591 "enable_zerocopy_send_client": false, 00:03:18.591 "zerocopy_threshold": 0, 00:03:18.591 "tls_version": 0, 00:03:18.591 "enable_ktls": false 00:03:18.591 } 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "method": "sock_impl_set_options", 00:03:18.591 "params": { 00:03:18.591 "impl_name": "posix", 00:03:18.591 "recv_buf_size": 2097152, 00:03:18.591 "send_buf_size": 2097152, 00:03:18.591 "enable_recv_pipe": true, 00:03:18.591 "enable_quickack": false, 00:03:18.591 "enable_placement_id": 0, 00:03:18.591 "enable_zerocopy_send_server": true, 00:03:18.591 "enable_zerocopy_send_client": false, 00:03:18.591 "zerocopy_threshold": 0, 00:03:18.591 "tls_version": 0, 00:03:18.591 "enable_ktls": false 00:03:18.591 } 00:03:18.591 } 00:03:18.591 ] 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "subsystem": "vmd", 00:03:18.591 "config": [] 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "subsystem": "accel", 00:03:18.591 "config": [ 00:03:18.591 { 00:03:18.591 "method": "accel_set_options", 00:03:18.591 "params": { 00:03:18.591 "small_cache_size": 128, 00:03:18.591 "large_cache_size": 16, 00:03:18.591 "task_count": 2048, 00:03:18.591 "sequence_count": 2048, 00:03:18.591 "buf_count": 2048 00:03:18.591 } 00:03:18.591 } 00:03:18.591 ] 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "subsystem": "bdev", 00:03:18.591 "config": [ 00:03:18.591 { 00:03:18.591 "method": "bdev_set_options", 00:03:18.591 "params": { 00:03:18.591 "bdev_io_pool_size": 65535, 00:03:18.591 "bdev_io_cache_size": 256, 00:03:18.591 "bdev_auto_examine": true, 00:03:18.591 "iobuf_small_cache_size": 128, 00:03:18.591 "iobuf_large_cache_size": 16 00:03:18.591 } 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "method": "bdev_raid_set_options", 00:03:18.591 "params": { 00:03:18.591 "process_window_size_kb": 1024, 00:03:18.591 "process_max_bandwidth_mb_sec": 0 00:03:18.591 } 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "method": "bdev_iscsi_set_options", 00:03:18.591 "params": { 00:03:18.591 "timeout_sec": 30 00:03:18.591 } 00:03:18.591 }, 00:03:18.591 { 00:03:18.591 "method": "bdev_nvme_set_options", 00:03:18.591 "params": { 00:03:18.591 "action_on_timeout": "none", 00:03:18.591 "timeout_us": 0, 00:03:18.591 "timeout_admin_us": 0, 00:03:18.591 "keep_alive_timeout_ms": 10000, 00:03:18.591 "arbitration_burst": 0, 00:03:18.591 "low_priority_weight": 0, 00:03:18.591 "medium_priority_weight": 0, 00:03:18.591 "high_priority_weight": 0, 00:03:18.591 "nvme_adminq_poll_period_us": 10000, 00:03:18.591 "nvme_ioq_poll_period_us": 0, 00:03:18.591 "io_queue_requests": 0, 00:03:18.591 "delay_cmd_submit": true, 00:03:18.591 "transport_retry_count": 4, 00:03:18.591 "bdev_retry_count": 3, 00:03:18.591 "transport_ack_timeout": 0, 00:03:18.592 "ctrlr_loss_timeout_sec": 0, 00:03:18.592 "reconnect_delay_sec": 0, 00:03:18.592 "fast_io_fail_timeout_sec": 0, 00:03:18.592 "disable_auto_failback": false, 00:03:18.592 "generate_uuids": false, 00:03:18.592 "transport_tos": 0, 00:03:18.592 "nvme_error_stat": false, 00:03:18.592 "rdma_srq_size": 0, 00:03:18.592 "io_path_stat": false, 00:03:18.592 
"allow_accel_sequence": false, 00:03:18.592 "rdma_max_cq_size": 0, 00:03:18.592 "rdma_cm_event_timeout_ms": 0, 00:03:18.592 "dhchap_digests": [ 00:03:18.592 "sha256", 00:03:18.592 "sha384", 00:03:18.592 "sha512" 00:03:18.592 ], 00:03:18.592 "dhchap_dhgroups": [ 00:03:18.592 "null", 00:03:18.592 "ffdhe2048", 00:03:18.592 "ffdhe3072", 00:03:18.592 "ffdhe4096", 00:03:18.592 "ffdhe6144", 00:03:18.592 "ffdhe8192" 00:03:18.592 ] 00:03:18.592 } 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "method": "bdev_nvme_set_hotplug", 00:03:18.592 "params": { 00:03:18.592 "period_us": 100000, 00:03:18.592 "enable": false 00:03:18.592 } 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "method": "bdev_wait_for_examine" 00:03:18.592 } 00:03:18.592 ] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "scsi", 00:03:18.592 "config": null 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "scheduler", 00:03:18.592 "config": [ 00:03:18.592 { 00:03:18.592 "method": "framework_set_scheduler", 00:03:18.592 "params": { 00:03:18.592 "name": "static" 00:03:18.592 } 00:03:18.592 } 00:03:18.592 ] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "vhost_scsi", 00:03:18.592 "config": [] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "vhost_blk", 00:03:18.592 "config": [] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "ublk", 00:03:18.592 "config": [] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "nbd", 00:03:18.592 "config": [] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "nvmf", 00:03:18.592 "config": [ 00:03:18.592 { 00:03:18.592 "method": "nvmf_set_config", 00:03:18.592 "params": { 00:03:18.592 "discovery_filter": "match_any", 00:03:18.592 "admin_cmd_passthru": { 00:03:18.592 "identify_ctrlr": false 00:03:18.592 }, 00:03:18.592 "dhchap_digests": [ 00:03:18.592 "sha256", 00:03:18.592 "sha384", 00:03:18.592 "sha512" 00:03:18.592 ], 00:03:18.592 "dhchap_dhgroups": [ 00:03:18.592 "null", 00:03:18.592 "ffdhe2048", 00:03:18.592 "ffdhe3072", 00:03:18.592 "ffdhe4096", 00:03:18.592 "ffdhe6144", 00:03:18.592 "ffdhe8192" 00:03:18.592 ] 00:03:18.592 } 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "method": "nvmf_set_max_subsystems", 00:03:18.592 "params": { 00:03:18.592 "max_subsystems": 1024 00:03:18.592 } 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "method": "nvmf_set_crdt", 00:03:18.592 "params": { 00:03:18.592 "crdt1": 0, 00:03:18.592 "crdt2": 0, 00:03:18.592 "crdt3": 0 00:03:18.592 } 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "method": "nvmf_create_transport", 00:03:18.592 "params": { 00:03:18.592 "trtype": "TCP", 00:03:18.592 "max_queue_depth": 128, 00:03:18.592 "max_io_qpairs_per_ctrlr": 127, 00:03:18.592 "in_capsule_data_size": 4096, 00:03:18.592 "max_io_size": 131072, 00:03:18.592 "io_unit_size": 131072, 00:03:18.592 "max_aq_depth": 128, 00:03:18.592 "num_shared_buffers": 511, 00:03:18.592 "buf_cache_size": 4294967295, 00:03:18.592 "dif_insert_or_strip": false, 00:03:18.592 "zcopy": false, 00:03:18.592 "c2h_success": true, 00:03:18.592 "sock_priority": 0, 00:03:18.592 "abort_timeout_sec": 1, 00:03:18.592 "ack_timeout": 0, 00:03:18.592 "data_wr_pool_size": 0 00:03:18.592 } 00:03:18.592 } 00:03:18.592 ] 00:03:18.592 }, 00:03:18.592 { 00:03:18.592 "subsystem": "iscsi", 00:03:18.592 "config": [ 00:03:18.592 { 00:03:18.592 "method": "iscsi_set_options", 00:03:18.592 "params": { 00:03:18.592 "node_base": "iqn.2016-06.io.spdk", 00:03:18.592 "max_sessions": 128, 00:03:18.592 "max_connections_per_session": 2, 00:03:18.592 "max_queue_depth": 64, 00:03:18.592 "default_time2wait": 2, 
00:03:18.592 "default_time2retain": 20, 00:03:18.592 "first_burst_length": 8192, 00:03:18.592 "immediate_data": true, 00:03:18.592 "allow_duplicated_isid": false, 00:03:18.592 "error_recovery_level": 0, 00:03:18.592 "nop_timeout": 60, 00:03:18.592 "nop_in_interval": 30, 00:03:18.592 "disable_chap": false, 00:03:18.592 "require_chap": false, 00:03:18.592 "mutual_chap": false, 00:03:18.592 "chap_group": 0, 00:03:18.592 "max_large_datain_per_connection": 64, 00:03:18.592 "max_r2t_per_connection": 4, 00:03:18.592 "pdu_pool_size": 36864, 00:03:18.592 "immediate_data_pool_size": 16384, 00:03:18.592 "data_out_pool_size": 2048 00:03:18.592 } 00:03:18.592 } 00:03:18.592 ] 00:03:18.592 } 00:03:18.592 ] 00:03:18.592 } 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1641558 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1641558 ']' 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1641558 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1641558 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1641558' 00:03:18.592 killing process with pid 1641558 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1641558 00:03:18.592 15:08:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1641558 00:03:19.163 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1641751 00:03:19.163 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/config.json 00:03:19.163 15:08:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1641751 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 1641751 ']' 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 1641751 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1641751 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1641751' 00:03:24.443 killing process with pid 1641751 00:03:24.443 15:08:25 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 1641751 00:03:24.443 15:08:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 1641751 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/log.txt 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/log.txt 00:03:24.443 00:03:24.443 real 0m6.997s 00:03:24.443 user 0m6.782s 00:03:24.443 sys 0m0.754s 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:03:24.443 ************************************ 00:03:24.443 END TEST skip_rpc_with_json 00:03:24.443 ************************************ 00:03:24.443 15:08:26 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:03:24.443 15:08:26 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:24.443 15:08:26 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:24.443 15:08:26 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:24.443 ************************************ 00:03:24.443 START TEST skip_rpc_with_delay 00:03:24.443 ************************************ 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:03:24.443 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:03:24.703 [2024-09-27 15:08:26.314152] app.c: 840:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
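The skip_rpc_with_delay case above is a negative test: spdk_tgt is expected to refuse --wait-for-rpc when --no-rpc-server disables the RPC server it would be waiting on, which is exactly the app.c error in the preceding entry. A stand-alone sketch of that expectation, assuming an in-tree build of spdk_tgt (the harness itself wraps the call in its NOT() helper, as the trace shows):

    # Assert that the flag combination is rejected; success here would be a bug.
    if ./build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
        echo "unexpected: target started without an RPC server to wait on" >&2
        exit 1
    fi
    echo "spdk_tgt rejected --wait-for-rpc with --no-rpc-server, as expected"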
00:03:24.703 [2024-09-27 15:08:26.314223] app.c: 719:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:03:24.703 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:03:24.703 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:03:24.703 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:03:24.703 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:03:24.703 00:03:24.703 real 0m0.072s 00:03:24.703 user 0m0.037s 00:03:24.703 sys 0m0.035s 00:03:24.703 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:24.703 15:08:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:03:24.703 ************************************ 00:03:24.703 END TEST skip_rpc_with_delay 00:03:24.703 ************************************ 00:03:24.703 15:08:26 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:03:24.703 15:08:26 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:03:24.703 15:08:26 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:03:24.703 15:08:26 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:24.703 15:08:26 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:24.703 15:08:26 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:24.703 ************************************ 00:03:24.703 START TEST exit_on_failed_rpc_init 00:03:24.703 ************************************ 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1642536 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 1642536 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 1642536 ']' 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:24.703 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:24.703 15:08:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:03:24.703 [2024-09-27 15:08:26.471600] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
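The waitforlisten call traced above comes from autotest_common.sh; it blocks until the freshly started exit_on_failed_rpc_init target answers on its RPC socket, giving up after max_retries=100 as shown in the trace. A simplified stand-in for that behaviour, not the actual helper, assuming the in-tree rpc.py client and the default socket path:

    # Poll the RPC socket until the target answers, bailing out if the process dies first.
    wait_for_rpc() {
        local pid=$1 rpc_sock=${2:-/var/tmp/spdk.sock} i
        for ((i = 0; i < 100; i++)); do
            ./scripts/rpc.py -s "$rpc_sock" rpc_get_methods &> /dev/null && return 0
            kill -0 "$pid" 2> /dev/null || return 1
            sleep 0.5
        done
        return 1
    }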
00:03:24.703 [2024-09-27 15:08:26.471661] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1642536 ] 00:03:24.963 [2024-09-27 15:08:26.554615] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:24.963 [2024-09-27 15:08:26.644017] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:03:25.533 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:03:25.533 [2024-09-27 15:08:27.370916] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:03:25.533 [2024-09-27 15:08:27.370974] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1642721 ] 00:03:25.793 [2024-09-27 15:08:27.452814] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:25.793 [2024-09-27 15:08:27.536843] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:03:25.793 [2024-09-27 15:08:27.536914] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
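The error above is the failure exit_on_failed_rpc_init is designed to provoke: both spdk_tgt instances default to /var/tmp/spdk.sock, so the second one (-m 0x2) cannot bind its RPC listener and shuts down. Outside of this negative test, a second target instance would be given its own socket with -r, as the json_config run further below does; an illustrative invocation, not part of the harness:

    # First instance owns the default RPC socket.
    ./build/bin/spdk_tgt -m 0x1 &
    # A coexisting second instance needs a distinct core mask and its own RPC socket.
    ./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk2.sock &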
00:03:25.793 [2024-09-27 15:08:27.536926] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:03:25.793 [2024-09-27 15:08:27.536933] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1642536 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 1642536 ']' 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 1642536 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:25.793 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1642536 00:03:26.052 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:26.052 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:26.052 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1642536' 00:03:26.052 killing process with pid 1642536 00:03:26.052 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 1642536 00:03:26.052 15:08:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 1642536 00:03:26.312 00:03:26.312 real 0m1.630s 00:03:26.312 user 0m1.828s 00:03:26.312 sys 0m0.527s 00:03:26.312 15:08:28 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:26.312 15:08:28 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:03:26.312 ************************************ 00:03:26.312 END TEST exit_on_failed_rpc_init 00:03:26.312 ************************************ 00:03:26.312 15:08:28 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc/config.json 00:03:26.312 00:03:26.312 real 0m14.686s 00:03:26.312 user 0m14.032s 00:03:26.312 sys 0m2.004s 00:03:26.312 15:08:28 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:26.312 15:08:28 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:26.312 ************************************ 00:03:26.312 END TEST skip_rpc 00:03:26.312 ************************************ 00:03:26.312 15:08:28 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:03:26.312 15:08:28 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:26.312 15:08:28 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:26.312 15:08:28 -- 
common/autotest_common.sh@10 -- # set +x 00:03:26.571 ************************************ 00:03:26.571 START TEST rpc_client 00:03:26.571 ************************************ 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:03:26.571 * Looking for test storage... 00:03:26.571 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_client 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1681 -- # lcov --version 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@345 -- # : 1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@353 -- # local d=1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@355 -- # echo 1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@353 -- # local d=2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@355 -- # echo 2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:26.571 15:08:28 rpc_client -- scripts/common.sh@368 -- # return 0 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:26.571 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.571 --rc genhtml_branch_coverage=1 00:03:26.571 --rc genhtml_function_coverage=1 00:03:26.571 --rc genhtml_legend=1 00:03:26.571 --rc geninfo_all_blocks=1 00:03:26.571 --rc geninfo_unexecuted_blocks=1 00:03:26.571 00:03:26.571 ' 00:03:26.571 15:08:28 rpc_client -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:26.571 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.571 --rc genhtml_branch_coverage=1 00:03:26.571 --rc genhtml_function_coverage=1 00:03:26.571 --rc genhtml_legend=1 00:03:26.572 --rc geninfo_all_blocks=1 00:03:26.572 --rc geninfo_unexecuted_blocks=1 00:03:26.572 00:03:26.572 ' 00:03:26.572 15:08:28 rpc_client -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:26.572 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.572 --rc genhtml_branch_coverage=1 00:03:26.572 --rc genhtml_function_coverage=1 00:03:26.572 --rc genhtml_legend=1 00:03:26.572 --rc geninfo_all_blocks=1 00:03:26.572 --rc geninfo_unexecuted_blocks=1 00:03:26.572 00:03:26.572 ' 00:03:26.572 15:08:28 rpc_client -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:26.572 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.572 --rc genhtml_branch_coverage=1 00:03:26.572 --rc genhtml_function_coverage=1 00:03:26.572 --rc genhtml_legend=1 00:03:26.572 --rc geninfo_all_blocks=1 00:03:26.572 --rc geninfo_unexecuted_blocks=1 00:03:26.572 00:03:26.572 ' 00:03:26.572 15:08:28 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:03:26.572 OK 00:03:26.572 15:08:28 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:03:26.572 00:03:26.572 real 0m0.230s 00:03:26.572 user 0m0.129s 00:03:26.572 sys 0m0.119s 00:03:26.572 15:08:28 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:26.572 15:08:28 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:03:26.572 ************************************ 00:03:26.572 END TEST rpc_client 00:03:26.572 ************************************ 00:03:26.831 15:08:28 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_config.sh 00:03:26.831 
15:08:28 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:26.831 15:08:28 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:26.831 15:08:28 -- common/autotest_common.sh@10 -- # set +x 00:03:26.831 ************************************ 00:03:26.831 START TEST json_config 00:03:26.831 ************************************ 00:03:26.831 15:08:28 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_config.sh 00:03:26.831 15:08:28 json_config -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:26.831 15:08:28 json_config -- common/autotest_common.sh@1681 -- # lcov --version 00:03:26.831 15:08:28 json_config -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:26.831 15:08:28 json_config -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:26.831 15:08:28 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:26.831 15:08:28 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:26.831 15:08:28 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:26.831 15:08:28 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:03:26.831 15:08:28 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:03:26.831 15:08:28 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:03:26.831 15:08:28 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:03:26.831 15:08:28 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:03:26.831 15:08:28 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:26.831 15:08:28 json_config -- scripts/common.sh@344 -- # case "$op" in 00:03:26.831 15:08:28 json_config -- scripts/common.sh@345 -- # : 1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:26.831 15:08:28 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:26.831 15:08:28 json_config -- scripts/common.sh@365 -- # decimal 1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@353 -- # local d=1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:26.831 15:08:28 json_config -- scripts/common.sh@355 -- # echo 1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:03:26.831 15:08:28 json_config -- scripts/common.sh@366 -- # decimal 2 00:03:26.831 15:08:28 json_config -- scripts/common.sh@353 -- # local d=2 00:03:26.831 15:08:28 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:26.831 15:08:28 json_config -- scripts/common.sh@355 -- # echo 2 00:03:26.832 15:08:28 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:03:26.832 15:08:28 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:26.832 15:08:28 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:26.832 15:08:28 json_config -- scripts/common.sh@368 -- # return 0 00:03:26.832 15:08:28 json_config -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:26.832 15:08:28 json_config -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:26.832 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.832 --rc genhtml_branch_coverage=1 00:03:26.832 --rc genhtml_function_coverage=1 00:03:26.832 --rc genhtml_legend=1 00:03:26.832 --rc geninfo_all_blocks=1 00:03:26.832 --rc geninfo_unexecuted_blocks=1 00:03:26.832 00:03:26.832 ' 00:03:26.832 15:08:28 json_config -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:26.832 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.832 --rc genhtml_branch_coverage=1 00:03:26.832 --rc genhtml_function_coverage=1 00:03:26.832 --rc genhtml_legend=1 00:03:26.832 --rc geninfo_all_blocks=1 00:03:26.832 --rc geninfo_unexecuted_blocks=1 00:03:26.832 00:03:26.832 ' 00:03:26.832 15:08:28 json_config -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:26.832 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.832 --rc genhtml_branch_coverage=1 00:03:26.832 --rc genhtml_function_coverage=1 00:03:26.832 --rc genhtml_legend=1 00:03:26.832 --rc geninfo_all_blocks=1 00:03:26.832 --rc geninfo_unexecuted_blocks=1 00:03:26.832 00:03:26.832 ' 00:03:26.832 15:08:28 json_config -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:26.832 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.832 --rc genhtml_branch_coverage=1 00:03:26.832 --rc genhtml_function_coverage=1 00:03:26.832 --rc genhtml_legend=1 00:03:26.832 --rc geninfo_all_blocks=1 00:03:26.832 --rc geninfo_unexecuted_blocks=1 00:03:26.832 00:03:26.832 ' 00:03:26.832 15:08:28 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@7 -- # uname -s 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:03:26.832 15:08:28 json_config -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 
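The cmp_versions trace repeated here (and in the rpc_client run earlier) is how scripts/common.sh compares two dotted version strings, in this case to decide which lcov coverage flags to export: each string is split on '.', '-' and ':' and the fields are compared numerically, left to right. A condensed sketch of the same comparison (the function name is illustrative, not the script's):

    # True (exit 0) if version $1 sorts strictly before version $2.
    version_lt() {
        local IFS=.-:
        local -a a=($1) b=($2)
        local i
        for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1   # equal versions are not "less than"
    }

    version_lt 1.15 2 && echo "1.15 < 2"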
00:03:26.832 15:08:28 json_config -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@19 -- # NET_TYPE=phy-fallback 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:03:27.092 15:08:28 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:03:27.092 15:08:28 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:27.092 15:08:28 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:27.092 15:08:28 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:27.092 15:08:28 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.092 15:08:28 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.092 15:08:28 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.092 15:08:28 json_config -- paths/export.sh@5 -- # export PATH 00:03:27.092 15:08:28 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:03:27.092 15:08:28 json_config -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:03:27.092 15:08:28 json_config -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:03:27.092 15:08:28 json_config -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:03:27.092 15:08:28 
json_config -- nvmf/common.sh@50 -- # : 0 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:03:27.092 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:03:27.092 15:08:28 json_config -- nvmf/common.sh@54 -- # have_pci_nics=0 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/common.sh 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:03:27.092 15:08:28 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_initiator_config.json') 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@362 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@363 -- # echo 'INFO: JSON configuration test init' 00:03:27.093 INFO: JSON configuration test init 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@364 -- # json_config_test_init 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@269 -- # timing_enter json_config_test_init 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@270 -- # timing_enter json_config_setup_target 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:27.093 15:08:28 json_config -- 
common/autotest_common.sh@10 -- # set +x 00:03:27.093 15:08:28 json_config -- json_config/json_config.sh@272 -- # json_config_test_start_app target --wait-for-rpc 00:03:27.093 15:08:28 json_config -- json_config/common.sh@9 -- # local app=target 00:03:27.093 15:08:28 json_config -- json_config/common.sh@10 -- # shift 00:03:27.093 15:08:28 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:03:27.093 15:08:28 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:03:27.093 15:08:28 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:03:27.093 15:08:28 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:03:27.093 15:08:28 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:03:27.093 15:08:28 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1643029 00:03:27.093 15:08:28 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:03:27.093 Waiting for target to run... 00:03:27.093 15:08:28 json_config -- json_config/common.sh@25 -- # waitforlisten 1643029 /var/tmp/spdk_tgt.sock 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@831 -- # '[' -z 1643029 ']' 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:03:27.093 15:08:28 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:03:27.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:27.093 15:08:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:27.093 [2024-09-27 15:08:28.775618] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
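For the json_config suite the target above is started idle: -s 1024 caps its memory pool, -r gives it a dedicated RPC socket, and --wait-for-rpc holds subsystem initialization until a configuration arrives over RPC; the entries that follow then generate one from the local NVMe devices and load it. Condensed, and assuming the in-tree scripts, that handshake is:

    sock=/var/tmp/spdk_tgt.sock
    ./build/bin/spdk_tgt -m 0x1 -s 1024 -r "$sock" --wait-for-rpc &

    # Build a bdev/subsystem config from the attached NVMe drives and feed it
    # to the waiting target; load_config replays each method over the socket.
    ./scripts/gen_nvme.sh --json-with-subsystems | ./scripts/rpc.py -s "$sock" load_config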
00:03:27.093 [2024-09-27 15:08:28.775682] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1643029 ] 00:03:27.480 [2024-09-27 15:08:29.096221] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:27.480 [2024-09-27 15:08:29.168394] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:28.049 15:08:29 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:28.049 15:08:29 json_config -- common/autotest_common.sh@864 -- # return 0 00:03:28.049 15:08:29 json_config -- json_config/common.sh@26 -- # echo '' 00:03:28.049 00:03:28.049 15:08:29 json_config -- json_config/json_config.sh@276 -- # create_accel_config 00:03:28.049 15:08:29 json_config -- json_config/json_config.sh@100 -- # timing_enter create_accel_config 00:03:28.049 15:08:29 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:28.049 15:08:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:28.049 15:08:29 json_config -- json_config/json_config.sh@102 -- # [[ 0 -eq 1 ]] 00:03:28.049 15:08:29 json_config -- json_config/json_config.sh@108 -- # timing_exit create_accel_config 00:03:28.049 15:08:29 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:28.049 15:08:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:28.049 15:08:29 json_config -- json_config/json_config.sh@280 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:03:28.049 15:08:29 json_config -- json_config/json_config.sh@281 -- # tgt_rpc load_config 00:03:28.049 15:08:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@283 -- # tgt_check_notification_types 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:03:28.618 15:08:30 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:28.618 15:08:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@47 -- # [[ y == y ]] 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@48 -- # enabled_types+=("fsdev_register" "fsdev_unregister") 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@51 -- # tgt_rpc notify_get_types 00:03:28.618 15:08:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:03:28.618 15:08:30 json_config -- json_config/json_config.sh@51 -- # jq -r '.[]' 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@51 -- # get_types=('fsdev_register' 'fsdev_unregister' 'bdev_register' 'bdev_unregister') 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@51 -- # local get_types 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@53 -- # local type_diff 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@54 -- 
# echo bdev_register bdev_unregister fsdev_register fsdev_unregister fsdev_register fsdev_unregister bdev_register bdev_unregister 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@54 -- # tr ' ' '\n' 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@54 -- # sort 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@54 -- # uniq -u 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@54 -- # type_diff= 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@56 -- # [[ -n '' ]] 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@61 -- # timing_exit tgt_check_notification_types 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@62 -- # return 0 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@285 -- # [[ 0 -eq 1 ]] 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@289 -- # [[ 0 -eq 1 ]] 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@293 -- # [[ 0 -eq 1 ]] 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@297 -- # [[ 1 -eq 1 ]] 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@298 -- # create_nvmf_subsystem_config 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@237 -- # timing_enter create_nvmf_subsystem_config 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@239 -- # NVMF_FIRST_TARGET_IP=127.0.0.1 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@240 -- # [[ rdma == \r\d\m\a ]] 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@241 -- # TEST_TRANSPORT=rdma 00:03:28.877 15:08:30 json_config -- json_config/json_config.sh@241 -- # nvmftestinit 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@292 -- # prepare_net_devs 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@254 -- # local -g is_hw=no 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@256 -- # remove_target_ns 00:03:28.877 15:08:30 json_config -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 13> /dev/null' 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@22 -- # _remove_target_ns 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@258 -- # [[ phy-fallback != virt ]] 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:03:28.877 15:08:30 json_config -- nvmf/common.sh@125 -- # xtrace_disable 00:03:28.877 15:08:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@131 -- # pci_devs=() 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@131 -- # local -a pci_devs 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@132 -- # pci_net_devs=() 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@133 -- # pci_drivers=() 
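The tgt_check_notification_types step traced just above compares the expected notification types with what notify_get_types returns: both lists are echoed together, split one name per line with tr, sorted, and passed through uniq -u, so only names that occur in one list but not the other survive; an empty result means the sets match and the check returns 0. The same idiom in isolation:

    expected="bdev_register bdev_unregister fsdev_register fsdev_unregister"
    reported="fsdev_register fsdev_unregister bdev_register bdev_unregister"

    # uniq -u keeps only lines that occur exactly once, i.e. the symmetric difference.
    type_diff=$(echo $expected $reported | tr ' ' '\n' | sort | uniq -u)
    [[ -z $type_diff ]] && echo "notification types match"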
00:03:35.450 15:08:37 json_config -- nvmf/common.sh@133 -- # local -A pci_drivers 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@135 -- # net_devs=() 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@135 -- # local -ga net_devs 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@136 -- # e810=() 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@136 -- # local -ga e810 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@137 -- # x722=() 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@137 -- # local -ga x722 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@138 -- # mlx=() 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@138 -- # local -ga mlx 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:03:35.450 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:03:35.450 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@186 
-- # [[ mlx5_core == unbound ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:03:35.450 Found net devices under 0000:18:00.0: mlx_0_0 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:03:35.450 Found net devices under 0000:18:00.1: mlx_0_1 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@249 -- # get_rdma_if_list 00:03:35.450 15:08:37 json_config -- nvmf/common.sh@75 -- # rdma_devs=() 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@89 -- # continue 2 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@86 -- # for 
rxe_net_dev in "${rxe_net_devs[@]}" 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@89 -- # continue 2 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@258 -- # is_hw=yes 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@61 -- # uname 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@65 -- # modprobe ib_cm 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@66 -- # modprobe ib_core 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@67 -- # modprobe ib_umad 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@69 -- # modprobe iw_cm 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:03:35.451 15:08:37 json_config -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@27 -- # local -gA dev_map 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@28 -- # local -g _dev 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@44 -- # ips=() 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@58 -- # key_initiator=target1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:03:35.451 15:08:37 json_config 
-- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@11 -- # local val=167772161 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:03:35.451 10.0.0.1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@11 -- # local val=167772162 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:03:35.451 10.0.0.2 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:03:35.451 15:08:37 
json_config -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@38 -- # ping_ips 1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@168 -- # get_net_dev target0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@107 -- # local dev=target0 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:03:35.451 15:08:37 json_config -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:03:35.451 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:03:35.451 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.033 ms 00:03:35.451 00:03:35.451 --- 10.0.0.2 ping statistics --- 00:03:35.452 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:03:35.452 rtt min/avg/max/mdev = 0.033/0.033/0.033/0.000 ms 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@168 -- # get_net_dev target0 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@107 -- # local dev=target0 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:03:35.452 15:08:37 json_config -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:03:35.712 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:03:35.712 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.027 ms 00:03:35.712 00:03:35.712 --- 10.0.0.2 ping statistics --- 00:03:35.712 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:03:35.712 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@98 -- # (( pair++ )) 00:03:35.712 15:08:37 json_config -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@266 -- # return 0 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # get_net_dev target0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@107 -- # local dev=target0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # get_net_dev target1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@107 -- # local dev=target1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:03:35.713 
15:08:37 json_config -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # get_net_dev target0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@107 -- # local dev=target0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # get_net_dev target1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@107 -- # local dev=target1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:03:35.713 15:08:37 json_config -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:03:35.713 15:08:37 json_config -- nvmf/common.sh@317 -- # modprobe nvme-rdma 
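Annotation (sketch, not part of the trace): the 10.0.0.1/10.0.0.2 addresses assigned above are derived from the integer pool 0x0a000001; setup.sh unpacks the value into four octets and hands them to printf, then applies the result with plain iproute2 commands. A minimal standalone sketch of that conversion, assuming only the printf pattern visible in the trace (the helper name is illustrative, not the script's):

    val_to_ip_sketch() {
      local val=$1
      # unpack a 32-bit value into dotted-quad form, as the traced printf does
      printf '%u.%u.%u.%u\n' \
        $(( (val >> 24) & 0xff )) $(( (val >> 16) & 0xff )) \
        $(( (val >>  8) & 0xff )) $((  val         & 0xff ))
    }
    val_to_ip_sketch 167772161   # -> 10.0.0.1, assigned to mlx_0_0
    val_to_ip_sketch 167772162   # -> 10.0.0.2, assigned to mlx_0_1
    # per device the trace then runs, in effect:
    #   ip addr add "$ip/24" dev "$dev"
    #   echo "$ip" | tee /sys/class/net/$dev/ifalias
    #   ip link set "$dev" up

The ifalias write matters later: get_ip_address reads the address back by cat-ing /sys/class/net/*/ifalias rather than parsing `ip addr`, which is why the trace keeps returning to that file.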
00:03:35.713 15:08:37 json_config -- json_config/json_config.sh@244 -- # [[ -z 10.0.0.2 ]] 00:03:35.713 15:08:37 json_config -- json_config/json_config.sh@249 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocForNvmf0 00:03:35.713 15:08:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocForNvmf0 00:03:35.973 MallocForNvmf0 00:03:35.973 15:08:37 json_config -- json_config/json_config.sh@250 -- # tgt_rpc bdev_malloc_create 4 1024 --name MallocForNvmf1 00:03:35.973 15:08:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 4 1024 --name MallocForNvmf1 00:03:35.973 MallocForNvmf1 00:03:36.232 15:08:37 json_config -- json_config/json_config.sh@252 -- # tgt_rpc nvmf_create_transport -t rdma -u 8192 -c 0 00:03:36.232 15:08:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock nvmf_create_transport -t rdma -u 8192 -c 0 00:03:36.232 [2024-09-27 15:08:38.016546] rdma.c:2734:nvmf_rdma_create: *WARNING*: In capsule data size is set to 256, this is minimum size required to support msdbd=16 00:03:36.232 [2024-09-27 15:08:38.041471] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x16501a0/0x1661670) succeed. 00:03:36.232 [2024-09-27 15:08:38.053022] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1652390/0x16e1700) succeed. 00:03:36.491 15:08:38 json_config -- json_config/json_config.sh@253 -- # tgt_rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:03:36.492 15:08:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:03:36.492 15:08:38 json_config -- json_config/json_config.sh@254 -- # tgt_rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 MallocForNvmf0 00:03:36.492 15:08:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 MallocForNvmf0 00:03:36.751 15:08:38 json_config -- json_config/json_config.sh@255 -- # tgt_rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 MallocForNvmf1 00:03:36.751 15:08:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 MallocForNvmf1 00:03:37.010 15:08:38 json_config -- json_config/json_config.sh@256 -- # tgt_rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:03:37.010 15:08:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:03:37.269 [2024-09-27 15:08:38.880115] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:03:37.269 15:08:38 json_config -- json_config/json_config.sh@258 -- # timing_exit create_nvmf_subsystem_config 00:03:37.269 15:08:38 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:37.269 15:08:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:37.269 15:08:38 json_config -- 
json_config/json_config.sh@300 -- # timing_exit json_config_setup_target 00:03:37.269 15:08:38 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:37.269 15:08:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:37.269 15:08:39 json_config -- json_config/json_config.sh@302 -- # [[ 0 -eq 1 ]] 00:03:37.269 15:08:39 json_config -- json_config/json_config.sh@307 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:03:37.269 15:08:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:03:37.528 MallocBdevForConfigChangeCheck 00:03:37.528 15:08:39 json_config -- json_config/json_config.sh@309 -- # timing_exit json_config_test_init 00:03:37.528 15:08:39 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:37.528 15:08:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:37.528 15:08:39 json_config -- json_config/json_config.sh@366 -- # tgt_rpc save_config 00:03:37.528 15:08:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:03:37.787 15:08:39 json_config -- json_config/json_config.sh@368 -- # echo 'INFO: shutting down applications...' 00:03:37.787 INFO: shutting down applications... 00:03:37.787 15:08:39 json_config -- json_config/json_config.sh@369 -- # [[ 0 -eq 1 ]] 00:03:37.787 15:08:39 json_config -- json_config/json_config.sh@375 -- # json_config_clear target 00:03:37.787 15:08:39 json_config -- json_config/json_config.sh@339 -- # [[ -n 22 ]] 00:03:37.787 15:08:39 json_config -- json_config/json_config.sh@340 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:03:38.356 Calling clear_iscsi_subsystem 00:03:38.356 Calling clear_nvmf_subsystem 00:03:38.356 Calling clear_nbd_subsystem 00:03:38.356 Calling clear_ublk_subsystem 00:03:38.356 Calling clear_vhost_blk_subsystem 00:03:38.356 Calling clear_vhost_scsi_subsystem 00:03:38.356 Calling clear_bdev_subsystem 00:03:38.356 15:08:40 json_config -- json_config/json_config.sh@344 -- # local config_filter=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py 00:03:38.356 15:08:40 json_config -- json_config/json_config.sh@350 -- # count=100 00:03:38.356 15:08:40 json_config -- json_config/json_config.sh@351 -- # '[' 100 -gt 0 ']' 00:03:38.356 15:08:40 json_config -- json_config/json_config.sh@352 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:03:38.356 15:08:40 json_config -- json_config/json_config.sh@352 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:03:38.356 15:08:40 json_config -- json_config/json_config.sh@352 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:03:38.615 15:08:40 json_config -- json_config/json_config.sh@352 -- # break 00:03:38.615 15:08:40 json_config -- json_config/json_config.sh@357 -- # '[' 100 -eq 0 ']' 00:03:38.615 15:08:40 json_config -- json_config/json_config.sh@376 -- # json_config_test_shutdown_app target 00:03:38.615 15:08:40 json_config -- json_config/common.sh@31 -- # local app=target 00:03:38.615 15:08:40 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:03:38.615 15:08:40 json_config -- 
json_config/common.sh@35 -- # [[ -n 1643029 ]] 00:03:38.615 15:08:40 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1643029 00:03:38.615 15:08:40 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:03:38.615 15:08:40 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:03:38.615 15:08:40 json_config -- json_config/common.sh@41 -- # kill -0 1643029 00:03:38.616 15:08:40 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:03:39.186 15:08:40 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:03:39.186 15:08:40 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:03:39.186 15:08:40 json_config -- json_config/common.sh@41 -- # kill -0 1643029 00:03:39.186 15:08:40 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:03:39.186 15:08:40 json_config -- json_config/common.sh@43 -- # break 00:03:39.186 15:08:40 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:03:39.186 15:08:40 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:03:39.186 SPDK target shutdown done 00:03:39.186 15:08:40 json_config -- json_config/json_config.sh@378 -- # echo 'INFO: relaunching applications...' 00:03:39.186 INFO: relaunching applications... 00:03:39.186 15:08:40 json_config -- json_config/json_config.sh@379 -- # json_config_test_start_app target --json /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:39.186 15:08:40 json_config -- json_config/common.sh@9 -- # local app=target 00:03:39.186 15:08:40 json_config -- json_config/common.sh@10 -- # shift 00:03:39.186 15:08:40 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:03:39.186 15:08:40 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:03:39.186 15:08:40 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:03:39.186 15:08:40 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:03:39.186 15:08:40 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:03:39.186 15:08:40 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1646740 00:03:39.186 15:08:40 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:03:39.186 Waiting for target to run... 00:03:39.186 15:08:40 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:39.186 15:08:40 json_config -- json_config/common.sh@25 -- # waitforlisten 1646740 /var/tmp/spdk_tgt.sock 00:03:39.186 15:08:40 json_config -- common/autotest_common.sh@831 -- # '[' -z 1646740 ']' 00:03:39.186 15:08:40 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:03:39.186 15:08:40 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:39.186 15:08:40 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:03:39.186 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:03:39.186 15:08:40 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:39.186 15:08:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:39.186 [2024-09-27 15:08:40.983403] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:03:39.186 [2024-09-27 15:08:40.983467] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1646740 ] 00:03:39.755 [2024-09-27 15:08:41.549580] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:40.015 [2024-09-27 15:08:41.646771] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:40.584 [2024-09-27 15:08:42.178636] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x2638dd0/0x25d22b0) succeed. 00:03:40.584 [2024-09-27 15:08:42.189001] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x263afc0/0x2667300) succeed. 00:03:40.584 [2024-09-27 15:08:42.239056] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:03:40.584 15:08:42 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:40.584 15:08:42 json_config -- common/autotest_common.sh@864 -- # return 0 00:03:40.584 15:08:42 json_config -- json_config/common.sh@26 -- # echo '' 00:03:40.584 00:03:40.584 15:08:42 json_config -- json_config/json_config.sh@380 -- # [[ 0 -eq 1 ]] 00:03:40.584 15:08:42 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: Checking if target configuration is the same...' 00:03:40.584 INFO: Checking if target configuration is the same... 00:03:40.584 15:08:42 json_config -- json_config/json_config.sh@385 -- # tgt_rpc save_config 00:03:40.584 15:08:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:03:40.584 15:08:42 json_config -- json_config/json_config.sh@385 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:40.584 + '[' 2 -ne 2 ']' 00:03:40.584 +++ dirname /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_diff.sh 00:03:40.584 ++ readlink -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/../.. 00:03:40.584 + rootdir=/var/jenkins/workspace/nvmf-phy-autotest/spdk 00:03:40.584 +++ basename /dev/fd/62 00:03:40.584 ++ mktemp /tmp/62.XXX 00:03:40.584 + tmp_file_1=/tmp/62.SN4 00:03:40.584 +++ basename /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:40.584 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:03:40.584 + tmp_file_2=/tmp/spdk_tgt_config.json.lWF 00:03:40.584 + ret=0 00:03:40.584 + /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:03:40.844 + /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:03:40.844 + diff -u /tmp/62.SN4 /tmp/spdk_tgt_config.json.lWF 00:03:40.844 + echo 'INFO: JSON config files are the same' 00:03:40.844 INFO: JSON config files are the same 00:03:40.844 + rm /tmp/62.SN4 /tmp/spdk_tgt_config.json.lWF 00:03:40.844 + exit 0 00:03:40.844 15:08:42 json_config -- json_config/json_config.sh@386 -- # [[ 0 -eq 1 ]] 00:03:41.104 15:08:42 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:03:41.104 INFO: changing configuration and checking if this can be detected... 
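Annotation (sketch, not part of the trace): the target configuration being saved and compared here was built earlier in the run over the same /var/tmp/spdk_tgt.sock socket. Stripped of the xtrace wrapping, and with the rpc.py path shortened for readability, the sequence was roughly:

    rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocForNvmf0
    rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 4 1024 --name MallocForNvmf1
    rpc.py -s /var/tmp/spdk_tgt.sock nvmf_create_transport -t rdma -u 8192 -c 0
    rpc.py -s /var/tmp/spdk_tgt.sock nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
    rpc.py -s /var/tmp/spdk_tgt.sock nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 MallocForNvmf0
    rpc.py -s /var/tmp/spdk_tgt.sock nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 MallocForNvmf1
    rpc.py -s /var/tmp/spdk_tgt.sock nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420

One extra bdev_malloc_create (MallocBdevForConfigChangeCheck) exists only so the second half of the test has something to delete.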
00:03:41.104 15:08:42 json_config -- json_config/json_config.sh@393 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:03:41.104 15:08:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:03:41.104 15:08:42 json_config -- json_config/json_config.sh@394 -- # tgt_rpc save_config 00:03:41.104 15:08:42 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:03:41.104 15:08:42 json_config -- json_config/json_config.sh@394 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:41.104 + '[' 2 -ne 2 ']' 00:03:41.104 +++ dirname /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_diff.sh 00:03:41.104 ++ readlink -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/../.. 00:03:41.104 + rootdir=/var/jenkins/workspace/nvmf-phy-autotest/spdk 00:03:41.104 +++ basename /dev/fd/62 00:03:41.104 ++ mktemp /tmp/62.XXX 00:03:41.104 + tmp_file_1=/tmp/62.7pe 00:03:41.104 +++ basename /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:41.104 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:03:41.104 + tmp_file_2=/tmp/spdk_tgt_config.json.vbA 00:03:41.104 + ret=0 00:03:41.104 + /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:03:41.673 + /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:03:41.673 + diff -u /tmp/62.7pe /tmp/spdk_tgt_config.json.vbA 00:03:41.673 + ret=1 00:03:41.673 + echo '=== Start of file: /tmp/62.7pe ===' 00:03:41.673 + cat /tmp/62.7pe 00:03:41.673 + echo '=== End of file: /tmp/62.7pe ===' 00:03:41.673 + echo '' 00:03:41.673 + echo '=== Start of file: /tmp/spdk_tgt_config.json.vbA ===' 00:03:41.673 + cat /tmp/spdk_tgt_config.json.vbA 00:03:41.673 + echo '=== End of file: /tmp/spdk_tgt_config.json.vbA ===' 00:03:41.673 + echo '' 00:03:41.673 + rm /tmp/62.7pe /tmp/spdk_tgt_config.json.vbA 00:03:41.673 + exit 1 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@398 -- # echo 'INFO: configuration change detected.' 00:03:41.673 INFO: configuration change detected. 
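Annotation (sketch, not part of the trace): both comparisons above go through json_diff.sh, which normalizes the live and on-disk configs before diffing them. Condensed, with temporary file names invented for the example:

    rpc.py -s /var/tmp/spdk_tgt.sock save_config | config_filter.py -method sort > /tmp/live.sorted.json
    config_filter.py -method sort < spdk_tgt_config.json > /tmp/disk.sorted.json
    if diff -u /tmp/live.sorted.json /tmp/disk.sorted.json; then
      echo 'INFO: JSON config files are the same'      # first run, before the malloc delete
    else
      echo 'INFO: configuration change detected.'      # second run, after bdev_malloc_delete
    fi

The only change between the two runs is the bdev_malloc_delete of MallocBdevForConfigChangeCheck traced just above, which is why the second diff is expected to return 1.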
00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@401 -- # json_config_test_fini 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@313 -- # timing_enter json_config_test_fini 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@314 -- # local ret=0 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@316 -- # [[ -n '' ]] 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@324 -- # [[ -n 1646740 ]] 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@327 -- # cleanup_bdev_subsystem_config 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@191 -- # timing_enter cleanup_bdev_subsystem_config 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@193 -- # [[ 0 -eq 1 ]] 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@200 -- # uname -s 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@200 -- # [[ Linux = Linux ]] 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@201 -- # rm -f /sample_aio 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@204 -- # [[ 0 -eq 1 ]] 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@208 -- # timing_exit cleanup_bdev_subsystem_config 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:41.673 15:08:43 json_config -- json_config/json_config.sh@330 -- # killprocess 1646740 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@950 -- # '[' -z 1646740 ']' 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@954 -- # kill -0 1646740 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@955 -- # uname 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1646740 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1646740' 00:03:41.673 killing process with pid 1646740 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@969 -- # kill 1646740 00:03:41.673 15:08:43 json_config -- common/autotest_common.sh@974 -- # wait 1646740 00:03:42.243 15:08:43 json_config -- json_config/json_config.sh@333 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/nvmf-phy-autotest/spdk/spdk_tgt_config.json 00:03:42.243 15:08:43 json_config -- json_config/json_config.sh@334 -- # timing_exit json_config_test_fini 00:03:42.243 15:08:43 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:42.243 15:08:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:42.243 15:08:43 json_config -- json_config/json_config.sh@335 -- # return 0 00:03:42.243 15:08:43 json_config -- json_config/json_config.sh@403 -- # echo 'INFO: Success' 00:03:42.243 INFO: Success 00:03:42.243 15:08:43 json_config -- 
json_config/json_config.sh@1 -- # nvmftestfini 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@331 -- # nvmfcleanup 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@99 -- # sync 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@101 -- # '[' '' == tcp ']' 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@101 -- # '[' '' == rdma ']' 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@332 -- # '[' -n '' ']' 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@338 -- # nvmf_fini 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@264 -- # local dev 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@267 -- # remove_target_ns 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:03:42.243 15:08:43 json_config -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 13> /dev/null' 00:03:42.243 15:08:43 json_config -- common/autotest_common.sh@22 -- # _remove_target_ns 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@268 -- # delete_main_bridge 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@130 -- # return 0 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@41 -- # _dev=0 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@41 -- # dev_map=() 00:03:42.243 15:08:43 json_config -- nvmf/setup.sh@284 -- # iptr 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@538 -- # iptables-save 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:03:42.243 15:08:43 json_config -- nvmf/common.sh@538 -- # iptables-restore 00:03:42.244 00:03:42.244 real 0m15.425s 00:03:42.244 user 0m18.875s 00:03:42.244 sys 0m7.832s 00:03:42.244 15:08:43 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:42.244 15:08:43 json_config -- common/autotest_common.sh@10 -- # set +x 00:03:42.244 ************************************ 00:03:42.244 END TEST json_config 00:03:42.244 ************************************ 00:03:42.244 15:08:43 -- 
spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:03:42.244 15:08:43 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:42.244 15:08:43 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:42.244 15:08:43 -- common/autotest_common.sh@10 -- # set +x 00:03:42.244 ************************************ 00:03:42.244 START TEST json_config_extra_key 00:03:42.244 ************************************ 00:03:42.244 15:08:44 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:03:42.244 15:08:44 json_config_extra_key -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1681 -- # lcov --version 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:42.504 15:08:44 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:42.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:42.504 --rc genhtml_branch_coverage=1 00:03:42.504 --rc genhtml_function_coverage=1 00:03:42.504 --rc genhtml_legend=1 00:03:42.504 --rc geninfo_all_blocks=1 00:03:42.504 --rc geninfo_unexecuted_blocks=1 00:03:42.504 00:03:42.504 ' 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:42.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:42.504 --rc genhtml_branch_coverage=1 00:03:42.504 --rc genhtml_function_coverage=1 00:03:42.504 --rc genhtml_legend=1 00:03:42.504 --rc geninfo_all_blocks=1 00:03:42.504 --rc geninfo_unexecuted_blocks=1 00:03:42.504 00:03:42.504 ' 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:42.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:42.504 --rc genhtml_branch_coverage=1 00:03:42.504 --rc genhtml_function_coverage=1 00:03:42.504 --rc genhtml_legend=1 00:03:42.504 --rc geninfo_all_blocks=1 00:03:42.504 --rc geninfo_unexecuted_blocks=1 00:03:42.504 00:03:42.504 ' 00:03:42.504 15:08:44 json_config_extra_key -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:42.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:42.504 --rc genhtml_branch_coverage=1 00:03:42.504 --rc genhtml_function_coverage=1 00:03:42.504 --rc genhtml_legend=1 00:03:42.504 --rc geninfo_all_blocks=1 00:03:42.504 --rc geninfo_unexecuted_blocks=1 00:03:42.504 00:03:42.504 ' 00:03:42.504 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:42.504 
15:08:44 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:42.504 15:08:44 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@19 -- # NET_TYPE=phy-fallback 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:03:42.505 15:08:44 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:03:42.505 15:08:44 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:42.505 15:08:44 json_config_extra_key -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:42.505 15:08:44 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:42.505 15:08:44 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:42.505 15:08:44 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:42.505 15:08:44 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:42.505 15:08:44 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:03:42.505 15:08:44 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@50 -- # : 0 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:03:42.505 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:03:42.505 15:08:44 json_config_extra_key -- nvmf/common.sh@54 -- # have_pci_nics=0 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/common.sh 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/extra_key.json') 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:03:42.505 INFO: launching applications... 
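Annotation (sketch, not part of the trace): the `lt 1.15 2` call traced above is scripts/common.sh's cmp_versions splitting both version strings on dots/dashes and walking the fields until they differ. A self-contained sketch of the same idea (the helper name is not the script's):

    lt_sketch() {                         # succeeds when version $1 sorts before $2
      local IFS=.- i
      local -a a b
      read -ra a <<< "$1"
      read -ra b <<< "$2"
      for (( i = 0; i < ${#a[@]} || i < ${#b[@]}; i++ )); do
        (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
        (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
      done
      return 1                            # equal versions are not "less than"
    }
    lt_sketch 1.15 2 && echo "lcov predates 2.x"   # the branch taken in the trace, selecting the old --rc option names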
00:03:42.505 15:08:44 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/extra_key.json 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1647238 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:03:42.505 Waiting for target to run... 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1647238 /var/tmp/spdk_tgt.sock 00:03:42.505 15:08:44 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 1647238 ']' 00:03:42.505 15:08:44 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/extra_key.json 00:03:42.505 15:08:44 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:03:42.505 15:08:44 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:42.505 15:08:44 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:03:42.505 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:03:42.505 15:08:44 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:42.505 15:08:44 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:03:42.505 [2024-09-27 15:08:44.277687] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:03:42.505 [2024-09-27 15:08:44.277756] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1647238 ] 00:03:42.765 [2024-09-27 15:08:44.585296] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:43.025 [2024-09-27 15:08:44.659625] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:43.285 15:08:45 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:43.285 15:08:45 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:03:43.285 00:03:43.285 15:08:45 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:03:43.285 INFO: shutting down applications... 
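Annotation (sketch, not part of the trace): json_config_test_shutdown_app, seen here and at the end of the earlier json_config run, is a SIGINT followed by a bounded liveness poll, using the same 30-iteration / 0.5 s budget visible in the trace:

    kill -SIGINT "$app_pid"                     # ask spdk_tgt to shut down cleanly
    for (( i = 0; i < 30; i++ )); do
      kill -0 "$app_pid" 2>/dev/null || break   # signal 0 only probes whether the pid still exists
      sleep 0.5
    done
    echo 'SPDK target shutdown done'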
00:03:43.285 15:08:45 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1647238 ]] 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1647238 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1647238 00:03:43.285 15:08:45 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1647238 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@43 -- # break 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:03:43.854 15:08:45 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:03:43.854 SPDK target shutdown done 00:03:43.854 15:08:45 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:03:43.854 Success 00:03:43.854 00:03:43.854 real 0m1.608s 00:03:43.854 user 0m1.402s 00:03:43.854 sys 0m0.464s 00:03:43.854 15:08:45 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:43.854 15:08:45 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:03:43.854 ************************************ 00:03:43.854 END TEST json_config_extra_key 00:03:43.854 ************************************ 00:03:43.854 15:08:45 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:03:43.854 15:08:45 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:43.854 15:08:45 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:43.854 15:08:45 -- common/autotest_common.sh@10 -- # set +x 00:03:44.116 ************************************ 00:03:44.116 START TEST alias_rpc 00:03:44.116 ************************************ 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:03:44.116 * Looking for test storage... 
00:03:44.116 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/alias_rpc 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@345 -- # : 1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:44.116 15:08:45 alias_rpc -- scripts/common.sh@368 -- # return 0 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:44.116 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:44.116 --rc genhtml_branch_coverage=1 00:03:44.116 --rc genhtml_function_coverage=1 00:03:44.116 --rc genhtml_legend=1 00:03:44.116 --rc geninfo_all_blocks=1 00:03:44.116 --rc geninfo_unexecuted_blocks=1 00:03:44.116 00:03:44.116 ' 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:44.116 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:44.116 --rc genhtml_branch_coverage=1 00:03:44.116 --rc genhtml_function_coverage=1 00:03:44.116 --rc genhtml_legend=1 00:03:44.116 --rc geninfo_all_blocks=1 00:03:44.116 --rc geninfo_unexecuted_blocks=1 00:03:44.116 00:03:44.116 ' 00:03:44.116 15:08:45 
alias_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:44.116 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:44.116 --rc genhtml_branch_coverage=1 00:03:44.116 --rc genhtml_function_coverage=1 00:03:44.116 --rc genhtml_legend=1 00:03:44.116 --rc geninfo_all_blocks=1 00:03:44.116 --rc geninfo_unexecuted_blocks=1 00:03:44.116 00:03:44.116 ' 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:44.116 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:44.116 --rc genhtml_branch_coverage=1 00:03:44.116 --rc genhtml_function_coverage=1 00:03:44.116 --rc genhtml_legend=1 00:03:44.116 --rc geninfo_all_blocks=1 00:03:44.116 --rc geninfo_unexecuted_blocks=1 00:03:44.116 00:03:44.116 ' 00:03:44.116 15:08:45 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:03:44.116 15:08:45 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:44.116 15:08:45 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1647490 00:03:44.116 15:08:45 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1647490 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 1647490 ']' 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:44.116 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:44.116 15:08:45 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:44.116 [2024-09-27 15:08:45.948974] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
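The lcov version probe traced in this stretch comes from scripts/common.sh: both version strings are split on '.', '-' and ':' and their components are compared as integers, left to right. A rough standalone equivalent, assuming purely numeric components (the function name and layout are mine, not the script's):
# Sketch: succeed (return 0) when version $1 is older than version $2.
version_lt() {
    local IFS=.-:
    local -a a b
    read -ra a <<< "$1"
    read -ra b <<< "$2"
    local n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
    for ((v = 0; v < n; v++)); do
        local x=${a[v]:-0} y=${b[v]:-0}
        (( x > y )) && return 1
        (( x < y )) && return 0
    done
    return 1    # equal versions are not "less than"
}
version_lt 1.15 2 && echo 'old lcov detected'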
00:03:44.116 [2024-09-27 15:08:45.949037] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1647490 ] 00:03:44.377 [2024-09-27 15:08:46.036638] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:44.377 [2024-09-27 15:08:46.117885] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:45.316 15:08:46 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:45.316 15:08:46 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:03:45.316 15:08:46 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py load_config -i 00:03:45.316 15:08:47 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1647490 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 1647490 ']' 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 1647490 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1647490 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1647490' 00:03:45.316 killing process with pid 1647490 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@969 -- # kill 1647490 00:03:45.316 15:08:47 alias_rpc -- common/autotest_common.sh@974 -- # wait 1647490 00:03:45.886 00:03:45.886 real 0m1.740s 00:03:45.886 user 0m1.840s 00:03:45.886 sys 0m0.531s 00:03:45.886 15:08:47 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:45.886 15:08:47 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:03:45.886 ************************************ 00:03:45.886 END TEST alias_rpc 00:03:45.886 ************************************ 00:03:45.886 15:08:47 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:03:45.886 15:08:47 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/spdkcli/tcp.sh 00:03:45.886 15:08:47 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:45.886 15:08:47 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:45.886 15:08:47 -- common/autotest_common.sh@10 -- # set +x 00:03:45.886 ************************************ 00:03:45.886 START TEST spdkcli_tcp 00:03:45.886 ************************************ 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/spdkcli/tcp.sh 00:03:45.886 * Looking for test storage... 
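killprocess, traced at the end of the alias_rpc run above, refuses an empty pid, confirms the process still exists, checks its command name with ps so it never signals a sudo wrapper, then kills it and waits. A condensed sketch of that flow (simplified, not the autotest_common.sh helper verbatim):
# Sketch: kill a test process by pid and reap it.
killprocess_sketch() {
    local pid=$1
    [ -n "$pid" ] || return 1                  # nothing to kill
    kill -0 "$pid" 2>/dev/null || return 0     # already gone
    local name
    name=$(ps --no-headers -o comm= "$pid")
    [ "$name" = sudo ] && return 1             # never signal a sudo wrapper directly
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" 2>/dev/null || true            # wait only reaps children, as in the tests
}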
00:03:45.886 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/spdkcli 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lcov --version 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:45.886 15:08:47 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:45.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.886 --rc genhtml_branch_coverage=1 00:03:45.886 --rc genhtml_function_coverage=1 00:03:45.886 --rc genhtml_legend=1 00:03:45.886 --rc geninfo_all_blocks=1 00:03:45.886 --rc geninfo_unexecuted_blocks=1 00:03:45.886 00:03:45.886 ' 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:45.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.886 --rc genhtml_branch_coverage=1 00:03:45.886 --rc genhtml_function_coverage=1 00:03:45.886 --rc genhtml_legend=1 00:03:45.886 --rc geninfo_all_blocks=1 00:03:45.886 --rc geninfo_unexecuted_blocks=1 
00:03:45.886 00:03:45.886 ' 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:45.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.886 --rc genhtml_branch_coverage=1 00:03:45.886 --rc genhtml_function_coverage=1 00:03:45.886 --rc genhtml_legend=1 00:03:45.886 --rc geninfo_all_blocks=1 00:03:45.886 --rc geninfo_unexecuted_blocks=1 00:03:45.886 00:03:45.886 ' 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:45.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.886 --rc genhtml_branch_coverage=1 00:03:45.886 --rc genhtml_function_coverage=1 00:03:45.886 --rc genhtml_legend=1 00:03:45.886 --rc geninfo_all_blocks=1 00:03:45.886 --rc geninfo_unexecuted_blocks=1 00:03:45.886 00:03:45.886 ' 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/spdkcli/common.sh 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/json_config/clear_config.py 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:45.886 15:08:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:03:45.886 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1647904 00:03:46.145 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1647904 00:03:46.145 15:08:47 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:03:46.145 15:08:47 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 1647904 ']' 00:03:46.145 15:08:47 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:46.145 15:08:47 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:46.145 15:08:47 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:46.145 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:46.145 15:08:47 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:46.145 15:08:47 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:03:46.145 [2024-09-27 15:08:47.787446] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
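For the spdkcli_tcp case the target is started with a two-core mask (-m 0x3), and the suite blocks in waitforlisten until the RPC socket answers; the two "Reactor started on core" notices that follow are the visible effect of that mask. A hedged sketch of the launch-and-wait step (the polling loop is an approximation, not the real waitforlisten helper):
# Sketch: start spdk_tgt on cores 0-1 and wait for its RPC socket to respond.
SPDK_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk       # path used by this job
"$SPDK_DIR/build/bin/spdk_tgt" -m 0x3 -p 0 &
spdk_tgt_pid=$!
for ((i = 0; i < 100; i++)); do
    if "$SPDK_DIR/scripts/rpc.py" -t 1 -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null; then
        break
    fi
    echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
    sleep 0.5
done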
00:03:46.145 [2024-09-27 15:08:47.787512] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1647904 ] 00:03:46.145 [2024-09-27 15:08:47.872874] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:03:46.145 [2024-09-27 15:08:47.962720] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:03:46.145 [2024-09-27 15:08:47.962721] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:47.085 15:08:48 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:47.085 15:08:48 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:03:47.085 15:08:48 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1647921 00:03:47.085 15:08:48 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:03:47.085 15:08:48 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:03:47.085 [ 00:03:47.085 "bdev_malloc_delete", 00:03:47.085 "bdev_malloc_create", 00:03:47.085 "bdev_null_resize", 00:03:47.085 "bdev_null_delete", 00:03:47.085 "bdev_null_create", 00:03:47.085 "bdev_nvme_cuse_unregister", 00:03:47.085 "bdev_nvme_cuse_register", 00:03:47.085 "bdev_opal_new_user", 00:03:47.085 "bdev_opal_set_lock_state", 00:03:47.085 "bdev_opal_delete", 00:03:47.085 "bdev_opal_get_info", 00:03:47.085 "bdev_opal_create", 00:03:47.085 "bdev_nvme_opal_revert", 00:03:47.085 "bdev_nvme_opal_init", 00:03:47.085 "bdev_nvme_send_cmd", 00:03:47.085 "bdev_nvme_set_keys", 00:03:47.085 "bdev_nvme_get_path_iostat", 00:03:47.085 "bdev_nvme_get_mdns_discovery_info", 00:03:47.085 "bdev_nvme_stop_mdns_discovery", 00:03:47.085 "bdev_nvme_start_mdns_discovery", 00:03:47.085 "bdev_nvme_set_multipath_policy", 00:03:47.085 "bdev_nvme_set_preferred_path", 00:03:47.085 "bdev_nvme_get_io_paths", 00:03:47.085 "bdev_nvme_remove_error_injection", 00:03:47.085 "bdev_nvme_add_error_injection", 00:03:47.085 "bdev_nvme_get_discovery_info", 00:03:47.085 "bdev_nvme_stop_discovery", 00:03:47.085 "bdev_nvme_start_discovery", 00:03:47.085 "bdev_nvme_get_controller_health_info", 00:03:47.085 "bdev_nvme_disable_controller", 00:03:47.085 "bdev_nvme_enable_controller", 00:03:47.085 "bdev_nvme_reset_controller", 00:03:47.085 "bdev_nvme_get_transport_statistics", 00:03:47.085 "bdev_nvme_apply_firmware", 00:03:47.085 "bdev_nvme_detach_controller", 00:03:47.085 "bdev_nvme_get_controllers", 00:03:47.085 "bdev_nvme_attach_controller", 00:03:47.085 "bdev_nvme_set_hotplug", 00:03:47.085 "bdev_nvme_set_options", 00:03:47.085 "bdev_passthru_delete", 00:03:47.085 "bdev_passthru_create", 00:03:47.085 "bdev_lvol_set_parent_bdev", 00:03:47.085 "bdev_lvol_set_parent", 00:03:47.085 "bdev_lvol_check_shallow_copy", 00:03:47.085 "bdev_lvol_start_shallow_copy", 00:03:47.085 "bdev_lvol_grow_lvstore", 00:03:47.085 "bdev_lvol_get_lvols", 00:03:47.085 "bdev_lvol_get_lvstores", 00:03:47.085 "bdev_lvol_delete", 00:03:47.085 "bdev_lvol_set_read_only", 00:03:47.085 "bdev_lvol_resize", 00:03:47.085 "bdev_lvol_decouple_parent", 00:03:47.085 "bdev_lvol_inflate", 00:03:47.085 "bdev_lvol_rename", 00:03:47.085 "bdev_lvol_clone_bdev", 00:03:47.085 "bdev_lvol_clone", 00:03:47.085 "bdev_lvol_snapshot", 00:03:47.085 "bdev_lvol_create", 00:03:47.085 "bdev_lvol_delete_lvstore", 00:03:47.085 "bdev_lvol_rename_lvstore", 
00:03:47.085 "bdev_lvol_create_lvstore", 00:03:47.085 "bdev_raid_set_options", 00:03:47.085 "bdev_raid_remove_base_bdev", 00:03:47.085 "bdev_raid_add_base_bdev", 00:03:47.085 "bdev_raid_delete", 00:03:47.085 "bdev_raid_create", 00:03:47.085 "bdev_raid_get_bdevs", 00:03:47.085 "bdev_error_inject_error", 00:03:47.085 "bdev_error_delete", 00:03:47.085 "bdev_error_create", 00:03:47.085 "bdev_split_delete", 00:03:47.085 "bdev_split_create", 00:03:47.085 "bdev_delay_delete", 00:03:47.085 "bdev_delay_create", 00:03:47.085 "bdev_delay_update_latency", 00:03:47.085 "bdev_zone_block_delete", 00:03:47.085 "bdev_zone_block_create", 00:03:47.085 "blobfs_create", 00:03:47.085 "blobfs_detect", 00:03:47.085 "blobfs_set_cache_size", 00:03:47.085 "bdev_aio_delete", 00:03:47.085 "bdev_aio_rescan", 00:03:47.085 "bdev_aio_create", 00:03:47.085 "bdev_ftl_set_property", 00:03:47.085 "bdev_ftl_get_properties", 00:03:47.085 "bdev_ftl_get_stats", 00:03:47.085 "bdev_ftl_unmap", 00:03:47.085 "bdev_ftl_unload", 00:03:47.085 "bdev_ftl_delete", 00:03:47.085 "bdev_ftl_load", 00:03:47.085 "bdev_ftl_create", 00:03:47.085 "bdev_virtio_attach_controller", 00:03:47.085 "bdev_virtio_scsi_get_devices", 00:03:47.085 "bdev_virtio_detach_controller", 00:03:47.085 "bdev_virtio_blk_set_hotplug", 00:03:47.085 "bdev_iscsi_delete", 00:03:47.085 "bdev_iscsi_create", 00:03:47.085 "bdev_iscsi_set_options", 00:03:47.085 "accel_error_inject_error", 00:03:47.085 "ioat_scan_accel_module", 00:03:47.085 "dsa_scan_accel_module", 00:03:47.085 "iaa_scan_accel_module", 00:03:47.085 "keyring_file_remove_key", 00:03:47.085 "keyring_file_add_key", 00:03:47.085 "keyring_linux_set_options", 00:03:47.085 "fsdev_aio_delete", 00:03:47.085 "fsdev_aio_create", 00:03:47.085 "iscsi_get_histogram", 00:03:47.085 "iscsi_enable_histogram", 00:03:47.085 "iscsi_set_options", 00:03:47.085 "iscsi_get_auth_groups", 00:03:47.085 "iscsi_auth_group_remove_secret", 00:03:47.085 "iscsi_auth_group_add_secret", 00:03:47.085 "iscsi_delete_auth_group", 00:03:47.085 "iscsi_create_auth_group", 00:03:47.085 "iscsi_set_discovery_auth", 00:03:47.085 "iscsi_get_options", 00:03:47.085 "iscsi_target_node_request_logout", 00:03:47.085 "iscsi_target_node_set_redirect", 00:03:47.085 "iscsi_target_node_set_auth", 00:03:47.085 "iscsi_target_node_add_lun", 00:03:47.085 "iscsi_get_stats", 00:03:47.086 "iscsi_get_connections", 00:03:47.086 "iscsi_portal_group_set_auth", 00:03:47.086 "iscsi_start_portal_group", 00:03:47.086 "iscsi_delete_portal_group", 00:03:47.086 "iscsi_create_portal_group", 00:03:47.086 "iscsi_get_portal_groups", 00:03:47.086 "iscsi_delete_target_node", 00:03:47.086 "iscsi_target_node_remove_pg_ig_maps", 00:03:47.086 "iscsi_target_node_add_pg_ig_maps", 00:03:47.086 "iscsi_create_target_node", 00:03:47.086 "iscsi_get_target_nodes", 00:03:47.086 "iscsi_delete_initiator_group", 00:03:47.086 "iscsi_initiator_group_remove_initiators", 00:03:47.086 "iscsi_initiator_group_add_initiators", 00:03:47.086 "iscsi_create_initiator_group", 00:03:47.086 "iscsi_get_initiator_groups", 00:03:47.086 "nvmf_set_crdt", 00:03:47.086 "nvmf_set_config", 00:03:47.086 "nvmf_set_max_subsystems", 00:03:47.086 "nvmf_stop_mdns_prr", 00:03:47.086 "nvmf_publish_mdns_prr", 00:03:47.086 "nvmf_subsystem_get_listeners", 00:03:47.086 "nvmf_subsystem_get_qpairs", 00:03:47.086 "nvmf_subsystem_get_controllers", 00:03:47.086 "nvmf_get_stats", 00:03:47.086 "nvmf_get_transports", 00:03:47.086 "nvmf_create_transport", 00:03:47.086 "nvmf_get_targets", 00:03:47.086 "nvmf_delete_target", 00:03:47.086 "nvmf_create_target", 
00:03:47.086 "nvmf_subsystem_allow_any_host", 00:03:47.086 "nvmf_subsystem_set_keys", 00:03:47.086 "nvmf_subsystem_remove_host", 00:03:47.086 "nvmf_subsystem_add_host", 00:03:47.086 "nvmf_ns_remove_host", 00:03:47.086 "nvmf_ns_add_host", 00:03:47.086 "nvmf_subsystem_remove_ns", 00:03:47.086 "nvmf_subsystem_set_ns_ana_group", 00:03:47.086 "nvmf_subsystem_add_ns", 00:03:47.086 "nvmf_subsystem_listener_set_ana_state", 00:03:47.086 "nvmf_discovery_get_referrals", 00:03:47.086 "nvmf_discovery_remove_referral", 00:03:47.086 "nvmf_discovery_add_referral", 00:03:47.086 "nvmf_subsystem_remove_listener", 00:03:47.086 "nvmf_subsystem_add_listener", 00:03:47.086 "nvmf_delete_subsystem", 00:03:47.086 "nvmf_create_subsystem", 00:03:47.086 "nvmf_get_subsystems", 00:03:47.086 "env_dpdk_get_mem_stats", 00:03:47.086 "nbd_get_disks", 00:03:47.086 "nbd_stop_disk", 00:03:47.086 "nbd_start_disk", 00:03:47.086 "ublk_recover_disk", 00:03:47.086 "ublk_get_disks", 00:03:47.086 "ublk_stop_disk", 00:03:47.086 "ublk_start_disk", 00:03:47.086 "ublk_destroy_target", 00:03:47.086 "ublk_create_target", 00:03:47.086 "virtio_blk_create_transport", 00:03:47.086 "virtio_blk_get_transports", 00:03:47.086 "vhost_controller_set_coalescing", 00:03:47.086 "vhost_get_controllers", 00:03:47.086 "vhost_delete_controller", 00:03:47.086 "vhost_create_blk_controller", 00:03:47.086 "vhost_scsi_controller_remove_target", 00:03:47.086 "vhost_scsi_controller_add_target", 00:03:47.086 "vhost_start_scsi_controller", 00:03:47.086 "vhost_create_scsi_controller", 00:03:47.086 "thread_set_cpumask", 00:03:47.086 "scheduler_set_options", 00:03:47.086 "framework_get_governor", 00:03:47.086 "framework_get_scheduler", 00:03:47.086 "framework_set_scheduler", 00:03:47.086 "framework_get_reactors", 00:03:47.086 "thread_get_io_channels", 00:03:47.086 "thread_get_pollers", 00:03:47.086 "thread_get_stats", 00:03:47.086 "framework_monitor_context_switch", 00:03:47.086 "spdk_kill_instance", 00:03:47.086 "log_enable_timestamps", 00:03:47.086 "log_get_flags", 00:03:47.086 "log_clear_flag", 00:03:47.086 "log_set_flag", 00:03:47.086 "log_get_level", 00:03:47.086 "log_set_level", 00:03:47.086 "log_get_print_level", 00:03:47.086 "log_set_print_level", 00:03:47.086 "framework_enable_cpumask_locks", 00:03:47.086 "framework_disable_cpumask_locks", 00:03:47.086 "framework_wait_init", 00:03:47.086 "framework_start_init", 00:03:47.086 "scsi_get_devices", 00:03:47.086 "bdev_get_histogram", 00:03:47.086 "bdev_enable_histogram", 00:03:47.086 "bdev_set_qos_limit", 00:03:47.086 "bdev_set_qd_sampling_period", 00:03:47.086 "bdev_get_bdevs", 00:03:47.086 "bdev_reset_iostat", 00:03:47.086 "bdev_get_iostat", 00:03:47.086 "bdev_examine", 00:03:47.086 "bdev_wait_for_examine", 00:03:47.086 "bdev_set_options", 00:03:47.086 "accel_get_stats", 00:03:47.086 "accel_set_options", 00:03:47.086 "accel_set_driver", 00:03:47.086 "accel_crypto_key_destroy", 00:03:47.086 "accel_crypto_keys_get", 00:03:47.086 "accel_crypto_key_create", 00:03:47.086 "accel_assign_opc", 00:03:47.086 "accel_get_module_info", 00:03:47.086 "accel_get_opc_assignments", 00:03:47.086 "vmd_rescan", 00:03:47.086 "vmd_remove_device", 00:03:47.086 "vmd_enable", 00:03:47.086 "sock_get_default_impl", 00:03:47.086 "sock_set_default_impl", 00:03:47.086 "sock_impl_set_options", 00:03:47.086 "sock_impl_get_options", 00:03:47.086 "iobuf_get_stats", 00:03:47.086 "iobuf_set_options", 00:03:47.086 "keyring_get_keys", 00:03:47.086 "framework_get_pci_devices", 00:03:47.086 "framework_get_config", 00:03:47.086 "framework_get_subsystems", 
00:03:47.086 "fsdev_set_opts", 00:03:47.086 "fsdev_get_opts", 00:03:47.086 "trace_get_info", 00:03:47.086 "trace_get_tpoint_group_mask", 00:03:47.086 "trace_disable_tpoint_group", 00:03:47.086 "trace_enable_tpoint_group", 00:03:47.086 "trace_clear_tpoint_mask", 00:03:47.086 "trace_set_tpoint_mask", 00:03:47.086 "notify_get_notifications", 00:03:47.086 "notify_get_types", 00:03:47.086 "spdk_get_version", 00:03:47.086 "rpc_get_methods" 00:03:47.086 ] 00:03:47.086 15:08:48 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:03:47.086 15:08:48 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:03:47.086 15:08:48 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1647904 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 1647904 ']' 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 1647904 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:47.086 15:08:48 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1647904 00:03:47.346 15:08:48 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:47.346 15:08:48 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:47.346 15:08:48 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1647904' 00:03:47.346 killing process with pid 1647904 00:03:47.346 15:08:48 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 1647904 00:03:47.346 15:08:48 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 1647904 00:03:47.606 00:03:47.606 real 0m1.789s 00:03:47.606 user 0m3.183s 00:03:47.606 sys 0m0.578s 00:03:47.606 15:08:49 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:47.606 15:08:49 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:03:47.606 ************************************ 00:03:47.606 END TEST spdkcli_tcp 00:03:47.606 ************************************ 00:03:47.606 15:08:49 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:03:47.606 15:08:49 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:47.606 15:08:49 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:47.606 15:08:49 -- common/autotest_common.sh@10 -- # set +x 00:03:47.606 ************************************ 00:03:47.606 START TEST dpdk_mem_utility 00:03:47.606 ************************************ 00:03:47.606 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:03:47.866 * Looking for test storage... 
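The long rpc_get_methods listing above was produced over TCP rather than the default UNIX socket: a socat process bridges TCP port 9998 to /var/tmp/spdk.sock and rpc.py is pointed at 127.0.0.1:9998 with retries and a timeout. The same two commands, pulled out of the trace and shown standalone:
# Sketch: expose the SPDK RPC UNIX socket on TCP 9998 and query it.
socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
socat_pid=$!
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py \
    -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
kill "$socat_pid" 2>/dev/null || true      # clean up the bridge if it is still running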
00:03:47.866 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/dpdk_memory_utility 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lcov --version 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:47.866 15:08:49 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:47.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.866 --rc genhtml_branch_coverage=1 00:03:47.866 --rc genhtml_function_coverage=1 00:03:47.866 --rc genhtml_legend=1 00:03:47.866 --rc geninfo_all_blocks=1 00:03:47.866 --rc geninfo_unexecuted_blocks=1 00:03:47.866 00:03:47.866 ' 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:47.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.866 --rc 
genhtml_branch_coverage=1 00:03:47.866 --rc genhtml_function_coverage=1 00:03:47.866 --rc genhtml_legend=1 00:03:47.866 --rc geninfo_all_blocks=1 00:03:47.866 --rc geninfo_unexecuted_blocks=1 00:03:47.866 00:03:47.866 ' 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:47.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.866 --rc genhtml_branch_coverage=1 00:03:47.866 --rc genhtml_function_coverage=1 00:03:47.866 --rc genhtml_legend=1 00:03:47.866 --rc geninfo_all_blocks=1 00:03:47.866 --rc geninfo_unexecuted_blocks=1 00:03:47.866 00:03:47.866 ' 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:47.866 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:47.866 --rc genhtml_branch_coverage=1 00:03:47.866 --rc genhtml_function_coverage=1 00:03:47.866 --rc genhtml_legend=1 00:03:47.866 --rc geninfo_all_blocks=1 00:03:47.866 --rc geninfo_unexecuted_blocks=1 00:03:47.866 00:03:47.866 ' 00:03:47.866 15:08:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:03:47.866 15:08:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1648163 00:03:47.866 15:08:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1648163 00:03:47.866 15:08:49 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 1648163 ']' 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:47.866 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:47.866 15:08:49 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:03:47.866 [2024-09-27 15:08:49.655880] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
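The dpdk_mem_utility test that follows asks the freshly started target for a DPDK allocator dump via the env_dpdk_get_mem_stats RPC (the reply names /tmp/spdk_mem_dump.txt) and then summarizes it with scripts/dpdk_mem_info.py; the heap, mempool and memzone listing below is that summary. Roughly the calls the script issues:
# Sketch: request a DPDK memory dump over RPC and summarize it.
SPDK_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk
"$SPDK_DIR/scripts/rpc.py" env_dpdk_get_mem_stats    # target writes /tmp/spdk_mem_dump.txt
"$SPDK_DIR/scripts/dpdk_mem_info.py"                 # overall heap/mempool/memzone summary
"$SPDK_DIR/scripts/dpdk_mem_info.py" -m 0            # per-element detail, flag as used in the trace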
00:03:47.866 [2024-09-27 15:08:49.655940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648163 ] 00:03:48.125 [2024-09-27 15:08:49.740249] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:48.125 [2024-09-27 15:08:49.821815] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:48.694 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:48.694 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:03:48.694 15:08:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:03:48.694 15:08:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:03:48.694 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:48.694 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:03:48.694 { 00:03:48.694 "filename": "/tmp/spdk_mem_dump.txt" 00:03:48.694 } 00:03:48.694 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:48.694 15:08:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:03:48.955 DPDK memory size 860.000000 MiB in 1 heap(s) 00:03:48.955 1 heaps totaling size 860.000000 MiB 00:03:48.955 size: 860.000000 MiB heap id: 0 00:03:48.955 end heaps---------- 00:03:48.955 9 mempools totaling size 642.649841 MiB 00:03:48.955 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:03:48.955 size: 158.602051 MiB name: PDU_data_out_Pool 00:03:48.955 size: 92.545471 MiB name: bdev_io_1648163 00:03:48.955 size: 51.011292 MiB name: evtpool_1648163 00:03:48.955 size: 50.003479 MiB name: msgpool_1648163 00:03:48.955 size: 36.509338 MiB name: fsdev_io_1648163 00:03:48.955 size: 21.763794 MiB name: PDU_Pool 00:03:48.955 size: 19.513306 MiB name: SCSI_TASK_Pool 00:03:48.955 size: 0.026123 MiB name: Session_Pool 00:03:48.955 end mempools------- 00:03:48.955 6 memzones totaling size 4.142822 MiB 00:03:48.955 size: 1.000366 MiB name: RG_ring_0_1648163 00:03:48.955 size: 1.000366 MiB name: RG_ring_1_1648163 00:03:48.955 size: 1.000366 MiB name: RG_ring_4_1648163 00:03:48.955 size: 1.000366 MiB name: RG_ring_5_1648163 00:03:48.955 size: 0.125366 MiB name: RG_ring_2_1648163 00:03:48.955 size: 0.015991 MiB name: RG_ring_3_1648163 00:03:48.955 end memzones------- 00:03:48.955 15:08:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:03:48.955 heap id: 0 total size: 860.000000 MiB number of busy elements: 44 number of free elements: 16 00:03:48.955 list of free elements. 
size: 13.984680 MiB 00:03:48.955 element at address: 0x200000400000 with size: 1.999512 MiB 00:03:48.955 element at address: 0x200000800000 with size: 1.996948 MiB 00:03:48.955 element at address: 0x20001bc00000 with size: 0.999878 MiB 00:03:48.955 element at address: 0x20001be00000 with size: 0.999878 MiB 00:03:48.955 element at address: 0x200034a00000 with size: 0.994446 MiB 00:03:48.955 element at address: 0x200009600000 with size: 0.959839 MiB 00:03:48.955 element at address: 0x200015e00000 with size: 0.954285 MiB 00:03:48.955 element at address: 0x20001c000000 with size: 0.936584 MiB 00:03:48.955 element at address: 0x200000200000 with size: 0.841614 MiB 00:03:48.955 element at address: 0x20001d800000 with size: 0.582886 MiB 00:03:48.955 element at address: 0x200003e00000 with size: 0.495605 MiB 00:03:48.955 element at address: 0x20000d800000 with size: 0.490723 MiB 00:03:48.955 element at address: 0x20001c200000 with size: 0.485657 MiB 00:03:48.955 element at address: 0x200007000000 with size: 0.481934 MiB 00:03:48.955 element at address: 0x20002ac00000 with size: 0.410034 MiB 00:03:48.955 element at address: 0x200003a00000 with size: 0.354858 MiB 00:03:48.955 list of standard malloc elements. size: 199.218628 MiB 00:03:48.955 element at address: 0x20000d9fff80 with size: 132.000122 MiB 00:03:48.955 element at address: 0x2000097fff80 with size: 64.000122 MiB 00:03:48.955 element at address: 0x20001bcfff80 with size: 1.000122 MiB 00:03:48.955 element at address: 0x20001befff80 with size: 1.000122 MiB 00:03:48.955 element at address: 0x20001c0fff80 with size: 1.000122 MiB 00:03:48.955 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:03:48.955 element at address: 0x20001c0eff00 with size: 0.062622 MiB 00:03:48.955 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:03:48.955 element at address: 0x20001c0efdc0 with size: 0.000305 MiB 00:03:48.955 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003a5ad80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003a5af80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003a5f240 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003a7f500 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003a7f5c0 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003aff880 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003affa80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003affb40 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003e7ee00 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20000707b600 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20000707b6c0 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000070fb980 with size: 0.000183 MiB 00:03:48.955 element at address: 0x2000096fdd80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20000d87da00 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20000d87dac0 with size: 0.000183 MiB 
00:03:48.955 element at address: 0x20000d8fdd80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x200015ef44c0 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20001c0efc40 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20001c0efd00 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20001c2bc740 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20001d895380 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20001d895440 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20002ac68f80 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20002ac69040 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20002ac6fc40 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20002ac6fe40 with size: 0.000183 MiB 00:03:48.955 element at address: 0x20002ac6ff00 with size: 0.000183 MiB 00:03:48.955 list of memzone associated elements. size: 646.796692 MiB 00:03:48.955 element at address: 0x20001d895500 with size: 211.416748 MiB 00:03:48.955 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:03:48.955 element at address: 0x20002ac6ffc0 with size: 157.562561 MiB 00:03:48.955 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:03:48.955 element at address: 0x200015ff4780 with size: 92.045044 MiB 00:03:48.955 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_1648163_0 00:03:48.955 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:03:48.955 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1648163_0 00:03:48.955 element at address: 0x200003fff380 with size: 48.003052 MiB 00:03:48.955 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1648163_0 00:03:48.955 element at address: 0x2000071fdb80 with size: 36.008911 MiB 00:03:48.955 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_1648163_0 00:03:48.955 element at address: 0x20001c3be940 with size: 20.255554 MiB 00:03:48.955 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:03:48.955 element at address: 0x200034bfeb40 with size: 18.005066 MiB 00:03:48.955 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:03:48.955 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:03:48.955 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1648163 00:03:48.955 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:03:48.955 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1648163 00:03:48.955 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:03:48.955 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1648163 00:03:48.955 element at address: 0x20000d8fde40 with size: 1.008118 MiB 00:03:48.955 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:03:48.955 element at address: 0x20001c2bc800 with size: 1.008118 MiB 00:03:48.955 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:03:48.955 element at address: 0x2000096fde40 with size: 1.008118 MiB 00:03:48.955 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:03:48.955 element at address: 0x2000070fba40 with size: 1.008118 MiB 00:03:48.955 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:03:48.955 element at address: 0x200003eff180 with size: 1.000488 MiB 00:03:48.955 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1648163 00:03:48.955 element at address: 0x200003affc00 with size: 1.000488 MiB 00:03:48.955 associated memzone info: 
size: 1.000366 MiB name: RG_ring_1_1648163 00:03:48.955 element at address: 0x200015ef4580 with size: 1.000488 MiB 00:03:48.955 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1648163 00:03:48.955 element at address: 0x200034afe940 with size: 1.000488 MiB 00:03:48.955 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1648163 00:03:48.955 element at address: 0x200003a7f680 with size: 0.500488 MiB 00:03:48.955 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_1648163 00:03:48.955 element at address: 0x200003e7eec0 with size: 0.500488 MiB 00:03:48.955 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1648163 00:03:48.955 element at address: 0x20000d87db80 with size: 0.500488 MiB 00:03:48.955 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:03:48.955 element at address: 0x20000707b780 with size: 0.500488 MiB 00:03:48.955 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:03:48.955 element at address: 0x20001c27c540 with size: 0.250488 MiB 00:03:48.955 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:03:48.955 element at address: 0x200003a5f300 with size: 0.125488 MiB 00:03:48.955 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1648163 00:03:48.955 element at address: 0x2000096f5b80 with size: 0.031738 MiB 00:03:48.955 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:03:48.955 element at address: 0x20002ac69100 with size: 0.023743 MiB 00:03:48.955 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:03:48.955 element at address: 0x200003a5b040 with size: 0.016113 MiB 00:03:48.955 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1648163 00:03:48.955 element at address: 0x20002ac6f240 with size: 0.002441 MiB 00:03:48.955 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:03:48.955 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:03:48.955 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1648163 00:03:48.956 element at address: 0x200003aff940 with size: 0.000305 MiB 00:03:48.956 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_1648163 00:03:48.956 element at address: 0x200003a5ae40 with size: 0.000305 MiB 00:03:48.956 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1648163 00:03:48.956 element at address: 0x20002ac6fd00 with size: 0.000305 MiB 00:03:48.956 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:03:48.956 15:08:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:03:48.956 15:08:50 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1648163 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 1648163 ']' 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 1648163 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1648163 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1648163' 
00:03:48.956 killing process with pid 1648163 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 1648163 00:03:48.956 15:08:50 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 1648163 00:03:49.215 00:03:49.215 real 0m1.641s 00:03:49.215 user 0m1.658s 00:03:49.215 sys 0m0.530s 00:03:49.215 15:08:51 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:49.215 15:08:51 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:03:49.215 ************************************ 00:03:49.215 END TEST dpdk_mem_utility 00:03:49.215 ************************************ 00:03:49.475 15:08:51 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/event.sh 00:03:49.475 15:08:51 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:49.475 15:08:51 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:49.475 15:08:51 -- common/autotest_common.sh@10 -- # set +x 00:03:49.475 ************************************ 00:03:49.475 START TEST event 00:03:49.475 ************************************ 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/event.sh 00:03:49.475 * Looking for test storage... 00:03:49.475 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1681 -- # lcov --version 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:49.475 15:08:51 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:49.475 15:08:51 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:49.475 15:08:51 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:49.475 15:08:51 event -- scripts/common.sh@336 -- # IFS=.-: 00:03:49.475 15:08:51 event -- scripts/common.sh@336 -- # read -ra ver1 00:03:49.475 15:08:51 event -- scripts/common.sh@337 -- # IFS=.-: 00:03:49.475 15:08:51 event -- scripts/common.sh@337 -- # read -ra ver2 00:03:49.475 15:08:51 event -- scripts/common.sh@338 -- # local 'op=<' 00:03:49.475 15:08:51 event -- scripts/common.sh@340 -- # ver1_l=2 00:03:49.475 15:08:51 event -- scripts/common.sh@341 -- # ver2_l=1 00:03:49.475 15:08:51 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:49.475 15:08:51 event -- scripts/common.sh@344 -- # case "$op" in 00:03:49.475 15:08:51 event -- scripts/common.sh@345 -- # : 1 00:03:49.475 15:08:51 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:49.475 15:08:51 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:49.475 15:08:51 event -- scripts/common.sh@365 -- # decimal 1 00:03:49.475 15:08:51 event -- scripts/common.sh@353 -- # local d=1 00:03:49.475 15:08:51 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:49.475 15:08:51 event -- scripts/common.sh@355 -- # echo 1 00:03:49.475 15:08:51 event -- scripts/common.sh@365 -- # ver1[v]=1 00:03:49.475 15:08:51 event -- scripts/common.sh@366 -- # decimal 2 00:03:49.475 15:08:51 event -- scripts/common.sh@353 -- # local d=2 00:03:49.475 15:08:51 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:49.475 15:08:51 event -- scripts/common.sh@355 -- # echo 2 00:03:49.475 15:08:51 event -- scripts/common.sh@366 -- # ver2[v]=2 00:03:49.475 15:08:51 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:49.475 15:08:51 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:49.475 15:08:51 event -- scripts/common.sh@368 -- # return 0 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:49.475 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:49.475 --rc genhtml_branch_coverage=1 00:03:49.475 --rc genhtml_function_coverage=1 00:03:49.475 --rc genhtml_legend=1 00:03:49.475 --rc geninfo_all_blocks=1 00:03:49.475 --rc geninfo_unexecuted_blocks=1 00:03:49.475 00:03:49.475 ' 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:49.475 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:49.475 --rc genhtml_branch_coverage=1 00:03:49.475 --rc genhtml_function_coverage=1 00:03:49.475 --rc genhtml_legend=1 00:03:49.475 --rc geninfo_all_blocks=1 00:03:49.475 --rc geninfo_unexecuted_blocks=1 00:03:49.475 00:03:49.475 ' 00:03:49.475 15:08:51 event -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:49.476 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:49.476 --rc genhtml_branch_coverage=1 00:03:49.476 --rc genhtml_function_coverage=1 00:03:49.476 --rc genhtml_legend=1 00:03:49.476 --rc geninfo_all_blocks=1 00:03:49.476 --rc geninfo_unexecuted_blocks=1 00:03:49.476 00:03:49.476 ' 00:03:49.476 15:08:51 event -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:49.476 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:49.476 --rc genhtml_branch_coverage=1 00:03:49.476 --rc genhtml_function_coverage=1 00:03:49.476 --rc genhtml_legend=1 00:03:49.476 --rc geninfo_all_blocks=1 00:03:49.476 --rc geninfo_unexecuted_blocks=1 00:03:49.476 00:03:49.476 ' 00:03:49.476 15:08:51 event -- event/event.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/bdev/nbd_common.sh 00:03:49.476 15:08:51 event -- bdev/nbd_common.sh@6 -- # set -e 00:03:49.476 15:08:51 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:03:49.476 15:08:51 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:03:49.476 15:08:51 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:49.476 15:08:51 event -- common/autotest_common.sh@10 -- # set +x 00:03:49.735 ************************************ 00:03:49.735 START TEST event_perf 00:03:49.735 ************************************ 00:03:49.735 15:08:51 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 
00:03:49.735 Running I/O for 1 seconds...[2024-09-27 15:08:51.383000] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:03:49.735 [2024-09-27 15:08:51.383085] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648420 ] 00:03:49.735 [2024-09-27 15:08:51.474553] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:03:49.735 [2024-09-27 15:08:51.564919] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:03:49.735 [2024-09-27 15:08:51.565018] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:03:49.735 [2024-09-27 15:08:51.565051] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:49.735 [2024-09-27 15:08:51.565053] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:03:51.116 Running I/O for 1 seconds... 00:03:51.116 lcore 0: 212363 00:03:51.116 lcore 1: 212361 00:03:51.116 lcore 2: 212361 00:03:51.116 lcore 3: 212361 00:03:51.116 done. 00:03:51.116 00:03:51.116 real 0m1.284s 00:03:51.116 user 0m4.156s 00:03:51.116 sys 0m0.123s 00:03:51.116 15:08:52 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:51.116 15:08:52 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:03:51.116 ************************************ 00:03:51.116 END TEST event_perf 00:03:51.116 ************************************ 00:03:51.116 15:08:52 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:03:51.116 15:08:52 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:03:51.116 15:08:52 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:51.116 15:08:52 event -- common/autotest_common.sh@10 -- # set +x 00:03:51.116 ************************************ 00:03:51.116 START TEST event_reactor 00:03:51.116 ************************************ 00:03:51.116 15:08:52 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:03:51.116 [2024-09-27 15:08:52.749088] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
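The event_perf numbers above are one counter per lcore for a one-second run on mask 0xF. Assuming the "lcore N: count" lines keep exactly that format, a quick way to collapse them into a single throughput figure:
# Sketch: sum the per-lcore counters into total events for the 1-second run.
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 \
    | awk '/^lcore/ {sum += $3} END {printf "total: %d events in 1 s\n", sum}'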
00:03:51.116 [2024-09-27 15:08:52.749167] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648630 ] 00:03:51.116 [2024-09-27 15:08:52.839367] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:51.116 [2024-09-27 15:08:52.923629] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:52.496 test_start 00:03:52.496 oneshot 00:03:52.496 tick 100 00:03:52.496 tick 100 00:03:52.496 tick 250 00:03:52.496 tick 100 00:03:52.496 tick 100 00:03:52.496 tick 100 00:03:52.496 tick 250 00:03:52.496 tick 500 00:03:52.496 tick 100 00:03:52.496 tick 100 00:03:52.496 tick 250 00:03:52.496 tick 100 00:03:52.496 tick 100 00:03:52.496 test_end 00:03:52.496 00:03:52.496 real 0m1.273s 00:03:52.496 user 0m1.167s 00:03:52.496 sys 0m0.100s 00:03:52.496 15:08:53 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:52.496 15:08:53 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:03:52.496 ************************************ 00:03:52.496 END TEST event_reactor 00:03:52.496 ************************************ 00:03:52.496 15:08:54 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:03:52.496 15:08:54 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:03:52.496 15:08:54 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:52.496 15:08:54 event -- common/autotest_common.sh@10 -- # set +x 00:03:52.496 ************************************ 00:03:52.496 START TEST event_reactor_perf 00:03:52.496 ************************************ 00:03:52.496 15:08:54 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:03:52.496 [2024-09-27 15:08:54.102419] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:03:52.496 [2024-09-27 15:08:54.102506] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648831 ] 00:03:52.496 [2024-09-27 15:08:54.190094] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:03:52.496 [2024-09-27 15:08:54.274951] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:53.877 test_start 00:03:53.877 test_end 00:03:53.877 Performance: 516353 events per second 00:03:53.877 00:03:53.877 real 0m1.274s 00:03:53.877 user 0m1.165s 00:03:53.877 sys 0m0.104s 00:03:53.877 15:08:55 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:53.877 15:08:55 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:03:53.877 ************************************ 00:03:53.877 END TEST event_reactor_perf 00:03:53.877 ************************************ 00:03:53.877 15:08:55 event -- event/event.sh@49 -- # uname -s 00:03:53.877 15:08:55 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:03:53.877 15:08:55 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:03:53.877 15:08:55 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:53.877 15:08:55 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:53.877 15:08:55 event -- common/autotest_common.sh@10 -- # set +x 00:03:53.877 ************************************ 00:03:53.877 START TEST event_scheduler 00:03:53.877 ************************************ 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:03:53.877 * Looking for test storage... 
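The reactor and reactor_perf runs that just finished are also plain binaries from the build tree: reactor fires a one-shot event plus the periodic 100/250/500 ticks seen in its output, and reactor_perf reports how many events a single reactor drains per second. Run by hand they look like this (paths per this workspace):
# Sketch: invoke the two reactor micro-tests directly.
TESTS=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event
"$TESTS/reactor/reactor" -t 1              # prints the test_start/tick/test_end trace above
"$TESTS/reactor_perf/reactor_perf" -t 1    # prints 'Performance: N events per second'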
00:03:53.877 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/scheduler 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1681 -- # lcov --version 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:53.877 15:08:55 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:53.877 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.877 --rc genhtml_branch_coverage=1 00:03:53.877 --rc genhtml_function_coverage=1 00:03:53.877 --rc genhtml_legend=1 00:03:53.877 --rc geninfo_all_blocks=1 00:03:53.877 --rc geninfo_unexecuted_blocks=1 00:03:53.877 00:03:53.877 ' 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:53.877 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.877 --rc genhtml_branch_coverage=1 00:03:53.877 --rc genhtml_function_coverage=1 00:03:53.877 --rc genhtml_legend=1 00:03:53.877 --rc geninfo_all_blocks=1 00:03:53.877 --rc geninfo_unexecuted_blocks=1 00:03:53.877 00:03:53.877 ' 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:53.877 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.877 --rc genhtml_branch_coverage=1 00:03:53.877 --rc genhtml_function_coverage=1 00:03:53.877 --rc genhtml_legend=1 00:03:53.877 --rc geninfo_all_blocks=1 00:03:53.877 --rc geninfo_unexecuted_blocks=1 00:03:53.877 00:03:53.877 ' 00:03:53.877 15:08:55 event.event_scheduler -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:53.878 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:53.878 --rc genhtml_branch_coverage=1 00:03:53.878 --rc genhtml_function_coverage=1 00:03:53.878 --rc genhtml_legend=1 00:03:53.878 --rc geninfo_all_blocks=1 00:03:53.878 --rc geninfo_unexecuted_blocks=1 00:03:53.878 00:03:53.878 ' 00:03:53.878 15:08:55 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:03:53.878 15:08:55 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1649096 00:03:53.878 15:08:55 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:03:53.878 15:08:55 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:03:53.878 15:08:55 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1649096 
00:03:53.878 15:08:55 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 1649096 ']' 00:03:53.878 15:08:55 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:03:53.878 15:08:55 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:53.878 15:08:55 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:03:53.878 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:03:53.878 15:08:55 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:53.878 15:08:55 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:03:53.878 [2024-09-27 15:08:55.684145] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:03:53.878 [2024-09-27 15:08:55.684210] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1649096 ] 00:03:54.140 [2024-09-27 15:08:55.767283] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:03:54.140 [2024-09-27 15:08:55.861291] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:54.140 [2024-09-27 15:08:55.861403] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:03:54.140 [2024-09-27 15:08:55.861443] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:03:54.140 [2024-09-27 15:08:55.861442] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:03:54.710 15:08:56 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:03:54.710 [2024-09-27 15:08:56.547989] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:03:54.710 [2024-09-27 15:08:56.548014] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:03:54.710 [2024-09-27 15:08:56.548025] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:03:54.710 [2024-09-27 15:08:56.548033] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:03:54.710 [2024-09-27 15:08:56.548040] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.710 15:08:56 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.710 15:08:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:03:54.969 [2024-09-27 15:08:56.624977] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
00:03:54.969 15:08:56 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.969 15:08:56 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:03:54.969 15:08:56 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:54.969 15:08:56 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 ************************************ 00:03:54.970 START TEST scheduler_create_thread 00:03:54.970 ************************************ 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 2 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 3 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 4 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 5 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 6 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 7 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 8 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 9 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 10 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:54.970 15:08:56 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:55.908 15:08:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:55.908 15:08:57 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:03:55.908 15:08:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:55.908 15:08:57 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:57.355 15:08:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:57.355 15:08:59 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:03:57.355 15:08:59 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:03:57.355 15:08:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:03:57.355 15:08:59 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:58.294 15:09:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:03:58.294 00:03:58.294 real 0m3.381s 00:03:58.294 user 0m0.021s 00:03:58.294 sys 0m0.010s 00:03:58.294 15:09:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:58.294 15:09:00 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:03:58.294 ************************************ 00:03:58.294 END TEST scheduler_create_thread 00:03:58.294 ************************************ 00:03:58.294 15:09:00 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:03:58.294 15:09:00 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1649096 00:03:58.294 15:09:00 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 1649096 ']' 00:03:58.294 15:09:00 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 1649096 00:03:58.294 15:09:00 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:03:58.294 15:09:00 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:03:58.294 15:09:00 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1649096 00:03:58.555 15:09:00 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:03:58.555 15:09:00 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:03:58.555 15:09:00 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1649096' 00:03:58.555 killing process with pid 1649096 00:03:58.555 15:09:00 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 1649096 00:03:58.555 15:09:00 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 1649096 00:03:58.815 [2024-09-27 15:09:00.425243] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
00:03:59.074 00:03:59.074 real 0m5.238s 00:03:59.074 user 0m10.635s 00:03:59.074 sys 0m0.483s 00:03:59.074 15:09:00 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:03:59.074 15:09:00 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:03:59.074 ************************************ 00:03:59.074 END TEST event_scheduler 00:03:59.074 ************************************ 00:03:59.074 15:09:00 event -- event/event.sh@51 -- # modprobe -n nbd 00:03:59.074 15:09:00 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:03:59.074 15:09:00 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:03:59.074 15:09:00 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:03:59.074 15:09:00 event -- common/autotest_common.sh@10 -- # set +x 00:03:59.074 ************************************ 00:03:59.074 START TEST app_repeat 00:03:59.074 ************************************ 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1649899 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1649899' 00:03:59.074 Process app_repeat pid: 1649899 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:03:59.074 spdk_app_start Round 0 00:03:59.074 15:09:00 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1649899 /var/tmp/spdk-nbd.sock 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1649899 ']' 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:03:59.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:03:59.074 15:09:00 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:03:59.074 [2024-09-27 15:09:00.805264] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:03:59.074 [2024-09-27 15:09:00.805318] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1649899 ] 00:03:59.074 [2024-09-27 15:09:00.891626] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:03:59.333 [2024-09-27 15:09:00.985764] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:03:59.333 [2024-09-27 15:09:00.985765] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:03:59.900 15:09:01 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:03:59.900 15:09:01 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:03:59.900 15:09:01 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:00.158 Malloc0 00:04:00.158 15:09:01 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:00.416 Malloc1 00:04:00.416 15:09:02 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:00.416 15:09:02 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:04:00.674 /dev/nbd0 00:04:00.675 15:09:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:04:00.675 15:09:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 
00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:00.675 1+0 records in 00:04:00.675 1+0 records out 00:04:00.675 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224701 s, 18.2 MB/s 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:04:00.675 15:09:02 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:04:00.675 15:09:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:00.675 15:09:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:00.675 15:09:02 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:04:00.934 /dev/nbd1 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:00.934 1+0 records in 00:04:00.934 1+0 records out 00:04:00.934 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258149 s, 15.9 MB/s 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:04:00.934 15:09:02 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@95 
-- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:00.934 15:09:02 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:04:01.194 { 00:04:01.194 "nbd_device": "/dev/nbd0", 00:04:01.194 "bdev_name": "Malloc0" 00:04:01.194 }, 00:04:01.194 { 00:04:01.194 "nbd_device": "/dev/nbd1", 00:04:01.194 "bdev_name": "Malloc1" 00:04:01.194 } 00:04:01.194 ]' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:04:01.194 { 00:04:01.194 "nbd_device": "/dev/nbd0", 00:04:01.194 "bdev_name": "Malloc0" 00:04:01.194 }, 00:04:01.194 { 00:04:01.194 "nbd_device": "/dev/nbd1", 00:04:01.194 "bdev_name": "Malloc1" 00:04:01.194 } 00:04:01.194 ]' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:04:01.194 /dev/nbd1' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:04:01.194 /dev/nbd1' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:04:01.194 256+0 records in 00:04:01.194 256+0 records out 00:04:01.194 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115882 s, 90.5 MB/s 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:04:01.194 256+0 records in 00:04:01.194 256+0 records out 00:04:01.194 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197913 s, 53.0 MB/s 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:01.194 15:09:02 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:04:01.194 256+0 records in 00:04:01.194 256+0 records out 00:04:01.194 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209102 s, 50.1 MB/s 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1' verify 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:01.195 15:09:02 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:01.454 15:09:03 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd1 /proc/partitions 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:01.714 15:09:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:04:01.972 15:09:03 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:04:01.972 15:09:03 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:04:02.232 15:09:03 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:04:02.516 [2024-09-27 15:09:04.120833] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:02.516 [2024-09-27 15:09:04.204172] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:02.516 [2024-09-27 15:09:04.204173] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:04:02.516 [2024-09-27 15:09:04.251670] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:04:02.516 [2024-09-27 15:09:04.251721] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:04:05.805 15:09:06 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:04:05.805 15:09:06 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:04:05.805 spdk_app_start Round 1 00:04:05.805 15:09:06 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1649899 /var/tmp/spdk-nbd.sock 00:04:05.805 15:09:06 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1649899 ']' 00:04:05.805 15:09:06 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:04:05.805 15:09:06 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:05.805 15:09:06 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:04:05.805 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:04:05.805 15:09:06 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:05.805 15:09:06 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:04:05.805 15:09:07 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:05.805 15:09:07 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:04:05.805 15:09:07 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:05.805 Malloc0 00:04:05.805 15:09:07 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:05.805 Malloc1 00:04:05.805 15:09:07 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:05.805 15:09:07 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:04:06.065 /dev/nbd0 00:04:06.065 15:09:07 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:04:06.065 15:09:07 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:04:06.065 1+0 records in 00:04:06.065 1+0 records out 00:04:06.065 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251358 s, 16.3 MB/s 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:04:06.065 15:09:07 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:04:06.065 15:09:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:06.065 15:09:07 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:06.065 15:09:07 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:04:06.325 /dev/nbd1 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:06.325 1+0 records in 00:04:06.325 1+0 records out 00:04:06.325 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258193 s, 15.9 MB/s 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:04:06.325 15:09:08 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:06.325 15:09:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:04:06.585 { 00:04:06.585 
"nbd_device": "/dev/nbd0", 00:04:06.585 "bdev_name": "Malloc0" 00:04:06.585 }, 00:04:06.585 { 00:04:06.585 "nbd_device": "/dev/nbd1", 00:04:06.585 "bdev_name": "Malloc1" 00:04:06.585 } 00:04:06.585 ]' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:04:06.585 { 00:04:06.585 "nbd_device": "/dev/nbd0", 00:04:06.585 "bdev_name": "Malloc0" 00:04:06.585 }, 00:04:06.585 { 00:04:06.585 "nbd_device": "/dev/nbd1", 00:04:06.585 "bdev_name": "Malloc1" 00:04:06.585 } 00:04:06.585 ]' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:04:06.585 /dev/nbd1' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:04:06.585 /dev/nbd1' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:04:06.585 256+0 records in 00:04:06.585 256+0 records out 00:04:06.585 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00540808 s, 194 MB/s 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:04:06.585 256+0 records in 00:04:06.585 256+0 records out 00:04:06.585 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200348 s, 52.3 MB/s 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:04:06.585 256+0 records in 00:04:06.585 256+0 records out 00:04:06.585 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210639 s, 49.8 MB/s 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:06.585 15:09:08 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:06.844 15:09:08 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:04:07.103 15:09:08 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:04:07.362 15:09:09 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:04:07.362 15:09:09 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:04:07.620 15:09:09 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:04:07.879 [2024-09-27 15:09:09.528743] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:07.879 [2024-09-27 15:09:09.610909] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:04:07.879 [2024-09-27 15:09:09.610910] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:07.879 [2024-09-27 15:09:09.659183] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:04:07.879 [2024-09-27 15:09:09.659233] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:04:11.173 15:09:12 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:04:11.173 15:09:12 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:04:11.173 spdk_app_start Round 2 00:04:11.173 15:09:12 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1649899 /var/tmp/spdk-nbd.sock 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1649899 ']' 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:04:11.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:11.173 15:09:12 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:04:11.173 15:09:12 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:11.173 Malloc0 00:04:11.173 15:09:12 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:11.173 Malloc1 00:04:11.173 15:09:12 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:11.173 15:09:12 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:04:11.432 /dev/nbd0 00:04:11.432 15:09:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:04:11.432 15:09:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:04:11.432 1+0 records in 00:04:11.432 1+0 records out 00:04:11.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278104 s, 14.7 MB/s 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:04:11.432 15:09:13 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:04:11.432 15:09:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:11.432 15:09:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:11.432 15:09:13 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:04:11.691 /dev/nbd1 00:04:11.691 15:09:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:04:11.691 15:09:13 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:04:11.691 15:09:13 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:04:11.692 15:09:13 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:11.692 1+0 records in 00:04:11.692 1+0 records out 00:04:11.692 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235757 s, 17.4 MB/s 00:04:11.692 15:09:13 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:11.692 15:09:13 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:04:11.692 15:09:13 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdtest 00:04:11.692 15:09:13 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:04:11.692 15:09:13 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:04:11.692 15:09:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:11.692 15:09:13 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:11.692 15:09:13 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:11.692 15:09:13 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:11.692 15:09:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:11.950 15:09:13 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:04:11.950 { 00:04:11.950 
"nbd_device": "/dev/nbd0", 00:04:11.950 "bdev_name": "Malloc0" 00:04:11.950 }, 00:04:11.951 { 00:04:11.951 "nbd_device": "/dev/nbd1", 00:04:11.951 "bdev_name": "Malloc1" 00:04:11.951 } 00:04:11.951 ]' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:04:11.951 { 00:04:11.951 "nbd_device": "/dev/nbd0", 00:04:11.951 "bdev_name": "Malloc0" 00:04:11.951 }, 00:04:11.951 { 00:04:11.951 "nbd_device": "/dev/nbd1", 00:04:11.951 "bdev_name": "Malloc1" 00:04:11.951 } 00:04:11.951 ]' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:04:11.951 /dev/nbd1' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:04:11.951 /dev/nbd1' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:04:11.951 256+0 records in 00:04:11.951 256+0 records out 00:04:11.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106714 s, 98.3 MB/s 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:04:11.951 256+0 records in 00:04:11.951 256+0 records out 00:04:11.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199833 s, 52.5 MB/s 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:11.951 15:09:13 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:04:12.210 256+0 records in 00:04:12.210 256+0 records out 00:04:12.210 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210018 s, 49.9 MB/s 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/nbdrandtest 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:12.210 15:09:13 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:12.210 15:09:14 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:04:12.470 15:09:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:04:12.729 15:09:14 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:04:12.730 15:09:14 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:04:12.730 15:09:14 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:04:12.730 15:09:14 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:04:12.989 15:09:14 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:04:13.249 [2024-09-27 15:09:14.916237] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:13.249 [2024-09-27 15:09:14.998767] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:13.249 [2024-09-27 15:09:14.998768] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:04:13.249 [2024-09-27 15:09:15.046548] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:04:13.249 [2024-09-27 15:09:15.046598] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:04:16.540 15:09:17 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1649899 /var/tmp/spdk-nbd.sock 00:04:16.540 15:09:17 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 1649899 ']' 00:04:16.540 15:09:17 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:04:16.540 15:09:17 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:16.540 15:09:17 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:04:16.540 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
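The write/verify pass traced above (nbd_common.sh@70-85) reduces to: create 1 MiB of random data, copy it onto each exported /dev/nbdX with direct I/O, read it back with cmp, then remove the scratch file. A simplified sketch of that helper, with $testdir standing in for the test/event directory shown in the trace:

    # Simplified sketch of nbd_dd_data_verify as exercised above (abridged, not the full helper).
    nbd_dd_data_verify() {
        local nbd_list=($1) operation=$2
        local tmp_file=$testdir/nbdrandtest
        if [[ $operation == write ]]; then
            dd if=/dev/urandom of="$tmp_file" bs=4096 count=256            # 1 MiB of random data
            for i in "${nbd_list[@]}"; do
                dd if="$tmp_file" of="$i" bs=4096 count=256 oflag=direct   # push it to each nbd device
            done
        elif [[ $operation == verify ]]; then
            for i in "${nbd_list[@]}"; do
                cmp -b -n 1M "$tmp_file" "$i"                              # byte-for-byte read-back check
            done
            rm "$tmp_file"
        fi
    }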
00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:04:16.541 15:09:17 event.app_repeat -- event/event.sh@39 -- # killprocess 1649899 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 1649899 ']' 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 1649899 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:16.541 15:09:17 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1649899 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1649899' 00:04:16.541 killing process with pid 1649899 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@969 -- # kill 1649899 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@974 -- # wait 1649899 00:04:16.541 spdk_app_start is called in Round 0. 00:04:16.541 Shutdown signal received, stop current app iteration 00:04:16.541 Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 reinitialization... 00:04:16.541 spdk_app_start is called in Round 1. 00:04:16.541 Shutdown signal received, stop current app iteration 00:04:16.541 Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 reinitialization... 00:04:16.541 spdk_app_start is called in Round 2. 00:04:16.541 Shutdown signal received, stop current app iteration 00:04:16.541 Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 reinitialization... 00:04:16.541 spdk_app_start is called in Round 3. 
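The killprocess call traced just above follows a fixed pattern: bail out on an empty PID, confirm the process is alive with kill -0, check its comm name so a sudo wrapper is never signalled directly, then kill and wait. A rough sketch of that pattern (the real autotest_common.sh helper has extra handling for the sudo case):

    # Rough sketch of the killprocess pattern visible in the trace.
    killprocess() {
        local pid=$1
        [ -n "$pid" ] || return 1
        kill -0 "$pid"                                        # fails fast if the process is already gone
        if [ "$(uname)" = Linux ]; then
            process_name=$(ps --no-headers -o comm= "$pid")   # reactor_0 for an spdk_tgt instance
        fi
        if [ "$process_name" = sudo ]; then
            return 1                                          # don't SIGTERM the sudo wrapper itself (the real helper handles this case separately)
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                           # reap it and pick up its exit status
    }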
00:04:16.541 Shutdown signal received, stop current app iteration 00:04:16.541 15:09:18 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:04:16.541 15:09:18 event.app_repeat -- event/event.sh@42 -- # return 0 00:04:16.541 00:04:16.541 real 0m17.406s 00:04:16.541 user 0m37.403s 00:04:16.541 sys 0m3.291s 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:16.541 15:09:18 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:04:16.541 ************************************ 00:04:16.541 END TEST app_repeat 00:04:16.541 ************************************ 00:04:16.541 15:09:18 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:04:16.541 15:09:18 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/cpu_locks.sh 00:04:16.541 15:09:18 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:16.541 15:09:18 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:16.541 15:09:18 event -- common/autotest_common.sh@10 -- # set +x 00:04:16.541 ************************************ 00:04:16.541 START TEST cpu_locks 00:04:16.541 ************************************ 00:04:16.541 15:09:18 event.cpu_locks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event/cpu_locks.sh 00:04:16.541 * Looking for test storage... 00:04:16.541 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/event 00:04:16.541 15:09:18 event.cpu_locks -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:16.541 15:09:18 event.cpu_locks -- common/autotest_common.sh@1681 -- # lcov --version 00:04:16.541 15:09:18 event.cpu_locks -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:16.801 15:09:18 event.cpu_locks -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:16.801 15:09:18 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:04:16.801 15:09:18 event.cpu_locks -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:16.801 15:09:18 event.cpu_locks -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:16.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.801 --rc genhtml_branch_coverage=1 00:04:16.801 --rc genhtml_function_coverage=1 00:04:16.801 --rc genhtml_legend=1 00:04:16.801 --rc geninfo_all_blocks=1 00:04:16.801 --rc geninfo_unexecuted_blocks=1 00:04:16.801 00:04:16.801 ' 00:04:16.801 15:09:18 event.cpu_locks -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:16.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.801 --rc genhtml_branch_coverage=1 00:04:16.801 --rc genhtml_function_coverage=1 00:04:16.801 --rc genhtml_legend=1 00:04:16.801 --rc geninfo_all_blocks=1 00:04:16.801 --rc geninfo_unexecuted_blocks=1 00:04:16.801 00:04:16.801 ' 00:04:16.801 15:09:18 event.cpu_locks -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:16.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.801 --rc genhtml_branch_coverage=1 00:04:16.801 --rc genhtml_function_coverage=1 00:04:16.801 --rc genhtml_legend=1 00:04:16.801 --rc geninfo_all_blocks=1 00:04:16.801 --rc geninfo_unexecuted_blocks=1 00:04:16.801 00:04:16.801 ' 00:04:16.801 15:09:18 event.cpu_locks -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:16.801 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:16.801 --rc genhtml_branch_coverage=1 00:04:16.801 --rc genhtml_function_coverage=1 00:04:16.801 --rc genhtml_legend=1 00:04:16.802 --rc geninfo_all_blocks=1 00:04:16.802 --rc geninfo_unexecuted_blocks=1 00:04:16.802 00:04:16.802 ' 00:04:16.802 15:09:18 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:04:16.802 15:09:18 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:04:16.802 15:09:18 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:04:16.802 15:09:18 event.cpu_locks -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:04:16.802 15:09:18 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:16.802 15:09:18 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:16.802 15:09:18 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:16.802 ************************************ 
00:04:16.802 START TEST default_locks 00:04:16.802 ************************************ 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # default_locks 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1652970 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 1652970 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 1652970 ']' 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:16.802 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:16.802 15:09:18 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:04:16.802 [2024-09-27 15:09:18.550494] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:16.802 [2024-09-27 15:09:18.550552] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1652970 ] 00:04:16.802 [2024-09-27 15:09:18.632905] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:17.061 [2024-09-27 15:09:18.721738] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:17.632 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:17.632 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 0 00:04:17.632 15:09:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 1652970 00:04:17.632 15:09:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 1652970 00:04:17.632 15:09:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:04:18.201 lslocks: write error 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 1652970 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@950 -- # '[' -z 1652970 ']' 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # kill -0 1652970 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # uname 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1652970 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 
1652970' 00:04:18.201 killing process with pid 1652970 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@969 -- # kill 1652970 00:04:18.201 15:09:19 event.cpu_locks.default_locks -- common/autotest_common.sh@974 -- # wait 1652970 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1652970 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@650 -- # local es=0 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1652970 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # waitforlisten 1652970 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 1652970 ']' 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:18.461 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:04:18.461 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (1652970) - No such process 00:04:18.461 ERROR: process (pid: 1652970) is no longer running 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 1 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # es=1 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:04:18.461 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:04:18.462 00:04:18.462 real 0m1.771s 00:04:18.462 user 0m1.839s 00:04:18.462 sys 0m0.634s 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:18.462 15:09:20 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:04:18.462 ************************************ 00:04:18.462 END TEST default_locks 00:04:18.462 ************************************ 00:04:18.721 15:09:20 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:04:18.721 15:09:20 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:18.721 15:09:20 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:18.721 15:09:20 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:18.721 ************************************ 00:04:18.721 START TEST default_locks_via_rpc 00:04:18.721 ************************************ 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # default_locks_via_rpc 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1653188 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 1653188 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1653188 ']' 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:18.721 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
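The default_locks case that finished above rests on two small checks: locks_exist, which asks lslocks whether the target PID holds the spdk_cpu_lock file lock it takes per claimed core, and no_locks, which expects those locks to be gone once the target has been killed (the trace then wraps waitforlisten on the dead PID in NOT, so the 'No such process' error and the return 1 / es=1 path are the expected outcome). A sketch of the two checks; the lslocks pipeline is exactly what the trace shows at cpu_locks.sh@22, while the source of the lock_files list is not visible in the trace and is left as a stub:

    # Sketch of the lock checks exercised by default_locks.
    locks_exist() {
        lslocks -p "$1" | grep -q spdk_cpu_lock    # spdk_tgt holds one spdk_cpu_lock entry per claimed core
    }
    no_locks() {
        local lock_files=()                        # assumption: populated from leftover lock files; the trace only shows the final count check
        (( ${#lock_files[@]} != 0 )) && return 1
        return 0
    }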
00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:18.721 15:09:20 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:18.721 [2024-09-27 15:09:20.407058] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:18.721 [2024-09-27 15:09:20.407114] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653188 ] 00:04:18.721 [2024-09-27 15:09:20.489622] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:18.981 [2024-09-27 15:09:20.575959] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 1653188 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 1653188 00:04:19.552 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 1653188 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@950 -- # '[' -z 1653188 ']' 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # kill -0 1653188 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # uname 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1653188 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:20.121 
15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1653188' 00:04:20.121 killing process with pid 1653188 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@969 -- # kill 1653188 00:04:20.121 15:09:21 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@974 -- # wait 1653188 00:04:20.690 00:04:20.690 real 0m1.976s 00:04:20.690 user 0m2.081s 00:04:20.690 sys 0m0.687s 00:04:20.690 15:09:22 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:20.690 15:09:22 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:20.690 ************************************ 00:04:20.690 END TEST default_locks_via_rpc 00:04:20.690 ************************************ 00:04:20.690 15:09:22 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:04:20.690 15:09:22 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:20.690 15:09:22 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:20.690 15:09:22 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:20.690 ************************************ 00:04:20.690 START TEST non_locking_app_on_locked_coremask 00:04:20.690 ************************************ 00:04:20.690 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # non_locking_app_on_locked_coremask 00:04:20.690 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1653569 00:04:20.690 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 1653569 /var/tmp/spdk.sock 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1653569 ']' 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:20.691 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:20.691 15:09:22 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:20.691 [2024-09-27 15:09:22.476362] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
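default_locks_via_rpc, which just ended, exercises the same per-core lock but toggles it at runtime over JSON-RPC instead of at process start: disabling cpumask locks should leave no spdk_cpu_lock entries while the target keeps running, and re-enabling them should bring the lock back. The core of the test as traced at cpu_locks.sh@65-71 (rpc_cmd is the autotest wrapper around the target's /var/tmp/spdk.sock RPC socket):

    # Core of default_locks_via_rpc as seen in the trace.
    rpc_cmd framework_disable_cpumask_locks    # drop the per-core locks while spdk_tgt keeps running
    no_locks                                   # expect no spdk_cpu_lock entries now
    rpc_cmd framework_enable_cpumask_locks     # take the locks again
    locks_exist "$spdk_tgt_pid"                # lslocks on the target PID must show spdk_cpu_lock again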
00:04:20.691 [2024-09-27 15:09:22.476421] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653569 ] 00:04:20.950 [2024-09-27 15:09:22.558105] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:20.950 [2024-09-27 15:09:22.647709] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:21.519 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:21.519 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:04:21.519 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1653625 00:04:21.519 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 1653625 /var/tmp/spdk2.sock 00:04:21.519 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:04:21.519 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1653625 ']' 00:04:21.520 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:04:21.520 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:21.520 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:04:21.520 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:04:21.520 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:21.520 15:09:23 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:21.520 [2024-09-27 15:09:23.363792] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:21.520 [2024-09-27 15:09:23.363848] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1653625 ] 00:04:21.779 [2024-09-27 15:09:23.463263] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:04:21.779 [2024-09-27 15:09:23.463301] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:22.039 [2024-09-27 15:09:23.636954] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:22.608 15:09:24 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:22.608 15:09:24 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:04:22.608 15:09:24 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 1653569 00:04:22.608 15:09:24 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1653569 00:04:22.608 15:09:24 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:04:23.545 lslocks: write error 00:04:23.545 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 1653569 00:04:23.545 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1653569 ']' 00:04:23.545 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 1653569 00:04:23.545 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:04:23.545 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:23.545 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1653569 00:04:23.805 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:23.805 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:23.805 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1653569' 00:04:23.805 killing process with pid 1653569 00:04:23.805 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 1653569 00:04:23.806 15:09:25 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 1653569 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 1653625 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1653625 ']' 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 1653625 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1653625 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1653625' 00:04:24.375 
killing process with pid 1653625 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 1653625 00:04:24.375 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 1653625 00:04:25.007 00:04:25.007 real 0m4.150s 00:04:25.007 user 0m4.414s 00:04:25.007 sys 0m1.402s 00:04:25.007 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:25.007 15:09:26 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:25.007 ************************************ 00:04:25.007 END TEST non_locking_app_on_locked_coremask 00:04:25.007 ************************************ 00:04:25.007 15:09:26 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:04:25.007 15:09:26 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:25.007 15:09:26 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:25.007 15:09:26 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:25.007 ************************************ 00:04:25.007 START TEST locking_app_on_unlocked_coremask 00:04:25.007 ************************************ 00:04:25.007 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_unlocked_coremask 00:04:25.007 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1654159 00:04:25.007 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:04:25.007 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 1654159 /var/tmp/spdk.sock 00:04:25.007 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1654159 ']' 00:04:25.008 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:25.008 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:25.008 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:25.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:25.008 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:25.008 15:09:26 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:25.008 [2024-09-27 15:09:26.713643] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:25.008 [2024-09-27 15:09:26.713705] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654159 ] 00:04:25.008 [2024-09-27 15:09:26.799918] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:04:25.008 [2024-09-27 15:09:26.799949] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:25.268 [2024-09-27 15:09:26.889473] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1654293 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 1654293 /var/tmp/spdk2.sock 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1654293 ']' 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:25.836 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:04:25.836 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:04:25.837 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:25.837 15:09:27 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:25.837 [2024-09-27 15:09:27.620416] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:04:25.837 [2024-09-27 15:09:27.620476] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654293 ] 00:04:26.095 [2024-09-27 15:09:27.716687] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:26.095 [2024-09-27 15:09:27.883378] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:27.034 15:09:28 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:27.034 15:09:28 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:04:27.034 15:09:28 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 1654293 00:04:27.034 15:09:28 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1654293 00:04:27.034 15:09:28 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:04:27.973 lslocks: write error 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 1654159 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1654159 ']' 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 1654159 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1654159 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1654159' 00:04:27.973 killing process with pid 1654159 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 1654159 00:04:27.973 15:09:29 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 1654159 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 1654293 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1654293 ']' 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 1654293 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1654293 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:28.911 15:09:30 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1654293' 00:04:28.911 killing process with pid 1654293 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 1654293 00:04:28.911 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 1654293 00:04:29.170 00:04:29.170 real 0m4.167s 00:04:29.170 user 0m4.526s 00:04:29.170 sys 0m1.382s 00:04:29.170 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:29.170 15:09:30 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:29.170 ************************************ 00:04:29.170 END TEST locking_app_on_unlocked_coremask 00:04:29.170 ************************************ 00:04:29.170 15:09:30 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:04:29.170 15:09:30 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:29.170 15:09:30 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:29.170 15:09:30 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:29.170 ************************************ 00:04:29.170 START TEST locking_app_on_locked_coremask 00:04:29.170 ************************************ 00:04:29.170 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_locked_coremask 00:04:29.170 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1654757 00:04:29.170 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 1654757 /var/tmp/spdk.sock 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1654757 ']' 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:29.171 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:29.171 15:09:30 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:29.171 [2024-09-27 15:09:30.971175] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
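The two coremask tests that completed above (non_locking_app_on_locked_coremask and locking_app_on_unlocked_coremask) both start a second spdk_tgt on the same core mask 0x1 but on a separate RPC socket, and rely on exactly one of the two instances being launched with --disable-cpumask-locks so they never collide on the core-0 lock; the two tests simply swap which side skips the lock, and locks_exist is then run against whichever instance kept it. Reconstructed from the trace, with $rootdir standing in for the workspace path (the arrangement shown is the non_locking variant, where the second instance skips the lock):

    # Two targets sharing core 0; one side skips the cpumask lock, so both start cleanly.
    "$rootdir/build/bin/spdk_tgt" -m 0x1 &
    spdk_tgt_pid=$!
    waitforlisten "$spdk_tgt_pid" /var/tmp/spdk.sock
    "$rootdir/build/bin/spdk_tgt" -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
    spdk_tgt_pid2=$!
    waitforlisten "$spdk_tgt_pid2" /var/tmp/spdk2.sock
    locks_exist "$spdk_tgt_pid"    # the core-0 lock still belongs to the first, lock-holding instance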
00:04:29.171 [2024-09-27 15:09:30.971233] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654757 ] 00:04:29.430 [2024-09-27 15:09:31.055891] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:29.430 [2024-09-27 15:09:31.146249] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1654927 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1654927 /var/tmp/spdk2.sock 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@650 -- # local es=0 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1654927 /var/tmp/spdk2.sock 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # waitforlisten 1654927 /var/tmp/spdk2.sock 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 1654927 ']' 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:04:29.999 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:29.999 15:09:31 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:30.260 [2024-09-27 15:09:31.864909] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:04:30.260 [2024-09-27 15:09:31.864965] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1654927 ] 00:04:30.260 [2024-09-27 15:09:31.961188] app.c: 779:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1654757 has claimed it. 00:04:30.260 [2024-09-27 15:09:31.961228] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:04:30.829 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (1654927) - No such process 00:04:30.829 ERROR: process (pid: 1654927) is no longer running 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 1 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # es=1 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 1654757 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 1654757 00:04:30.829 15:09:32 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:04:31.397 lslocks: write error 00:04:31.397 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 1654757 00:04:31.397 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 1654757 ']' 00:04:31.397 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 1654757 00:04:31.397 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:04:31.397 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:31.397 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1654757 00:04:31.656 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:31.656 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:31.656 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1654757' 00:04:31.656 killing process with pid 1654757 00:04:31.656 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 1654757 00:04:31.656 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 1654757 00:04:31.916 00:04:31.916 real 0m2.716s 00:04:31.916 user 0m2.961s 00:04:31.916 sys 0m0.918s 00:04:31.916 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:04:31.916 15:09:33 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:31.916 ************************************ 00:04:31.916 END TEST locking_app_on_locked_coremask 00:04:31.916 ************************************ 00:04:31.916 15:09:33 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:04:31.916 15:09:33 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:31.916 15:09:33 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:31.916 15:09:33 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:31.916 ************************************ 00:04:31.916 START TEST locking_overlapped_coremask 00:04:31.916 ************************************ 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1655153 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 1655153 /var/tmp/spdk.sock 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 1655153 ']' 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:31.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:31.916 15:09:33 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:31.916 [2024-09-27 15:09:33.762200] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:04:31.916 [2024-09-27 15:09:33.762254] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655153 ] 00:04:32.174 [2024-09-27 15:09:33.846989] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:04:32.174 [2024-09-27 15:09:33.940116] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:04:32.174 [2024-09-27 15:09:33.940150] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:32.174 [2024-09-27 15:09:33.940151] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 0 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1655334 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1655334 /var/tmp/spdk2.sock 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@650 -- # local es=0 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1655334 /var/tmp/spdk2.sock 00:04:33.112 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # waitforlisten 1655334 /var/tmp/spdk2.sock 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 1655334 ']' 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:04:33.113 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:33.113 15:09:34 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:33.113 [2024-09-27 15:09:34.683663] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:04:33.113 [2024-09-27 15:09:34.683723] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655334 ] 00:04:33.113 [2024-09-27 15:09:34.783851] app.c: 779:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1655153 has claimed it. 00:04:33.113 [2024-09-27 15:09:34.783889] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:04:33.682 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (1655334) - No such process 00:04:33.682 ERROR: process (pid: 1655334) is no longer running 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 1 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # es=1 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 1655153 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@950 -- # '[' -z 1655153 ']' 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # kill -0 1655153 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # uname 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1655153 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1655153' 00:04:33.682 killing process with pid 1655153 00:04:33.682 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@969 -- # kill 1655153 00:04:33.682 15:09:35 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@974 -- # wait 1655153 00:04:33.941 00:04:33.941 real 0m2.033s 00:04:33.941 user 0m5.668s 00:04:33.941 sys 0m0.508s 00:04:33.941 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:33.941 15:09:35 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:04:33.941 ************************************ 00:04:33.941 END TEST locking_overlapped_coremask 00:04:33.941 ************************************ 00:04:34.199 15:09:35 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:04:34.199 15:09:35 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:34.199 15:09:35 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:34.199 15:09:35 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:34.199 ************************************ 00:04:34.199 START TEST locking_overlapped_coremask_via_rpc 00:04:34.199 ************************************ 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask_via_rpc 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1655477 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 1655477 /var/tmp/spdk.sock 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1655477 ']' 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:34.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:34.199 15:09:35 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:34.199 [2024-09-27 15:09:35.891708] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:34.200 [2024-09-27 15:09:35.891770] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655477 ] 00:04:34.200 [2024-09-27 15:09:35.974439] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:04:34.200 [2024-09-27 15:09:35.974472] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:04:34.459 [2024-09-27 15:09:36.065115] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:04:34.459 [2024-09-27 15:09:36.065218] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:34.459 [2024-09-27 15:09:36.065218] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1655565 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 1655565 /var/tmp/spdk2.sock 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1655565 ']' 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:04:35.027 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:35.027 15:09:36 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:35.027 [2024-09-27 15:09:36.790384] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:35.027 [2024-09-27 15:09:36.790447] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655565 ] 00:04:35.287 [2024-09-27 15:09:36.892204] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:04:35.287 [2024-09-27 15:09:36.892236] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:04:35.287 [2024-09-27 15:09:37.053458] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:04:35.287 [2024-09-27 15:09:37.057395] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:04:35.287 [2024-09-27 15:09:37.057396] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@650 -- # local es=0 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:35.856 [2024-09-27 15:09:37.652419] app.c: 779:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1655477 has claimed it. 
00:04:35.856 request: 00:04:35.856 { 00:04:35.856 "method": "framework_enable_cpumask_locks", 00:04:35.856 "req_id": 1 00:04:35.856 } 00:04:35.856 Got JSON-RPC error response 00:04:35.856 response: 00:04:35.856 { 00:04:35.856 "code": -32603, 00:04:35.856 "message": "Failed to claim CPU core: 2" 00:04:35.856 } 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # es=1 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 1655477 /var/tmp/spdk.sock 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1655477 ']' 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:35.856 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:35.856 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 1655565 /var/tmp/spdk2.sock 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 1655565 ']' 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:04:36.116 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:36.116 15:09:37 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:04:36.375 00:04:36.375 real 0m2.233s 00:04:36.375 user 0m0.997s 00:04:36.375 sys 0m0.171s 00:04:36.375 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:36.376 15:09:38 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:04:36.376 ************************************ 00:04:36.376 END TEST locking_overlapped_coremask_via_rpc 00:04:36.376 ************************************ 00:04:36.376 15:09:38 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:04:36.376 15:09:38 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1655477 ]] 00:04:36.376 15:09:38 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1655477 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1655477 ']' 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1655477 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1655477 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1655477' 00:04:36.376 killing process with pid 1655477 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 1655477 00:04:36.376 15:09:38 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 1655477 00:04:36.946 15:09:38 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1655565 ]] 00:04:36.946 15:09:38 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1655565 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1655565 ']' 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1655565 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' 
Linux = Linux ']' 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1655565 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1655565' 00:04:36.946 killing process with pid 1655565 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 1655565 00:04:36.946 15:09:38 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 1655565 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 1655477 ]] 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 1655477 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1655477 ']' 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1655477 00:04:37.205 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1655477) - No such process 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 1655477 is not found' 00:04:37.205 Process with pid 1655477 is not found 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 1655565 ]] 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 1655565 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 1655565 ']' 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 1655565 00:04:37.205 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1655565) - No such process 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 1655565 is not found' 00:04:37.205 Process with pid 1655565 is not found 00:04:37.205 15:09:38 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:04:37.205 00:04:37.205 real 0m20.721s 00:04:37.205 user 0m33.933s 00:04:37.205 sys 0m6.913s 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:37.205 15:09:38 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:04:37.205 ************************************ 00:04:37.205 END TEST cpu_locks 00:04:37.205 ************************************ 00:04:37.205 00:04:37.205 real 0m47.901s 00:04:37.205 user 1m28.728s 00:04:37.205 sys 0m11.502s 00:04:37.205 15:09:39 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:37.205 15:09:39 event -- common/autotest_common.sh@10 -- # set +x 00:04:37.205 ************************************ 00:04:37.205 END TEST event 00:04:37.205 ************************************ 00:04:37.465 15:09:39 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread/thread.sh 00:04:37.465 15:09:39 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:37.465 15:09:39 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:37.465 15:09:39 -- common/autotest_common.sh@10 -- # set +x 00:04:37.465 ************************************ 00:04:37.465 START TEST thread 00:04:37.465 ************************************ 00:04:37.465 15:09:39 thread -- common/autotest_common.sh@1125 -- 
# /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread/thread.sh 00:04:37.465 * Looking for test storage... 00:04:37.465 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread 00:04:37.465 15:09:39 thread -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:37.465 15:09:39 thread -- common/autotest_common.sh@1681 -- # lcov --version 00:04:37.465 15:09:39 thread -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:37.465 15:09:39 thread -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:37.465 15:09:39 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:37.465 15:09:39 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:37.465 15:09:39 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:37.465 15:09:39 thread -- scripts/common.sh@336 -- # IFS=.-: 00:04:37.465 15:09:39 thread -- scripts/common.sh@336 -- # read -ra ver1 00:04:37.465 15:09:39 thread -- scripts/common.sh@337 -- # IFS=.-: 00:04:37.465 15:09:39 thread -- scripts/common.sh@337 -- # read -ra ver2 00:04:37.465 15:09:39 thread -- scripts/common.sh@338 -- # local 'op=<' 00:04:37.465 15:09:39 thread -- scripts/common.sh@340 -- # ver1_l=2 00:04:37.466 15:09:39 thread -- scripts/common.sh@341 -- # ver2_l=1 00:04:37.466 15:09:39 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:37.466 15:09:39 thread -- scripts/common.sh@344 -- # case "$op" in 00:04:37.466 15:09:39 thread -- scripts/common.sh@345 -- # : 1 00:04:37.466 15:09:39 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:37.466 15:09:39 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:37.466 15:09:39 thread -- scripts/common.sh@365 -- # decimal 1 00:04:37.466 15:09:39 thread -- scripts/common.sh@353 -- # local d=1 00:04:37.466 15:09:39 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:37.466 15:09:39 thread -- scripts/common.sh@355 -- # echo 1 00:04:37.466 15:09:39 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:04:37.466 15:09:39 thread -- scripts/common.sh@366 -- # decimal 2 00:04:37.466 15:09:39 thread -- scripts/common.sh@353 -- # local d=2 00:04:37.466 15:09:39 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:37.466 15:09:39 thread -- scripts/common.sh@355 -- # echo 2 00:04:37.466 15:09:39 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:04:37.466 15:09:39 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:37.466 15:09:39 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:37.466 15:09:39 thread -- scripts/common.sh@368 -- # return 0 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:37.466 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.466 --rc genhtml_branch_coverage=1 00:04:37.466 --rc genhtml_function_coverage=1 00:04:37.466 --rc genhtml_legend=1 00:04:37.466 --rc geninfo_all_blocks=1 00:04:37.466 --rc geninfo_unexecuted_blocks=1 00:04:37.466 00:04:37.466 ' 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:37.466 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.466 --rc genhtml_branch_coverage=1 00:04:37.466 --rc genhtml_function_coverage=1 00:04:37.466 --rc genhtml_legend=1 00:04:37.466 --rc geninfo_all_blocks=1 00:04:37.466 --rc geninfo_unexecuted_blocks=1 00:04:37.466 00:04:37.466 ' 00:04:37.466 15:09:39 thread -- 
common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:37.466 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.466 --rc genhtml_branch_coverage=1 00:04:37.466 --rc genhtml_function_coverage=1 00:04:37.466 --rc genhtml_legend=1 00:04:37.466 --rc geninfo_all_blocks=1 00:04:37.466 --rc geninfo_unexecuted_blocks=1 00:04:37.466 00:04:37.466 ' 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:37.466 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.466 --rc genhtml_branch_coverage=1 00:04:37.466 --rc genhtml_function_coverage=1 00:04:37.466 --rc genhtml_legend=1 00:04:37.466 --rc geninfo_all_blocks=1 00:04:37.466 --rc geninfo_unexecuted_blocks=1 00:04:37.466 00:04:37.466 ' 00:04:37.466 15:09:39 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:37.466 15:09:39 thread -- common/autotest_common.sh@10 -- # set +x 00:04:37.726 ************************************ 00:04:37.726 START TEST thread_poller_perf 00:04:37.726 ************************************ 00:04:37.726 15:09:39 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:04:37.726 [2024-09-27 15:09:39.365848] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:37.726 [2024-09-27 15:09:39.365933] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656037 ] 00:04:37.726 [2024-09-27 15:09:39.452260] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:37.726 [2024-09-27 15:09:39.537350] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:37.726 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:04:39.108 ====================================== 00:04:39.108 busy:2306573084 (cyc) 00:04:39.108 total_run_count: 410000 00:04:39.108 tsc_hz: 2300000000 (cyc) 00:04:39.108 ====================================== 00:04:39.108 poller_cost: 5625 (cyc), 2445 (nsec) 00:04:39.108 00:04:39.108 real 0m1.280s 00:04:39.108 user 0m1.171s 00:04:39.108 sys 0m0.103s 00:04:39.108 15:09:40 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:39.108 15:09:40 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:04:39.108 ************************************ 00:04:39.108 END TEST thread_poller_perf 00:04:39.108 ************************************ 00:04:39.108 15:09:40 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:04:39.108 15:09:40 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:04:39.108 15:09:40 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:39.108 15:09:40 thread -- common/autotest_common.sh@10 -- # set +x 00:04:39.108 ************************************ 00:04:39.108 START TEST thread_poller_perf 00:04:39.108 ************************************ 00:04:39.108 15:09:40 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:04:39.108 [2024-09-27 15:09:40.731480] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:39.108 [2024-09-27 15:09:40.731558] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656235 ] 00:04:39.108 [2024-09-27 15:09:40.819841] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:39.108 [2024-09-27 15:09:40.909208] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:39.108 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:04:40.489 ====================================== 00:04:40.489 busy:2301980192 (cyc) 00:04:40.489 total_run_count: 5537000 00:04:40.489 tsc_hz: 2300000000 (cyc) 00:04:40.489 ====================================== 00:04:40.489 poller_cost: 415 (cyc), 180 (nsec) 00:04:40.489 00:04:40.489 real 0m1.279s 00:04:40.489 user 0m1.168s 00:04:40.489 sys 0m0.105s 00:04:40.489 15:09:41 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:40.489 15:09:41 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:04:40.489 ************************************ 00:04:40.489 END TEST thread_poller_perf 00:04:40.489 ************************************ 00:04:40.489 15:09:42 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:04:40.489 00:04:40.489 real 0m2.921s 00:04:40.489 user 0m2.505s 00:04:40.489 sys 0m0.435s 00:04:40.490 15:09:42 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:40.490 15:09:42 thread -- common/autotest_common.sh@10 -- # set +x 00:04:40.490 ************************************ 00:04:40.490 END TEST thread 00:04:40.490 ************************************ 00:04:40.490 15:09:42 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:04:40.490 15:09:42 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app/cmdline.sh 00:04:40.490 15:09:42 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:40.490 15:09:42 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:40.490 15:09:42 -- common/autotest_common.sh@10 -- # set +x 00:04:40.490 ************************************ 00:04:40.490 START TEST app_cmdline 00:04:40.490 ************************************ 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app/cmdline.sh 00:04:40.490 * Looking for test storage... 00:04:40.490 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1681 -- # lcov --version 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@345 -- # : 1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:40.490 15:09:42 app_cmdline -- scripts/common.sh@368 -- # return 0 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:40.490 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.490 --rc genhtml_branch_coverage=1 00:04:40.490 --rc genhtml_function_coverage=1 00:04:40.490 --rc genhtml_legend=1 00:04:40.490 --rc geninfo_all_blocks=1 00:04:40.490 --rc geninfo_unexecuted_blocks=1 00:04:40.490 00:04:40.490 ' 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:40.490 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.490 --rc genhtml_branch_coverage=1 00:04:40.490 --rc genhtml_function_coverage=1 00:04:40.490 --rc genhtml_legend=1 00:04:40.490 --rc geninfo_all_blocks=1 00:04:40.490 --rc geninfo_unexecuted_blocks=1 00:04:40.490 00:04:40.490 ' 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:40.490 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.490 --rc genhtml_branch_coverage=1 00:04:40.490 --rc genhtml_function_coverage=1 00:04:40.490 --rc genhtml_legend=1 00:04:40.490 --rc geninfo_all_blocks=1 00:04:40.490 --rc geninfo_unexecuted_blocks=1 00:04:40.490 00:04:40.490 ' 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:40.490 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:40.490 --rc genhtml_branch_coverage=1 00:04:40.490 --rc genhtml_function_coverage=1 00:04:40.490 --rc genhtml_legend=1 00:04:40.490 --rc geninfo_all_blocks=1 00:04:40.490 --rc geninfo_unexecuted_blocks=1 00:04:40.490 00:04:40.490 ' 00:04:40.490 15:09:42 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:04:40.490 15:09:42 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1656496 00:04:40.490 15:09:42 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1656496 00:04:40.490 15:09:42 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 1656496 ']' 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@838 -- # 
echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:40.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:40.490 15:09:42 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:04:40.750 [2024-09-27 15:09:42.372160] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:40.750 [2024-09-27 15:09:42.372223] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656496 ] 00:04:40.750 [2024-09-27 15:09:42.457434] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:40.750 [2024-09-27 15:09:42.537668] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:04:41.689 { 00:04:41.689 "version": "SPDK v25.01-pre git sha1 71dc0c1e9", 00:04:41.689 "fields": { 00:04:41.689 "major": 25, 00:04:41.689 "minor": 1, 00:04:41.689 "patch": 0, 00:04:41.689 "suffix": "-pre", 00:04:41.689 "commit": "71dc0c1e9" 00:04:41.689 } 00:04:41.689 } 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@26 -- # sort 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:04:41.689 15:09:43 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" 
in 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py ]] 00:04:41.689 15:09:43 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:04:41.949 request: 00:04:41.949 { 00:04:41.949 "method": "env_dpdk_get_mem_stats", 00:04:41.949 "req_id": 1 00:04:41.949 } 00:04:41.949 Got JSON-RPC error response 00:04:41.949 response: 00:04:41.949 { 00:04:41.949 "code": -32601, 00:04:41.949 "message": "Method not found" 00:04:41.949 } 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:04:41.949 15:09:43 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1656496 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 1656496 ']' 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 1656496 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1656496 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1656496' 00:04:41.949 killing process with pid 1656496 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@969 -- # kill 1656496 00:04:41.949 15:09:43 app_cmdline -- common/autotest_common.sh@974 -- # wait 1656496 00:04:42.209 00:04:42.209 real 0m1.934s 00:04:42.209 user 0m2.228s 00:04:42.209 sys 0m0.558s 00:04:42.209 15:09:44 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:42.209 15:09:44 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:04:42.209 ************************************ 00:04:42.209 END TEST app_cmdline 00:04:42.209 ************************************ 00:04:42.471 15:09:44 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app/version.sh 00:04:42.471 15:09:44 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:42.471 15:09:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:42.471 15:09:44 -- common/autotest_common.sh@10 -- # set +x 00:04:42.471 ************************************ 00:04:42.471 START TEST version 00:04:42.471 ************************************ 00:04:42.471 15:09:44 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app/version.sh 00:04:42.471 * Looking for test storage... 
00:04:42.471 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app 00:04:42.471 15:09:44 version -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:42.471 15:09:44 version -- common/autotest_common.sh@1681 -- # lcov --version 00:04:42.471 15:09:44 version -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:42.731 15:09:44 version -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:42.731 15:09:44 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:42.731 15:09:44 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:42.731 15:09:44 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:42.731 15:09:44 version -- scripts/common.sh@336 -- # IFS=.-: 00:04:42.731 15:09:44 version -- scripts/common.sh@336 -- # read -ra ver1 00:04:42.731 15:09:44 version -- scripts/common.sh@337 -- # IFS=.-: 00:04:42.731 15:09:44 version -- scripts/common.sh@337 -- # read -ra ver2 00:04:42.731 15:09:44 version -- scripts/common.sh@338 -- # local 'op=<' 00:04:42.731 15:09:44 version -- scripts/common.sh@340 -- # ver1_l=2 00:04:42.731 15:09:44 version -- scripts/common.sh@341 -- # ver2_l=1 00:04:42.731 15:09:44 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:42.731 15:09:44 version -- scripts/common.sh@344 -- # case "$op" in 00:04:42.731 15:09:44 version -- scripts/common.sh@345 -- # : 1 00:04:42.731 15:09:44 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:42.731 15:09:44 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:42.731 15:09:44 version -- scripts/common.sh@365 -- # decimal 1 00:04:42.731 15:09:44 version -- scripts/common.sh@353 -- # local d=1 00:04:42.731 15:09:44 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:42.731 15:09:44 version -- scripts/common.sh@355 -- # echo 1 00:04:42.731 15:09:44 version -- scripts/common.sh@365 -- # ver1[v]=1 00:04:42.731 15:09:44 version -- scripts/common.sh@366 -- # decimal 2 00:04:42.731 15:09:44 version -- scripts/common.sh@353 -- # local d=2 00:04:42.731 15:09:44 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:42.731 15:09:44 version -- scripts/common.sh@355 -- # echo 2 00:04:42.731 15:09:44 version -- scripts/common.sh@366 -- # ver2[v]=2 00:04:42.731 15:09:44 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:42.731 15:09:44 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:42.731 15:09:44 version -- scripts/common.sh@368 -- # return 0 00:04:42.731 15:09:44 version -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:42.731 15:09:44 version -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:42.731 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.731 --rc genhtml_branch_coverage=1 00:04:42.731 --rc genhtml_function_coverage=1 00:04:42.731 --rc genhtml_legend=1 00:04:42.731 --rc geninfo_all_blocks=1 00:04:42.731 --rc geninfo_unexecuted_blocks=1 00:04:42.731 00:04:42.731 ' 00:04:42.731 15:09:44 version -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:42.731 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.731 --rc genhtml_branch_coverage=1 00:04:42.731 --rc genhtml_function_coverage=1 00:04:42.731 --rc genhtml_legend=1 00:04:42.731 --rc geninfo_all_blocks=1 00:04:42.731 --rc geninfo_unexecuted_blocks=1 00:04:42.731 00:04:42.731 ' 00:04:42.731 15:09:44 version -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:42.731 --rc 
lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.731 --rc genhtml_branch_coverage=1 00:04:42.731 --rc genhtml_function_coverage=1 00:04:42.731 --rc genhtml_legend=1 00:04:42.731 --rc geninfo_all_blocks=1 00:04:42.731 --rc geninfo_unexecuted_blocks=1 00:04:42.731 00:04:42.731 ' 00:04:42.731 15:09:44 version -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:42.731 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.731 --rc genhtml_branch_coverage=1 00:04:42.731 --rc genhtml_function_coverage=1 00:04:42.731 --rc genhtml_legend=1 00:04:42.731 --rc geninfo_all_blocks=1 00:04:42.731 --rc geninfo_unexecuted_blocks=1 00:04:42.731 00:04:42.731 ' 00:04:42.731 15:09:44 version -- app/version.sh@17 -- # get_header_version major 00:04:42.731 15:09:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/nvmf-phy-autotest/spdk/include/spdk/version.h 00:04:42.731 15:09:44 version -- app/version.sh@14 -- # cut -f2 00:04:42.731 15:09:44 version -- app/version.sh@14 -- # tr -d '"' 00:04:42.731 15:09:44 version -- app/version.sh@17 -- # major=25 00:04:42.731 15:09:44 version -- app/version.sh@18 -- # get_header_version minor 00:04:42.732 15:09:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/nvmf-phy-autotest/spdk/include/spdk/version.h 00:04:42.732 15:09:44 version -- app/version.sh@14 -- # cut -f2 00:04:42.732 15:09:44 version -- app/version.sh@14 -- # tr -d '"' 00:04:42.732 15:09:44 version -- app/version.sh@18 -- # minor=1 00:04:42.732 15:09:44 version -- app/version.sh@19 -- # get_header_version patch 00:04:42.732 15:09:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/nvmf-phy-autotest/spdk/include/spdk/version.h 00:04:42.732 15:09:44 version -- app/version.sh@14 -- # cut -f2 00:04:42.732 15:09:44 version -- app/version.sh@14 -- # tr -d '"' 00:04:42.732 15:09:44 version -- app/version.sh@19 -- # patch=0 00:04:42.732 15:09:44 version -- app/version.sh@20 -- # get_header_version suffix 00:04:42.732 15:09:44 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/nvmf-phy-autotest/spdk/include/spdk/version.h 00:04:42.732 15:09:44 version -- app/version.sh@14 -- # cut -f2 00:04:42.732 15:09:44 version -- app/version.sh@14 -- # tr -d '"' 00:04:42.732 15:09:44 version -- app/version.sh@20 -- # suffix=-pre 00:04:42.732 15:09:44 version -- app/version.sh@22 -- # version=25.1 00:04:42.732 15:09:44 version -- app/version.sh@25 -- # (( patch != 0 )) 00:04:42.732 15:09:44 version -- app/version.sh@28 -- # version=25.1rc0 00:04:42.732 15:09:44 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python 00:04:42.732 15:09:44 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:04:42.732 15:09:44 version -- app/version.sh@30 -- # py_version=25.1rc0 00:04:42.732 15:09:44 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:04:42.732 00:04:42.732 real 0m0.281s 00:04:42.732 user 0m0.149s 00:04:42.732 sys 0m0.190s 00:04:42.732 15:09:44 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:42.732 15:09:44 version -- 
common/autotest_common.sh@10 -- # set +x 00:04:42.732 ************************************ 00:04:42.732 END TEST version 00:04:42.732 ************************************ 00:04:42.732 15:09:44 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:04:42.732 15:09:44 -- spdk/autotest.sh@194 -- # uname -s 00:04:42.732 15:09:44 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:04:42.732 15:09:44 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:04:42.732 15:09:44 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:04:42.732 15:09:44 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@256 -- # timing_exit lib 00:04:42.732 15:09:44 -- common/autotest_common.sh@730 -- # xtrace_disable 00:04:42.732 15:09:44 -- common/autotest_common.sh@10 -- # set +x 00:04:42.732 15:09:44 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@263 -- # '[' 0 -eq 1 ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@272 -- # '[' 1 -eq 1 ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@273 -- # export NET_TYPE 00:04:42.732 15:09:44 -- spdk/autotest.sh@276 -- # '[' rdma = rdma ']' 00:04:42.732 15:09:44 -- spdk/autotest.sh@277 -- # run_test nvmf_rdma /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf.sh --transport=rdma 00:04:42.732 15:09:44 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:04:42.732 15:09:44 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:42.732 15:09:44 -- common/autotest_common.sh@10 -- # set +x 00:04:42.732 ************************************ 00:04:42.732 START TEST nvmf_rdma 00:04:42.732 ************************************ 00:04:42.732 15:09:44 nvmf_rdma -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf.sh --transport=rdma 00:04:42.992 * Looking for test storage... 00:04:42.992 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1681 -- # lcov --version 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@336 -- # IFS=.-: 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@336 -- # read -ra ver1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@337 -- # IFS=.-: 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@337 -- # read -ra ver2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@338 -- # local 'op=<' 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@340 -- # ver1_l=2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@341 -- # ver2_l=1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@344 -- # case "$op" in 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@345 -- # : 1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@365 -- # decimal 1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@353 -- # local d=1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@355 -- # echo 1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@365 -- # ver1[v]=1 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@366 -- # decimal 2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@353 -- # local d=2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@355 -- # echo 2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@366 -- # ver2[v]=2 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:42.992 15:09:44 nvmf_rdma -- scripts/common.sh@368 -- # return 0 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:42.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.992 --rc genhtml_branch_coverage=1 00:04:42.992 --rc genhtml_function_coverage=1 00:04:42.992 --rc genhtml_legend=1 00:04:42.992 --rc geninfo_all_blocks=1 00:04:42.992 --rc geninfo_unexecuted_blocks=1 00:04:42.992 00:04:42.992 ' 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:42.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.992 --rc genhtml_branch_coverage=1 00:04:42.992 --rc genhtml_function_coverage=1 00:04:42.992 --rc genhtml_legend=1 00:04:42.992 --rc geninfo_all_blocks=1 00:04:42.992 --rc geninfo_unexecuted_blocks=1 00:04:42.992 00:04:42.992 ' 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:42.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.992 --rc genhtml_branch_coverage=1 00:04:42.992 --rc genhtml_function_coverage=1 00:04:42.992 --rc genhtml_legend=1 00:04:42.992 --rc geninfo_all_blocks=1 00:04:42.992 --rc geninfo_unexecuted_blocks=1 00:04:42.992 00:04:42.992 ' 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:42.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.992 --rc genhtml_branch_coverage=1 00:04:42.992 --rc genhtml_function_coverage=1 00:04:42.992 --rc genhtml_legend=1 00:04:42.992 --rc geninfo_all_blocks=1 00:04:42.992 --rc geninfo_unexecuted_blocks=1 00:04:42.992 00:04:42.992 ' 00:04:42.992 15:09:44 nvmf_rdma -- nvmf/nvmf.sh@10 -- # run_test nvmf_target_core /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_target_core.sh --transport=rdma 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:42.992 15:09:44 nvmf_rdma -- common/autotest_common.sh@10 -- # set +x 00:04:42.992 ************************************ 00:04:42.992 START TEST nvmf_target_core 00:04:42.992 ************************************ 00:04:42.992 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_target_core.sh --transport=rdma 00:04:43.252 * Looking for test storage... 
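The lt 1.15 2 / cmp_versions blocks that keep recurring in these traces are just a dotted-version compare used to decide whether the installed lcov predates 2.x, and therefore which LCOV_OPTS/LCOV values get exported. A simplified sketch of that comparison, using a ver_lt name invented for illustration (the real scripts/common.sh cmp_versions also handles '>', '=' and mixed separators):

# Split both versions on . - : and compare field by field; missing fields count as 0.
ver_lt() {
    local -a a b
    IFS='.-:' read -ra a <<< "$1"
    IFS='.-:' read -ra b <<< "$2"
    local i max=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
    for (( i = 0; i < max; i++ )); do
        (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
        (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
    done
    return 1  # equal versions are not "less than"
}

ver_lt "$(lcov --version | awk '{print $NF}')" 2 && echo "pre-2.x lcov options selected"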
00:04:43.252 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1681 -- # lcov --version 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@336 -- # IFS=.-: 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@336 -- # read -ra ver1 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@337 -- # IFS=.-: 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@337 -- # read -ra ver2 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@338 -- # local 'op=<' 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@340 -- # ver1_l=2 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@341 -- # ver2_l=1 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@344 -- # case "$op" in 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@345 -- # : 1 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@365 -- # decimal 1 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@353 -- # local d=1 00:04:43.252 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@355 -- # echo 1 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@365 -- # ver1[v]=1 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@366 -- # decimal 2 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@353 -- # local d=2 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@355 -- # echo 2 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@366 -- # ver2[v]=2 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- scripts/common.sh@368 -- # return 0 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:43.253 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.253 --rc genhtml_branch_coverage=1 00:04:43.253 --rc genhtml_function_coverage=1 00:04:43.253 --rc genhtml_legend=1 00:04:43.253 --rc geninfo_all_blocks=1 00:04:43.253 --rc geninfo_unexecuted_blocks=1 00:04:43.253 00:04:43.253 ' 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:43.253 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.253 --rc genhtml_branch_coverage=1 00:04:43.253 --rc genhtml_function_coverage=1 00:04:43.253 --rc genhtml_legend=1 00:04:43.253 --rc geninfo_all_blocks=1 00:04:43.253 --rc geninfo_unexecuted_blocks=1 00:04:43.253 00:04:43.253 ' 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:43.253 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.253 --rc genhtml_branch_coverage=1 00:04:43.253 --rc genhtml_function_coverage=1 00:04:43.253 --rc genhtml_legend=1 00:04:43.253 --rc geninfo_all_blocks=1 00:04:43.253 --rc geninfo_unexecuted_blocks=1 00:04:43.253 00:04:43.253 ' 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:43.253 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.253 --rc genhtml_branch_coverage=1 00:04:43.253 --rc genhtml_function_coverage=1 00:04:43.253 --rc genhtml_legend=1 00:04:43.253 --rc geninfo_all_blocks=1 00:04:43.253 --rc geninfo_unexecuted_blocks=1 00:04:43.253 00:04:43.253 ' 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@7 -- # uname -s 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 
00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:43.253 15:09:44 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- scripts/common.sh@15 -- # shopt -s extglob 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- paths/export.sh@5 -- # export PATH 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- 
paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@50 -- # : 0 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:04:43.253 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/common.sh@54 -- # have_pci_nics=0 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@11 -- # trap 'exit 1' SIGINT SIGTERM EXIT 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@13 -- # TEST_ARGS=("$@") 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@15 -- # [[ 0 -eq 0 ]] 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@16 -- # run_test nvmf_abort /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/abort.sh --transport=rdma 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:04:43.253 ************************************ 00:04:43.253 START TEST nvmf_abort 00:04:43.253 ************************************ 00:04:43.253 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/abort.sh --transport=rdma 00:04:43.514 * Looking for test storage... 
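The "common.sh: line 31: [: : integer expression expected" message in the trace above is bash's [ builtin rejecting an empty string where -eq needs an integer ('[' '' -eq 1 ']' in the xtrace); the test simply evaluates false and the script carries on, so it is noise rather than a failure. A minimal sketch of the usual way to keep such a check quiet, with MAYBE_SET as a purely illustrative variable name (the actual variable at common.sh line 31 is not shown in the trace):

# Default an unset/empty variable to 0 before the numeric comparison.
if [ "${MAYBE_SET:-0}" -eq 1 ]; then
    echo "flag enabled"
fi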
00:04:43.514 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1681 -- # lcov --version 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@336 -- # IFS=.-: 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@336 -- # read -ra ver1 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@337 -- # IFS=.-: 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@337 -- # read -ra ver2 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@338 -- # local 'op=<' 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@340 -- # ver1_l=2 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@341 -- # ver2_l=1 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@344 -- # case "$op" in 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@345 -- # : 1 00:04:43.514 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@365 -- # decimal 1 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@353 -- # local d=1 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@355 -- # echo 1 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@365 -- # ver1[v]=1 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@366 -- # decimal 2 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@353 -- # local d=2 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@355 -- # echo 2 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@366 -- # ver2[v]=2 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@368 -- # return 0 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:43.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.515 --rc genhtml_branch_coverage=1 00:04:43.515 --rc genhtml_function_coverage=1 00:04:43.515 --rc genhtml_legend=1 00:04:43.515 --rc geninfo_all_blocks=1 00:04:43.515 --rc geninfo_unexecuted_blocks=1 00:04:43.515 00:04:43.515 ' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:43.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.515 --rc genhtml_branch_coverage=1 00:04:43.515 --rc genhtml_function_coverage=1 00:04:43.515 --rc genhtml_legend=1 00:04:43.515 --rc geninfo_all_blocks=1 00:04:43.515 --rc geninfo_unexecuted_blocks=1 00:04:43.515 00:04:43.515 ' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:43.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.515 --rc genhtml_branch_coverage=1 00:04:43.515 --rc genhtml_function_coverage=1 00:04:43.515 --rc genhtml_legend=1 00:04:43.515 --rc geninfo_all_blocks=1 00:04:43.515 --rc geninfo_unexecuted_blocks=1 00:04:43.515 00:04:43.515 ' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:43.515 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.515 --rc genhtml_branch_coverage=1 00:04:43.515 --rc genhtml_function_coverage=1 00:04:43.515 --rc genhtml_legend=1 00:04:43.515 --rc geninfo_all_blocks=1 00:04:43.515 --rc geninfo_unexecuted_blocks=1 00:04:43.515 00:04:43.515 ' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@7 -- # uname -s 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- 
nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@15 -- # shopt -s extglob 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- paths/export.sh@5 -- # export PATH 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@50 -- # : 0 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:04:43.515 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:04:43.515 15:09:45 
nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@54 -- # have_pci_nics=0 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@11 -- # MALLOC_BDEV_SIZE=64 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@12 -- # MALLOC_BLOCK_SIZE=4096 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@14 -- # nvmftestinit 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:04:43.515 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@292 -- # prepare_net_devs 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@254 -- # local -g is_hw=no 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@256 -- # remove_target_ns 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@22 -- # _remove_target_ns 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@125 -- # xtrace_disable 00:04:43.516 15:09:45 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@131 -- # pci_devs=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@131 -- # local -a pci_devs 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@132 -- # pci_net_devs=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@133 -- # pci_drivers=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@133 -- # local -A pci_drivers 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@135 -- # net_devs=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@135 -- # local -ga net_devs 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@136 -- # e810=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@136 -- # local -ga e810 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@137 -- # x722=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@137 -- # local -ga x722 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@138 -- # mlx=() 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@138 -- # local -ga mlx 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:04:51.647 15:09:51 
nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:04:51.647 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:04:51.647 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 
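Device discovery in this phase is driven purely by PCI vendor/device IDs: the mlx array built above whitelists the Mellanox (0x15b3) parts, each matching address is reported as "Found 0000:18:00.x", and the corresponding net device names are then read back from sysfs. A rough sketch of that lookup, assuming lspci is available; this is illustrative and not the actual gather_supported_nvmf_pci_devs implementation:

# List Mellanox PCI functions and the kernel net devices bound to them.
for pci in $(lspci -Dn -d 15b3: | awk '{print $1}'); do
    for netdir in /sys/bus/pci/devices/"$pci"/net/*; do
        [ -e "$netdir" ] || continue          # skip functions with no netdev
        echo "Found net device under $pci: $(basename "$netdir")"
    done
done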
00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:04:51.647 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:04:51.648 Found net devices under 0000:18:00.0: mlx_0_0 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:04:51.648 Found net devices under 0000:18:00.1: mlx_0_1 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@249 -- # get_rdma_if_list 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@75 -- # rdma_devs=() 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:04:51.648 15:09:51 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- 
nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@89 -- # continue 2 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@89 -- # continue 2 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@258 -- # is_hw=yes 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@61 -- # uname 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@65 -- # modprobe ib_cm 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@66 -- # modprobe ib_core 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@67 -- # modprobe ib_umad 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@69 -- # modprobe iw_cm 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@27 -- # local -gA 
dev_map 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@28 -- # local -g _dev 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@44 -- # ips=() 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@58 -- # key_initiator=target1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@11 -- # local val=167772161 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:04:51.648 10.0.0.1 00:04:51.648 15:09:52 
nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@11 -- # local val=167772162 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:04:51.648 10.0.0.2 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:04:51.648 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@38 -- # ping_ips 1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:04:51.649 15:09:52 
nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # get_net_dev target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@107 -- # local dev=target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:04:51.649 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:04:51.649 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:04:51.649 00:04:51.649 --- 10.0.0.2 ping statistics --- 00:04:51.649 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:04:51.649 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # get_net_dev target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@107 -- # local dev=target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:04:51.649 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:04:51.649 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:04:51.649 00:04:51.649 --- 10.0.0.2 ping statistics --- 00:04:51.649 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:04:51.649 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@98 -- # (( pair++ )) 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@266 -- # return 0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # get_net_dev target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@107 -- # local dev=target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:04:51.649 15:09:52 
nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # get_net_dev target1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@107 -- # local dev=target1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # get_net_dev target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@107 -- # local dev=target0 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:04:51.649 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:04:51.650 15:09:52 
nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # get_net_dev target1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@107 -- # local dev=target1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@15 -- # nvmfappstart -m 0xE 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@324 -- # nvmfpid=1659934 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@325 -- # waitforlisten 1659934 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@831 -- # '[' -z 1659934 ']' 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@836 -- # local max_retries=100 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:04:51.650 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@840 -- # xtrace_disable 00:04:51.650 15:09:52 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 [2024-09-27 15:09:52.338491] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:04:51.650 [2024-09-27 15:09:52.338553] [ DPDK EAL parameters: nvmf -c 0xE --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:04:51.650 [2024-09-27 15:09:52.422094] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:04:51.650 [2024-09-27 15:09:52.516487] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:04:51.650 [2024-09-27 15:09:52.516530] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:04:51.650 [2024-09-27 15:09:52.516540] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:04:51.650 [2024-09-27 15:09:52.516549] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:04:51.650 [2024-09-27 15:09:52.516556] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:04:51.650 [2024-09-27 15:09:52.516617] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:04:51.650 [2024-09-27 15:09:52.516642] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:04:51.650 [2024-09-27 15:09:52.516643] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@864 -- # return 0 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@730 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@17 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -a 256 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 [2024-09-27 15:09:53.296737] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1f40c10/0x1f45100) succeed. 00:04:51.650 [2024-09-27 15:09:53.317437] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1f421b0/0x1f867a0) succeed. 
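The trace above shows the nvmfappstart step: common.sh launches build/bin/nvmf_tgt, records its pid in nvmfpid, and then waitforlisten polls until the target answers on the UNIX domain socket /var/tmp/spdk.sock (rpc_addr, max_retries=100) before the abort test issues any rpc_cmd calls. A minimal bash sketch of that wait loop follows; it only illustrates the idea, and the probe via scripts/rpc.py rpc_get_methods is an assumption since the exact check inside autotest_common.sh is not visible in this log.

# Sketch only: poll until the SPDK target identified by $1 answers RPCs on its socket.
waitforlisten_sketch() {
    local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100 i
    for ((i = 0; i < max_retries; i++)); do
        # Bail out if the target process died before it ever started listening.
        kill -0 "$pid" 2> /dev/null || return 1
        # Once the UNIX socket exists and answers a trivial RPC, the target is up.
        if [[ -S $rpc_addr ]] && scripts/rpc.py -s "$rpc_addr" rpc_get_methods &> /dev/null; then
            return 0
        fi
        sleep 0.5
    done
    return 1
}

Usage in the spirit of the trace would be waitforlisten_sketch "$nvmfpid", failing the test early if the target never comes up within the retry budget.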
00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@20 -- # rpc_cmd bdev_malloc_create 64 4096 -b Malloc0 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 Malloc0 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@21 -- # rpc_cmd bdev_delay_create -b Malloc0 -d Delay0 -r 1000000 -t 1000000 -w 1000000 -n 1000000 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 Delay0 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@24 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 Delay0 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.650 [2024-09-27 15:09:53.484938] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@27 -- # rpc_cmd nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:51.650 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:51.909 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:51.909 15:09:53 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@30 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/abort -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -c 0x1 -t 1 -l warning -q 128 00:04:51.909 [2024-09-27 15:09:53.603440] nvme_fabric.c: 295:nvme_fabric_discover_probe: 
*WARNING*: Skipping unsupported current discovery service or discovery service referral 00:04:54.449 Initializing NVMe Controllers 00:04:54.449 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:04:54.449 controller IO queue size 128 less than required 00:04:54.449 Consider using lower queue depth or small IO size because IO requests may be queued at the NVMe driver. 00:04:54.449 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 0 00:04:54.449 Initialization complete. Launching workers. 00:04:54.449 NS: RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 I/O completed: 123, failed: 41690 00:04:54.449 CTRLR: RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) abort submitted 41751, failed to submit 62 00:04:54.449 success 41691, unsuccessful 60, failed 0 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@34 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode0 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@36 -- # trap - SIGINT SIGTERM EXIT 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- target/abort.sh@38 -- # nvmftestfini 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@331 -- # nvmfcleanup 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@99 -- # sync 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@102 -- # set +e 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@103 -- # for i in {1..20} 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:04:54.449 rmmod nvme_rdma 00:04:54.449 rmmod nvme_fabrics 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@106 -- # set -e 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@107 -- # return 0 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@332 -- # '[' -n 1659934 ']' 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@333 -- # killprocess 1659934 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@950 -- # '[' -z 1659934 ']' 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@954 -- # kill -0 1659934 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@955 -- # uname 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1659934 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:04:54.449 
15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1659934' 00:04:54.449 killing process with pid 1659934 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@969 -- # kill 1659934 00:04:54.449 15:09:55 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@974 -- # wait 1659934 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@338 -- # nvmf_fini 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@264 -- # local dev 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@267 -- # remove_target_ns 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@22 -- # _remove_target_ns 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@268 -- # delete_main_bridge 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@130 -- # return 0 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:04:54.449 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort 
-- nvmf/setup.sh@41 -- # _dev=0 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@41 -- # dev_map=() 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/setup.sh@284 -- # iptr 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@538 -- # iptables-save 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- nvmf/common.sh@538 -- # iptables-restore 00:04:54.450 00:04:54.450 real 0m11.101s 00:04:54.450 user 0m15.069s 00:04:54.450 sys 0m5.860s 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_abort -- common/autotest_common.sh@10 -- # set +x 00:04:54.450 ************************************ 00:04:54.450 END TEST nvmf_abort 00:04:54.450 ************************************ 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@17 -- # run_test nvmf_ns_hotplug_stress /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/ns_hotplug_stress.sh --transport=rdma 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:04:54.450 ************************************ 00:04:54.450 START TEST nvmf_ns_hotplug_stress 00:04:54.450 ************************************ 00:04:54.450 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/ns_hotplug_stress.sh --transport=rdma 00:04:54.711 * Looking for test storage... 
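The nvmf_abort teardown traced just above (nvmftestfini into nvmf_fini) unloads nvme-rdma and nvme-fabrics, flushes the test addresses off mlx_0_1 and mlx_0_0, clears dev_map, and finally scrubs the firewall with the iptr pipeline iptables-save | grep -v SPDK_NVMF | iptables-restore. A condensed sketch of that sequence, assuming root privileges and the device names seen in this run; the real logic lives in nvmf/setup.sh and nvmf/common.sh and does more bookkeeping than shown here.

# Sketch only: per-test cleanup mirroring the traced teardown order.
nvmf_teardown_sketch() {
    local dev
    # Unload the host-side NVMe fabrics modules; ignore failures if they are busy or absent.
    modprobe -v -r nvme-rdma nvme-fabrics || true
    for dev in mlx_0_1 mlx_0_0; do
        # Only flush interfaces that still exist on this host.
        [[ -e /sys/class/net/$dev/address ]] && ip addr flush dev "$dev"
    done
    # Restore iptables minus any rules the test suite tagged with SPDK_NVMF.
    iptables-save | grep -v SPDK_NVMF | iptables-restore
}

The grep -v filter is the whole trick: every rule the suite adds carries an SPDK_NVMF comment, so replaying the saved ruleset without those lines removes only the test's own rules and leaves the host firewall untouched.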
00:04:54.711 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1681 -- # lcov --version 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@336 -- # IFS=.-: 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@336 -- # read -ra ver1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@337 -- # IFS=.-: 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@337 -- # read -ra ver2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@338 -- # local 'op=<' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@340 -- # ver1_l=2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@341 -- # ver2_l=1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@344 -- # case "$op" in 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@345 -- # : 1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@365 -- # decimal 1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@353 -- # local d=1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@355 -- # echo 1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@365 -- # ver1[v]=1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@366 -- # decimal 2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@353 -- # local d=2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@355 -- # echo 2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@366 -- # ver2[v]=2 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@368 -- # return 0 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:54.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.711 --rc genhtml_branch_coverage=1 00:04:54.711 --rc genhtml_function_coverage=1 00:04:54.711 --rc genhtml_legend=1 00:04:54.711 --rc geninfo_all_blocks=1 00:04:54.711 --rc geninfo_unexecuted_blocks=1 00:04:54.711 00:04:54.711 ' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:54.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.711 --rc genhtml_branch_coverage=1 00:04:54.711 --rc genhtml_function_coverage=1 00:04:54.711 --rc genhtml_legend=1 00:04:54.711 --rc geninfo_all_blocks=1 00:04:54.711 --rc geninfo_unexecuted_blocks=1 00:04:54.711 00:04:54.711 ' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:54.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.711 --rc genhtml_branch_coverage=1 00:04:54.711 --rc genhtml_function_coverage=1 00:04:54.711 --rc genhtml_legend=1 00:04:54.711 --rc geninfo_all_blocks=1 00:04:54.711 --rc geninfo_unexecuted_blocks=1 00:04:54.711 00:04:54.711 ' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:54.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:54.711 --rc genhtml_branch_coverage=1 00:04:54.711 --rc genhtml_function_coverage=1 00:04:54.711 --rc genhtml_legend=1 00:04:54.711 --rc geninfo_all_blocks=1 00:04:54.711 --rc geninfo_unexecuted_blocks=1 00:04:54.711 00:04:54.711 ' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@9 -- # 
source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@7 -- # uname -s 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@15 -- # shopt -s extglob 00:04:54.711 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- paths/export.sh@5 -- # export PATH 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@50 -- # : 0 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:04:54.712 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@54 -- # have_pci_nics=0 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@11 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@22 -- # nvmftestinit 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@292 -- # prepare_net_devs 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@254 -- # local -g is_hw=no 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@256 -- # remove_target_ns 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@22 -- # _remove_target_ns 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@125 -- # xtrace_disable 00:04:54.712 15:09:56 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@10 -- # set +x 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@131 -- # pci_devs=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@131 -- # local -a pci_devs 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@132 -- # pci_net_devs=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@133 -- # pci_drivers=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@133 -- # local -A pci_drivers 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@135 -- # net_devs=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@135 -- # local -ga net_devs 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@136 -- # e810=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@136 -- # local -ga e810 00:05:02.906 
15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@137 -- # x722=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@137 -- # local -ga x722 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@138 -- # mlx=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@138 -- # local -ga mlx 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:05:02.906 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:05:02.906 
15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:05:02.906 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:05:02.906 Found net devices under 0000:18:00.0: mlx_0_0 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:05:02.906 Found net devices under 0000:18:00.1: mlx_0_1 00:05:02.906 15:10:03 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@249 -- # get_rdma_if_list 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@75 -- # rdma_devs=() 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:05:02.906 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@89 -- # continue 2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@89 -- # continue 2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@258 -- # is_hw=yes 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@264 
-- # nvmf_rdma_init 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@61 -- # uname 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@65 -- # modprobe ib_cm 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@66 -- # modprobe ib_core 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@67 -- # modprobe ib_umad 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@69 -- # modprobe iw_cm 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@27 -- # local -gA dev_map 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@28 -- # local -g _dev 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@44 -- # ips=() 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@58 -- # key_initiator=target1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- 
nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@11 -- # local val=167772161 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:05:02.907 10.0.0.1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@11 -- # local val=167772162 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@210 -- # tee 
/sys/class/net/mlx_0_1/ifalias 00:05:02.907 10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@38 -- # ping_ips 1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:05:02.907 15:10:03 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:05:02.907 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:05:02.907 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.033 ms 00:05:02.907 00:05:02.907 --- 10.0.0.2 ping statistics --- 00:05:02.907 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:05:02.907 rtt min/avg/max/mdev = 0.033/0.033/0.033/0.000 ms 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:05:02.907 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:05:02.907 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:05:02.907 00:05:02.907 --- 10.0.0.2 ping statistics --- 00:05:02.907 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:05:02.907 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@98 -- # (( pair++ )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@266 -- # return 0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:05:02.907 15:10:03 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # get_net_dev target1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@107 -- # local dev=target1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:02.907 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # 
ip=10.0.0.2 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # get_net_dev target1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@107 -- # local dev=target1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@23 -- # nvmfappstart -m 0xE 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@724 -- # xtrace_disable 
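At this stage the RDMA setup phase is complete: mlx_0_0 holds 10.0.0.1/24 and mlx_0_1 holds 10.0.0.2/24, each address is mirrored into the device's ifalias so later lookups can read it back, reachability has been verified with single-packet pings, and the legacy variables (NVMF_FIRST_TARGET_IP=10.0.0.2, NVMF_SECOND_TARGET_IP=10.0.0.1, NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024') are in place. A minimal shell sketch of the equivalent manual steps follows; the device names, addresses, and module list are taken from the log above, while the loop and exports are illustrative rather than the exact nvmf/setup.sh code.

    # Load the RDMA/IB stack the test loads (one modprobe per module).
    for m in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm nvme-rdma; do
        modprobe "$m"
    done
    # Address the two physical Mellanox ports and record each IP in ifalias,
    # which is what the later get_ip_address lookups read back.
    ip addr add 10.0.0.1/24 dev mlx_0_0 && ip link set mlx_0_0 up
    ip addr add 10.0.0.2/24 dev mlx_0_1 && ip link set mlx_0_1 up
    echo 10.0.0.1 > /sys/class/net/mlx_0_0/ifalias
    echo 10.0.0.2 > /sys/class/net/mlx_0_1/ifalias
    ping -c 1 10.0.0.2          # same sanity check the log performs
    export NVMF_FIRST_TARGET_IP=10.0.0.2 NVMF_SECOND_TARGET_IP=10.0.0.1
    export NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024'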
00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@10 -- # set +x 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@324 -- # nvmfpid=1663434 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@325 -- # waitforlisten 1663434 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@831 -- # '[' -z 1663434 ']' 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:02.908 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:02.908 15:10:03 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@10 -- # set +x 00:05:02.908 [2024-09-27 15:10:03.615110] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:05:02.908 [2024-09-27 15:10:03.615176] [ DPDK EAL parameters: nvmf -c 0xE --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:05:02.908 [2024-09-27 15:10:03.700300] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:02.908 [2024-09-27 15:10:03.790728] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:05:02.908 [2024-09-27 15:10:03.790770] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:05:02.908 [2024-09-27 15:10:03.790780] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:02.908 [2024-09-27 15:10:03.790789] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:02.908 [2024-09-27 15:10:03.790796] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
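The entries above are nvmfappstart at work: the target binary is launched as /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE (pid 1663434 here), and waitforlisten blocks until the application answers on /var/tmp/spdk.sock. A rough sketch of that launch-and-wait pattern, assuming the current directory is the SPDK repository root and using rpc_get_methods purely as a liveness probe, not the actual autotest_common.sh implementation:

    # Start the NVMe-oF target on cores 1-3 (mask 0xE) with all trace groups enabled.
    ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE &
    nvmfpid=$!
    # Poll the RPC socket until the target is ready to accept configuration.
    until ./scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
        kill -0 "$nvmfpid" 2>/dev/null || { echo 'nvmf_tgt exited early' >&2; exit 1; }
        sleep 0.5
    done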
00:05:02.908 [2024-09-27 15:10:03.793364] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:05:02.908 [2024-09-27 15:10:03.793468] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:05:02.908 [2024-09-27 15:10:03.793469] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@864 -- # return 0 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@10 -- # set +x 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@25 -- # null_size=1000 00:05:02.908 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:05:02.908 [2024-09-27 15:10:04.719382] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0xf36c10/0xf3b100) succeed. 00:05:02.908 [2024-09-27 15:10:04.729903] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0xf381b0/0xf7c7a0) succeed. 00:05:03.166 15:10:04 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10 00:05:03.426 15:10:05 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@30 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:05:03.426 [2024-09-27 15:10:05.245907] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:05:03.685 15:10:05 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:05:03.685 15:10:05 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@32 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 32 512 -b Malloc0 00:05:03.943 Malloc0 00:05:03.943 15:10:05 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_delay_create -b Malloc0 -d Delay0 -r 1000000 -t 1000000 -w 1000000 -n 1000000 00:05:04.203 Delay0 00:05:04.203 15:10:05 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:04.462 15:10:06 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@35 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create NULL1 1000 
512 00:05:04.462 NULL1 00:05:04.463 15:10:06 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 NULL1 00:05:04.721 15:10:06 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@42 -- # PERF_PID=1663899 00:05:04.721 15:10:06 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@40 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -c 0x1 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -t 30 -q 128 -w randread -o 512 -Q 1000 00:05:04.721 15:10:06 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:04.721 15:10:06 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:06.100 Read completed with error (sct=0, sc=11) 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.100 15:10:07 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.100 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.101 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.101 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:06.101 15:10:07 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1001 00:05:06.101 15:10:07 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1001 00:05:06.359 true 00:05:06.359 15:10:08 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:06.359 15:10:08 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 15:10:08 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 Message suppressed 999 times: Read completed with error 
(sct=0, sc=11) 00:05:07.296 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:07.296 15:10:09 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1002 00:05:07.296 15:10:09 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1002 00:05:07.555 true 00:05:07.555 15:10:09 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:07.555 15:10:09 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 15:10:10 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.491 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:08.750 15:10:10 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1003 00:05:08.750 15:10:10 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1003 00:05:08.750 true 00:05:08.750 15:10:10 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:08.750 15:10:10 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 15:10:11 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.686 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:09.944 15:10:11 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1004 00:05:09.944 15:10:11 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1004 00:05:09.944 true 00:05:09.944 15:10:11 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:09.944 15:10:11 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:10.882 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:10.882 15:10:12 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:10.882 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:10.882 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:10.882 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:10.882 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:10.882 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:11.140 15:10:12 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1005 00:05:11.140 15:10:12 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1005 00:05:11.140 true 00:05:11.140 15:10:12 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:11.140 15:10:12 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.079 15:10:13 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.079 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.338 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:12.338 15:10:13 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1006 00:05:12.338 15:10:13 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1006 00:05:12.338 true 00:05:12.338 15:10:14 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:12.338 15:10:14 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns 
nqn.2016-06.io.spdk:cnode1 1 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.278 15:10:14 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.278 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.537 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:13.537 15:10:15 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1007 00:05:13.537 15:10:15 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1007 00:05:13.537 true 00:05:13.797 15:10:15 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:13.797 15:10:15 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:14.367 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.367 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.367 15:10:16 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:14.626 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:14.627 15:10:16 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1008 00:05:14.627 15:10:16 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1008 00:05:14.886 true 00:05:14.886 15:10:16 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:14.886 15:10:16 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 15:10:17 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:15.823 15:10:17 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1009 00:05:15.823 15:10:17 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1009 00:05:16.082 true 00:05:16.082 15:10:17 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:16.082 15:10:17 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 15:10:18 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:17.019 15:10:18 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1010 00:05:17.019 15:10:18 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1010 00:05:17.278 true 00:05:17.278 15:10:19 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:17.278 15:10:19 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:18.216 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.216 15:10:19 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:18.216 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.216 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.216 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.216 Message suppressed 999 times: Read completed with error 
(sct=0, sc=11) 00:05:18.216 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.475 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.475 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:18.475 15:10:20 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1011 00:05:18.475 15:10:20 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1011 00:05:18.735 true 00:05:18.735 15:10:20 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:18.735 15:10:20 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:19.305 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 15:10:21 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:19.564 15:10:21 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1012 00:05:19.564 15:10:21 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1012 00:05:19.824 true 00:05:19.824 15:10:21 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:19.824 15:10:21 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 15:10:22 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:20.764 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 
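From here to the end of the run the log is the ns_hotplug_stress loop itself: with spdk_nvme_perf (pid 1663899, a 30-second randread run against 10.0.0.2:4420) still alive, each iteration hot-removes namespace 1 (Delay0) from nqn.2016-06.io.spdk:cnode1, re-adds it, and resizes NULL1 one step larger as null_size counts up (1001, 1002, ...). The bursts of 'Read completed with error (sct=0, sc=11)' are the perf reads completing with an error status while the namespace is detached underneath them, which is the condition the test exercises. A simplified sketch of that loop, built from the rpc.py calls visible in the log rather than the script's exact code:

    # Keep hot-plugging namespace 1 for as long as the perf workload is running.
    null_size=1000
    while kill -0 "$PERF_PID" 2>/dev/null; do
        scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1     # hot-remove Delay0
        scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0   # hot-add it back
        scripts/rpc.py bdev_null_resize NULL1 $((++null_size))                   # grow NULL1: 1001, 1002, ...
    done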
00:05:20.764 15:10:22 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1013 00:05:20.764 15:10:22 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1013 00:05:21.024 true 00:05:21.024 15:10:22 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:21.024 15:10:22 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 15:10:23 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:21.962 15:10:23 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1014 00:05:21.963 15:10:23 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1014 00:05:22.222 true 00:05:22.222 15:10:24 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:22.222 15:10:24 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:23.162 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:23.162 15:10:24 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:23.162 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:23.162 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:23.421 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1015 00:05:23.421 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1015 00:05:23.421 true 00:05:23.421 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:23.421 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:23.680 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- 
target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:23.940 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1016 00:05:23.940 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1016 00:05:24.199 true 00:05:24.199 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:24.199 15:10:25 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:25.139 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.139 15:10:26 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:25.139 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.139 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.139 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.139 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.398 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.398 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.398 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.398 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:25.398 15:10:27 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1017 00:05:25.398 15:10:27 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1017 00:05:25.657 true 00:05:25.657 15:10:27 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:25.657 15:10:27 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 15:10:28 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:26.596 15:10:28 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # 
null_size=1018 00:05:26.596 15:10:28 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1018 00:05:26.855 true 00:05:26.855 15:10:28 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:26.855 15:10:28 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:27.530 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.530 15:10:29 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:27.530 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.530 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.790 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.790 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.790 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.790 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.790 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:27.790 15:10:29 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1019 00:05:27.790 15:10:29 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1019 00:05:28.049 true 00:05:28.049 15:10:29 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:28.049 15:10:29 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 15:10:30 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:28.987 15:10:30 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1020 00:05:28.987 15:10:30 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1020 00:05:29.246 true 00:05:29.246 15:10:30 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:29.246 15:10:30 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 15:10:31 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:30.186 15:10:31 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1021 00:05:30.186 15:10:31 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1021 00:05:30.445 true 00:05:30.445 15:10:32 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:30.445 15:10:32 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.382 15:10:33 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.382 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.641 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.641 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:31.641 15:10:33 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1022 00:05:31.641 15:10:33 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1022 00:05:31.641 true 00:05:31.900 15:10:33 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:31.900 15:10:33 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:32.468 Message 
suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.468 15:10:34 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:32.727 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.727 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.727 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.727 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.728 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.728 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.728 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:32.728 15:10:34 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1023 00:05:32.728 15:10:34 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1023 00:05:32.986 true 00:05:32.986 15:10:34 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:32.986 15:10:34 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 15:10:35 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 Message suppressed 999 times: Read completed with error (sct=0, sc=11) 00:05:33.921 15:10:35 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1024 00:05:33.921 15:10:35 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1024 00:05:34.179 true 00:05:34.179 15:10:35 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:34.179 15:10:35 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:35.115 15:10:36 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:35.115 15:10:36 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1025 00:05:35.115 15:10:36 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1025 00:05:35.373 true 00:05:35.373 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:35.373 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:35.632 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:35.891 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1026 00:05:35.891 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1026 00:05:35.891 true 00:05:35.891 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:35.891 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:36.149 15:10:37 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:36.408 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1027 00:05:36.408 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1027 00:05:36.666 true 00:05:36.666 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:36.666 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:36.925 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:36.926 Initializing NVMe Controllers 00:05:36.926 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:05:36.926 Controller IO queue size 128, less than required. 00:05:36.926 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:05:36.926 Controller IO queue size 128, less than required. 00:05:36.926 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:05:36.926 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0 00:05:36.926 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 with lcore 0 00:05:36.926 Initialization complete. Launching workers. 
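The xtrace entries up to this point show the single-namespace phase of ns_hotplug_stress.sh: while a background I/O generator (PID 1663899) keeps reading, hence the repeated "Message suppressed 999 times: Read completed with error (sct=0, sc=11)" lines, the script hot-removes and re-adds namespace 1 on nqn.2016-06.io.spdk:cnode1 and grows the NULL1 bdev by one unit per pass (null_size 1013 through 1028 in this excerpt). A minimal sketch of that loop, reconstructed only from the traced line tags @44-@50 and @53; the variable names rpc_py and perf_pid and the starting null_size value are placeholders, not taken from the script itself:

    rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py   # RPC client path as it appears in the trace
    perf_pid=1663899                                                      # background I/O generator PID from the trace
    null_size=1012                                                        # placeholder start; the excerpt shows 1013..1028
    while kill -0 "$perf_pid"; do                                         # @44: loop until the I/O generator exits
        "$rpc_py" nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1   # @45: hot-remove namespace 1
        "$rpc_py" nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 # @46: hot-add it back, backed by Delay0
        ((++null_size))                                                   # @49: 1013, 1014, ...
        "$rpc_py" bdev_null_resize NULL1 "$null_size"                     # @50: grow the null bdev while I/O is running
    done
    wait "$perf_pid"                                                      # @53: collect the generator once it is gone

When kill -0 finally fails ("No such process" below), the initiator prints the summary that follows; its columns are IOPS, MiB/s and average/min/max latency in microseconds for each attached namespace.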
00:05:36.926 ======================================================== 00:05:36.926 Latency(us) 00:05:36.926 Device Information : IOPS MiB/s Average min max 00:05:36.926 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 6182.33 3.02 18037.79 821.38 1137812.45 00:05:36.926 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 from core 0: 33292.63 16.26 3844.62 2312.91 293916.66 00:05:36.926 ======================================================== 00:05:36.926 Total : 39474.96 19.27 6067.47 821.38 1137812.45 00:05:36.926 00:05:36.926 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@49 -- # null_size=1028 00:05:36.926 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_resize NULL1 1028 00:05:37.184 true 00:05:37.184 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@44 -- # kill -0 1663899 00:05:37.185 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/ns_hotplug_stress.sh: line 44: kill: (1663899) - No such process 00:05:37.185 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@53 -- # wait 1663899 00:05:37.185 15:10:38 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:37.444 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@55 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:37.703 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@58 -- # nthreads=8 00:05:37.703 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@58 -- # pids=() 00:05:37.703 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i = 0 )) 00:05:37.703 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:37.703 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null0 100 4096 00:05:37.703 null0 00:05:37.962 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:37.962 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:37.962 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null1 100 4096 00:05:37.962 null1 00:05:37.962 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:37.962 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:37.962 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null2 100 4096 00:05:38.221 null2 00:05:38.221 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:38.221 15:10:39 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:38.221 15:10:39 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null3 100 4096 00:05:38.479 null3 00:05:38.479 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:38.479 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:38.479 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null4 100 4096 00:05:38.738 null4 00:05:38.738 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:38.738 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:38.738 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null5 100 4096 00:05:38.738 null5 00:05:38.998 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:38.998 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:38.998 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null6 100 4096 00:05:38.998 null6 00:05:38.998 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:38.998 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:38.998 15:10:40 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_null_create null7 100 4096 00:05:39.257 null7 00:05:39.257 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@59 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 
00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 1 null0 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=1 bdev=null0 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 2 null1 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=2 bdev=null1 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 3 null2 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=3 bdev=null2 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 
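From here the trace switches to the parallel phase: namespaces 1 and 2 are dropped from the subsystem, eight 100 MiB null bdevs with 4096-byte blocks (null0 through null7) are created, and one background add_remove worker is started per bdev (the pids+=($!) entries). The per-worker routine, as traced at line tags @14-@18, adds and removes its own namespace ten times; the sketch below is reconstructed from those tags only and reuses the rpc_py placeholder from the sketch above:

    add_remove() {                                                        # traced at @14-@18
        local nsid=$1 bdev=$2                                             # e.g. nsid=1 bdev=null0
        for ((i = 0; i < 10; i++)); do                                    # @16: ten add/remove passes
            "$rpc_py" nvmf_subsystem_add_ns -n "$nsid" nqn.2016-06.io.spdk:cnode1 "$bdev"   # @17
            "$rpc_py" nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 "$nsid"           # @18
        done
    }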
00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 4 null3 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=4 bdev=null3 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 5 null4 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=5 bdev=null4 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 6 null5 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=6 bdev=null5 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 
00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 7 null6 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=7 bdev=null6 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@64 -- # pids+=($!) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( ++i )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@62 -- # (( i < nthreads )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@66 -- # wait 1668638 1668639 1668641 1668643 1668645 1668647 1668648 1668650 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@63 -- # add_remove 8 null7 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@14 -- # local nsid=8 bdev=null7 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i = 0 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.258 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 
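The "wait 1668638 1668639 ..." entry above is the launcher blocking on all eight worker PIDs. Put together with the bdev_null_create and add_remove calls already traced, the launch sequence at line tags @58-@66 looks roughly like the sketch below; the loop form and quoting are assumptions, only the commands, names and counts come from the trace, and rpc_py/add_remove refer to the earlier sketches:

    nthreads=8                                                            # @58
    pids=()                                                               # @58
    for ((i = 0; i < nthreads; i++)); do                                  # @59-@60: create null0..null7
        "$rpc_py" bdev_null_create "null$i" 100 4096                      # 100 MiB null bdev, 4096-byte blocks
    done
    for ((i = 0; i < nthreads; i++)); do                                  # @62-@64: one worker per namespace ID
        add_remove "$((i + 1))" "null$i" &                                # @63: add_remove 1 null0 ... add_remove 8 null7
        pids+=($!)                                                        # @64
    done
    wait "${pids[@]}"                                                     # @66: block until every worker finishes

With eight workers racing, the interleaved nvmf_subsystem_add_ns/nvmf_subsystem_remove_ns entries that follow are expected; the point of this phase is concurrent namespace hotplug against the same subsystem.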
00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:39.520 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 
15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:39.779 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:39.780 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:39.780 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:40.038 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:40.038 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- 
target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.039 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.298 15:10:41 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:40.298 15:10:42 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:40.298 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- 
target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:40.557 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:40.816 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- 
target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.076 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:41.337 15:10:42 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.337 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:41.596 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.596 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.596 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:41.596 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.596 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.596 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:41.597 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:41.856 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:42.115 15:10:43 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress 
-- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:42.374 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 
nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( 
++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:42.633 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress 
-- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.037 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 3 nqn.2016-06.io.spdk:cnode1 null2 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 1 nqn.2016-06.io.spdk:cnode1 null0 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 8 nqn.2016-06.io.spdk:cnode1 null7 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 7 nqn.2016-06.io.spdk:cnode1 null6 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 2 nqn.2016-06.io.spdk:cnode1 null1 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 6 nqn.2016-06.io.spdk:cnode1 null5 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 5 nqn.2016-06.io.spdk:cnode1 null4 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.296 15:10:44 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.296 15:10:44 
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@17 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns -n 4 nqn.2016-06.io.spdk:cnode1 null3 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 3 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 6 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 8 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 4 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 7 00:05:43.296 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 
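The trace above is the churn that ns_hotplug_stress.sh drives: lines 16-18 of the script loop ten times, attaching the null bdevs null0..null7 to nqn.2016-06.io.spdk:cnode1 as namespaces 1-8 in a shuffled order and then detaching them again. A minimal sketch of that loop, reconstructed from the rpc.py calls logged here rather than copied from the SPDK script itself (the real script may order and batch the calls differently):

    #!/usr/bin/env bash
    # Reconstruction of the add/remove churn seen in the xtrace above.
    rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
    nqn=nqn.2016-06.io.spdk:cnode1

    for (( i = 0; i < 10; i++ )); do
        # Attach null0..null7 as namespaces 1..8, in random order.
        for n in $(shuf -i 1-8); do
            "$rpc" nvmf_subsystem_add_ns -n "$n" "$nqn" "null$((n - 1))"
        done
        # Detach them again, also in random order.
        for n in $(shuf -i 1-8); do
            "$rpc" nvmf_subsystem_remove_ns "$nqn" "$n"
        done
    done
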
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( ++i )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@16 -- # (( i < 10 )) 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@68 -- # trap - SIGINT SIGTERM EXIT 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- target/ns_hotplug_stress.sh@70 -- # nvmftestfini 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@331 -- # nvmfcleanup 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@99 -- # sync 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@102 -- # set +e 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@103 -- # for i in {1..20} 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:05:43.556 rmmod nvme_rdma 00:05:43.556 rmmod nvme_fabrics 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@106 -- # set -e 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@107 -- # return 0 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@332 -- # '[' -n 1663434 ']' 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@333 -- # killprocess 1663434 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@950 -- # '[' -z 1663434 ']' 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@954 -- # kill -0 1663434 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@955 -- # uname 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:05:43.556 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1663434 00:05:43.816 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:05:43.816 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:05:43.816 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1663434' 00:05:43.816 killing process with pid 1663434 00:05:43.816 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@969 -- # kill 1663434 00:05:43.816 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@974 -- # wait 1663434 00:05:44.075 15:10:45 
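When the loop exits, the harness clears its SIGINT/SIGTERM/EXIT trap and runs nvmftestfini: the nvme-rdma and nvme-fabrics initiator modules are unloaded inside a retry loop (they can still be busy while the last connections drain), and the target application started at the beginning of the test, pid 1663434 in this run, is killed and reaped. A hedged sketch of that shutdown using the commands visible in the trace; the retry/sleep details are illustrative, not the exact common.sh code:

    # Unload the kernel initiator modules; retry because they may briefly
    # still be in use while the last queue pairs tear down.
    set +e
    for i in {1..20}; do
        modprobe -v -r nvme-rdma && modprobe -v -r nvme-fabrics && break
        sleep 1
    done
    set -e

    # Stop the target application this test launched earlier.
    pid=1663434                      # pid recorded for this particular run
    if kill -0 "$pid" 2> /dev/null; then
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                  # reaping works because the harness started it
    fi
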
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:05:44.075 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@338 -- # nvmf_fini 00:05:44.075 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@264 -- # local dev 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@267 -- # remove_target_ns 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@22 -- # _remove_target_ns 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@268 -- # delete_main_bridge 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@130 -- # return 0 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@41 -- # _dev=0 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@41 -- # dev_map=() 00:05:44.076 15:10:45 
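With the target gone, nvmf_fini removes the addresses it configured on the interfaces it used. On this phy run there is no nvmf_br bridge to delete (the /sys/class/net/nvmf_br/address check fails), so after the namespace cleanup the work reduces to flushing the two Mellanox ports and resetting the dev_map bookkeeping. A minimal sketch of that flush, assuming the same device names as this job:

    # Drop the test addresses from the physical ports used for this run.
    for dev in mlx_0_1 mlx_0_0; do
        # Only touch interfaces that actually exist on this host.
        [[ -e /sys/class/net/$dev/address ]] || continue
        ip addr flush dev "$dev"
    done
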
nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/setup.sh@284 -- # iptr 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@538 -- # iptables-restore 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@538 -- # iptables-save 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:05:44.076 00:05:44.076 real 0m49.507s 00:05:44.076 user 3m24.117s 00:05:44.076 sys 0m14.688s 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_ns_hotplug_stress -- common/autotest_common.sh@10 -- # set +x 00:05:44.076 ************************************ 00:05:44.076 END TEST nvmf_ns_hotplug_stress 00:05:44.076 ************************************ 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@18 -- # run_test nvmf_delete_subsystem /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/delete_subsystem.sh --transport=rdma 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:05:44.076 ************************************ 00:05:44.076 START TEST nvmf_delete_subsystem 00:05:44.076 ************************************ 00:05:44.076 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/delete_subsystem.sh --transport=rdma 00:05:44.335 * Looking for test storage... 
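Before the next test takes over, the last step of nvmf_fini (the iptr call in the trace above) removes only the firewall rules the framework tagged for itself: the current ruleset is dumped, every SPDK_NVMF line is dropped, and the remainder is loaded back, leaving unrelated rules in place. That is exactly the pipeline the trace shows:

    # Strip only SPDK-tagged rules; everything else survives the round trip.
    iptables-save | grep -v SPDK_NVMF | iptables-restore
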
00:05:44.335 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:05:44.335 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:44.335 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1681 -- # lcov --version 00:05:44.335 15:10:45 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@336 -- # IFS=.-: 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@336 -- # read -ra ver1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@337 -- # IFS=.-: 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@337 -- # read -ra ver2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@338 -- # local 'op=<' 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@340 -- # ver1_l=2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@341 -- # ver2_l=1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@344 -- # case "$op" in 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@345 -- # : 1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@365 -- # decimal 1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@353 -- # local d=1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@355 -- # echo 1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@365 -- # ver1[v]=1 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@366 -- # decimal 2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@353 -- # local d=2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@355 -- # echo 2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@366 -- # ver2[v]=2 00:05:44.335 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@368 -- # return 0 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:44.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.336 --rc genhtml_branch_coverage=1 00:05:44.336 --rc genhtml_function_coverage=1 00:05:44.336 --rc genhtml_legend=1 00:05:44.336 --rc geninfo_all_blocks=1 00:05:44.336 --rc geninfo_unexecuted_blocks=1 00:05:44.336 00:05:44.336 ' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:05:44.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.336 --rc genhtml_branch_coverage=1 00:05:44.336 --rc genhtml_function_coverage=1 00:05:44.336 --rc genhtml_legend=1 00:05:44.336 --rc geninfo_all_blocks=1 00:05:44.336 --rc geninfo_unexecuted_blocks=1 00:05:44.336 00:05:44.336 ' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:05:44.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.336 --rc genhtml_branch_coverage=1 00:05:44.336 --rc genhtml_function_coverage=1 00:05:44.336 --rc genhtml_legend=1 00:05:44.336 --rc geninfo_all_blocks=1 00:05:44.336 --rc geninfo_unexecuted_blocks=1 00:05:44.336 00:05:44.336 ' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:05:44.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.336 --rc genhtml_branch_coverage=1 00:05:44.336 --rc genhtml_function_coverage=1 00:05:44.336 --rc genhtml_legend=1 00:05:44.336 --rc geninfo_all_blocks=1 00:05:44.336 --rc geninfo_unexecuted_blocks=1 00:05:44.336 00:05:44.336 ' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@10 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@7 -- # uname -s 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@15 -- # shopt -s extglob 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- paths/export.sh@5 -- # export PATH 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@50 -- # : 0 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:05:44.336 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: 
[: : integer expression expected 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@54 -- # have_pci_nics=0 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@12 -- # nvmftestinit 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@292 -- # prepare_net_devs 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@254 -- # local -g is_hw=no 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@256 -- # remove_target_ns 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@22 -- # _remove_target_ns 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@125 -- # xtrace_disable 00:05:44.336 15:10:46 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@131 -- # pci_devs=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@131 -- # local -a pci_devs 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@132 -- # pci_net_devs=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@133 -- # pci_drivers=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@133 -- # local -A pci_drivers 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@135 -- # net_devs=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@135 -- # local -ga net_devs 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@136 -- # e810=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@136 -- # local -ga e810 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@137 -- # x722=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@137 -- # local -ga x722 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- 
nvmf/common.sh@138 -- # mlx=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@138 -- # local -ga mlx 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:05:52.465 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- 
nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:05:52.465 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:05:52.465 Found net devices under 0000:18:00.0: mlx_0_0 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:05:52.465 Found net devices under 0000:18:00.1: mlx_0_1 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:05:52.465 
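Each Mellanox PCI function the script accepts (0000:18:00.0 and 0000:18:00.1, vendor 0x15b3, device 0x1015) is mapped to its kernel interface by listing the net/ directory of that device in sysfs, which is how the log arrives at mlx_0_0 and mlx_0_1. A small sketch of that lookup with the two addresses from this run hard-coded:

    # Resolve a PCI function to its netdev name(s) via sysfs.
    for pci in 0000:18:00.0 0000:18:00.1; do
        pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*)   # e.g. .../net/mlx_0_0
        pci_net_devs=("${pci_net_devs[@]##*/}")            # keep only the leaf names
        echo "Found net devices under $pci: ${pci_net_devs[*]}"
    done
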
15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@249 -- # get_rdma_if_list 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@75 -- # rdma_devs=() 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@89 -- # continue 2 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@89 -- # continue 2 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@258 -- # is_hw=yes 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@61 -- # uname 
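get_rdma_if_list then keeps only the RDMA-capable interfaces by intersecting the net_devs array discovered above with what rxe_cfg reports. A rough sketch of that filter, using the helper path from this workspace:

  mapfile -t rxe_net_devs < <(/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net)
  rdma_devs=()
  for net_dev in "${net_devs[@]}"; do
      for rxe_net_dev in "${rxe_net_devs[@]}"; do
          # keep the device as soon as it shows up in the rxe list
          [[ $net_dev == "$rxe_net_dev" ]] && { rdma_devs+=("$net_dev"); continue 2; }
      done
  done
  net_devs=("${rdma_devs[@]}")   # here: mlx_0_0 and mlx_0_1
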
00:05:52.465 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@65 -- # modprobe ib_cm 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@66 -- # modprobe ib_core 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@67 -- # modprobe ib_umad 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@69 -- # modprobe iw_cm 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@27 -- # local -gA dev_map 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@28 -- # local -g _dev 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@44 -- # ips=() 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@58 -- # key_initiator=target1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:05:52.466 15:10:52 
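load_ib_rdma_modules only runs on Linux (the uname check above) and simply loads the kernel IB/RDMA stack before any addresses are configured; the modules pulled in for this run were:

  modprobe ib_cm
  modprobe ib_core
  modprobe ib_umad
  modprobe ib_uverbs
  modprobe iw_cm
  modprobe rdma_cm
  modprobe rdma_ucm
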
nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@11 -- # local val=167772161 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:05:52.466 10.0.0.1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@11 -- # local val=167772162 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:05:52.466 10.0.0.2 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:05:52.466 15:10:52 
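setup_interface_pair hands out addresses from the integer pool 0x0a000001, and val_to_ip converts them back to dotted quads: 167772161 is 0x0A000001, i.e. 10.0.0.1 for mlx_0_0, and 167772162 becomes 10.0.0.2 for mlx_0_1. A sketch of that conversion; the bit-shifting shown here illustrates the idea and is not necessarily the script's exact code:

  val_to_ip() {
      local val=$1
      printf '%u.%u.%u.%u\n' \
          $(( (val >> 24) & 255 )) $(( (val >> 16) & 255 )) \
          $(( (val >> 8) & 255 ))  $(( val & 255 ))
  }
  val_to_ip 167772161   # -> 10.0.0.1, assigned to mlx_0_0 above
  val_to_ip 167772162   # -> 10.0.0.2, assigned to mlx_0_1
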
nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@38 -- # ping_ips 1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@107 -- # local dev=target0 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:52.466 15:10:52 
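Note that set_ip recorded each address twice: once for real with ip addr add, and once in the device's ifalias file. The records that follow bring the links up, re-read the address from ifalias, and ping it; for target0 (mlx_0_1 in this run) that amounts to:

  ip link set mlx_0_1 up
  ip=$(cat /sys/class/net/mlx_0_1/ifalias)   # -> 10.0.0.2
  ping -c 1 "$ip"                            # one-packet reachability check
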
nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:52.466 15:10:52 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:05:52.466 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:05:52.466 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.031 ms 00:05:52.466 00:05:52.466 --- 10.0.0.2 ping statistics --- 00:05:52.466 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:05:52.466 rtt min/avg/max/mdev = 0.031/0.031/0.031/0.000 ms 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:52.466 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@107 -- # local dev=target0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- 
nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:05:52.467 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:05:52.467 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.030 ms 00:05:52.467 00:05:52.467 --- 10.0.0.2 ping statistics --- 00:05:52.467 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:05:52.467 rtt min/avg/max/mdev = 0.030/0.030/0.030/0.000 ms 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@98 -- # (( pair++ )) 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@266 -- # return 0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@107 -- # local dev=target0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # 
ip=10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # get_net_dev target1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@107 -- # local dev=target1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@107 -- # local dev=target0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- 
nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # get_net_dev target1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@107 -- # local dev=target1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:05:52.467 15:10:53 
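nvmf_legacy_env collapses the dev_map/ifalias bookkeeping into the environment variables the rest of the suite consumes; for this run the net result is:

  NVMF_TARGET_INTERFACE=mlx_0_1
  NVMF_TARGET_INTERFACE2=mlx_0_0
  NVMF_FIRST_INITIATOR_IP=10.0.0.2
  NVMF_SECOND_INITIATOR_IP=10.0.0.1
  NVMF_FIRST_TARGET_IP=10.0.0.2
  NVMF_SECOND_TARGET_IP=10.0.0.1
  NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024'

With RDMA the initiator and target roles share the same two physical ports (the key_initiator=target1 swap above), which is why the first initiator and first target both resolve to 10.0.0.2 here.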
nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@13 -- # nvmfappstart -m 0x3 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@724 -- # xtrace_disable 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@324 -- # nvmfpid=1672352 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@325 -- # waitforlisten 1672352 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x3 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@831 -- # '[' -z 1672352 ']' 00:05:52.467 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:52.468 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@836 -- # local max_retries=100 00:05:52.468 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:52.468 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:52.468 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@840 -- # xtrace_disable 00:05:52.468 15:10:53 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 [2024-09-27 15:10:53.209962] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:05:52.468 [2024-09-27 15:10:53.210029] [ DPDK EAL parameters: nvmf -c 0x3 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:05:52.468 [2024-09-27 15:10:53.298124] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:52.468 [2024-09-27 15:10:53.384681] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:05:52.468 [2024-09-27 15:10:53.384726] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:05:52.468 [2024-09-27 15:10:53.384735] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:52.468 [2024-09-27 15:10:53.384743] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:52.468 [2024-09-27 15:10:53.384750] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
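nvmfappstart amounts to launching the target binary in the background and waiting for its RPC socket to come up; for this run that is roughly:

  /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x3 &
  nvmfpid=$!                 # 1672352 in this run
  waitforlisten "$nvmfpid"   # blocks until the app answers on /var/tmp/spdk.sock
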
00:05:52.468 [2024-09-27 15:10:53.384892] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.468 [2024-09-27 15:10:53.384892] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@864 -- # return 0 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@730 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@15 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 [2024-09-27 15:10:54.133448] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x16fbda0/0x1700290) succeed. 00:05:52.468 [2024-09-27 15:10:54.142313] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x16fd2a0/0x1741930) succeed. 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@16 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@17 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 [2024-09-27 15:10:54.242865] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@18 -- # rpc_cmd bdev_null_create NULL1 1000 512 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 NULL1 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@23 -- # rpc_cmd bdev_delay_create -b NULL1 -d Delay0 -r 1000000 -t 1000000 -w 1000000 -n 1000000 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 Delay0 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@24 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@28 -- # perf_pid=1672489 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@30 -- # sleep 2 00:05:52.468 15:10:54 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@26 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -c 0xC -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -t 5 -q 128 -w randrw -M 70 -o 512 -P 4 00:05:52.727 [2024-09-27 15:10:54.368829] subsystem.c:1641:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on RDMA/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 
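Everything the test needs is now configured over the RPC socket: an RDMA transport, subsystem cnode1 with one listener, and a namespace backed by a delay bdev that adds 1,000,000 us (1 s) of latency to every read and write, which keeps a full queue of I/O in flight for the deletion that follows. Assuming the standard scripts/rpc.py client, the equivalent calls are:

  scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10
  scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
  scripts/rpc.py bdev_null_create NULL1 1000 512
  scripts/rpc.py bdev_delay_create -b NULL1 -d Delay0 -r 1000000 -t 1000000 -w 1000000 -n 1000000
  scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0

The spdk_nvme_perf job just launched runs on cores 2-3 (-c 0xC) at queue depth 128 for 5 seconds against that listener, and the test immediately moves on to deleting the subsystem underneath it.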
00:05:54.631 15:10:56 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@32 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:05:54.631 15:10:56 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:54.631 15:10:56 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:55.565 NVMe io qpair process completion error 00:05:55.565 NVMe io qpair process completion error 00:05:55.824 NVMe io qpair process completion error 00:05:55.824 NVMe io qpair process completion error 00:05:55.824 NVMe io qpair process completion error 00:05:55.824 NVMe io qpair process completion error 00:05:55.824 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:55.824 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@34 -- # delay=0 00:05:55.824 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@35 -- # kill -0 1672489 00:05:55.824 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@36 -- # sleep 0.5 00:05:56.391 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@38 -- # (( delay++ > 30 )) 00:05:56.391 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@35 -- # kill -0 1672489 00:05:56.391 15:10:57 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@36 -- # sleep 0.5 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: 
-6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed 
with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Write completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.651 Read completed with error (sct=0, sc=8) 00:05:56.651 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read 
completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 starting I/O failed: -6 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read 
completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 
00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Write completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Read completed with error (sct=0, sc=8) 00:05:56.652 Initializing NVMe Controllers 00:05:56.652 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:05:56.652 Controller IO queue size 128, less than required. 00:05:56.652 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:05:56.652 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 2 00:05:56.652 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 3 00:05:56.652 Initialization complete. Launching workers. 00:05:56.652 ======================================================== 00:05:56.652 Latency(us) 00:05:56.652 Device Information : IOPS MiB/s Average min max 00:05:56.652 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 2: 80.59 0.04 1592275.64 1000163.55 2970542.39 00:05:56.652 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 3: 80.59 0.04 1593698.54 1001197.92 2972006.27 00:05:56.652 ======================================================== 00:05:56.652 Total : 161.17 0.08 1592987.09 1000163.55 2972006.27 00:05:56.652 00:05:56.652 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@38 -- # (( delay++ > 30 )) 00:05:56.652 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@35 -- # kill -0 1672489 00:05:56.652 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@36 -- # sleep 0.5 00:05:56.653 [2024-09-27 15:10:58.473807] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:05:56.653 [2024-09-27 15:10:58.473851] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state. 
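The flood of "completed with error (sct=0, sc=8)" records above is the expected outcome: nvmf_delete_subsystem was issued while perf still had a full queue outstanding, so every in-flight request fails back to the initiator and its qpairs drop with CQ transport error -6. The script then just polls for the perf process to disappear, roughly:

  rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1
  delay=0
  while kill -0 "$perf_pid" 2>/dev/null; do
      (( delay++ > 30 )) && break   # give up after ~15 s of 0.5 s sleeps
      sleep 0.5
  done

The summary table is also consistent with the 1 s delay bdev: the ~1,000,164 us minimum is essentially the injected delay, and with queue depth 128 at ~80.6 IOPS per core, Little's law gives an average time in system of 128 / 80.6 ≈ 1.59 s, matching the reported ~1,592,000 us average latency.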
00:05:56.653 /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf: errors occurred 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@38 -- # (( delay++ > 30 )) 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@35 -- # kill -0 1672489 00:05:57.221 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/delete_subsystem.sh: line 35: kill: (1672489) - No such process 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@45 -- # NOT wait 1672489 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@650 -- # local es=0 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@652 -- # valid_exec_arg wait 1672489 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@638 -- # local arg=wait 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@642 -- # type -t wait 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@653 -- # wait 1672489 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@653 -- # es=1 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@48 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@49 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:57.221 [2024-09-27 15:10:58.994442] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@50 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Delay0 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@561 -- # 
xtrace_disable 00:05:57.221 15:10:58 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:05:57.221 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:57.221 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@54 -- # perf_pid=1673210 00:05:57.221 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@56 -- # delay=0 00:05:57.221 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@52 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -c 0xC -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -t 3 -q 128 -w randrw -M 70 -o 512 -P 4 00:05:57.221 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:05:57.221 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:05:57.480 [2024-09-27 15:10:59.102018] subsystem.c:1641:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on RDMA/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 00:05:57.738 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:05:57.738 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:05:57.738 15:10:59 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:05:58.305 15:11:00 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:05:58.305 15:11:00 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:05:58.306 15:11:00 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:05:58.873 15:11:00 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:05:58.873 15:11:00 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:05:58.873 15:11:00 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:05:59.441 15:11:01 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:05:59.441 15:11:01 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:05:59.441 15:11:01 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:05:59.700 15:11:01 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:05:59.700 15:11:01 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:05:59.700 15:11:01 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:00.268 15:11:02 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:00.268 15:11:02 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:00.268 15:11:02 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- 
# sleep 0.5 00:06:00.835 15:11:02 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:00.835 15:11:02 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:00.835 15:11:02 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:01.403 15:11:03 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:01.403 15:11:03 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:01.403 15:11:03 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:01.970 15:11:03 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:01.970 15:11:03 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:01.970 15:11:03 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:02.229 15:11:04 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:02.229 15:11:04 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:02.229 15:11:04 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:02.796 15:11:04 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:02.796 15:11:04 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:02.797 15:11:04 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:03.362 15:11:05 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:03.362 15:11:05 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:03.362 15:11:05 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:03.929 15:11:05 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:03.929 15:11:05 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:03.929 15:11:05 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:04.497 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:04.497 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:04.497 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@58 -- # sleep 0.5 00:06:04.497 Initializing NVMe Controllers 00:06:04.497 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:06:04.497 Controller IO queue size 128, less than required. 00:06:04.497 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:06:04.497 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 2 00:06:04.497 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 3 00:06:04.497 Initialization complete. 
Launching workers. 00:06:04.497 ======================================================== 00:06:04.497 Latency(us) 00:06:04.497 Device Information : IOPS MiB/s Average min max 00:06:04.497 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 2: 128.00 0.06 1001715.10 1000051.32 1004903.18 00:06:04.497 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 3: 128.00 0.06 1002873.18 1000584.00 1007326.76 00:06:04.497 ======================================================== 00:06:04.497 Total : 256.00 0.12 1002294.14 1000051.32 1007326.76 00:06:04.497 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@60 -- # (( delay++ > 20 )) 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@57 -- # kill -0 1673210 00:06:04.756 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/delete_subsystem.sh: line 57: kill: (1673210) - No such process 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@67 -- # wait 1673210 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- target/delete_subsystem.sh@71 -- # nvmftestfini 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@331 -- # nvmfcleanup 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@99 -- # sync 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@102 -- # set +e 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@103 -- # for i in {1..20} 00:06:04.756 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:06:05.016 rmmod nvme_rdma 00:06:05.016 rmmod nvme_fabrics 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@106 -- # set -e 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@107 -- # return 0 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@332 -- # '[' -n 1672352 ']' 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@333 -- # killprocess 1672352 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@950 -- # '[' -z 1672352 ']' 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@954 -- # kill -0 1672352 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@955 -- # uname 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1672352 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1672352' 00:06:05.016 killing process with pid 1672352 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@969 -- # kill 1672352 00:06:05.016 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@974 -- # wait 1672352 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@338 -- # nvmf_fini 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@264 -- # local dev 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@267 -- # remove_target_ns 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@22 -- # _remove_target_ns 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@268 -- # delete_main_bridge 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@130 -- # return 0 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:06:05.276 15:11:06 
nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:06:05.276 15:11:06 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@41 -- # _dev=0 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@41 -- # dev_map=() 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/setup.sh@284 -- # iptr 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@538 -- # iptables-save 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- nvmf/common.sh@538 -- # iptables-restore 00:06:05.276 00:06:05.276 real 0m21.162s 00:06:05.276 user 0m50.609s 00:06:05.276 sys 0m6.700s 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_delete_subsystem -- common/autotest_common.sh@10 -- # set +x 00:06:05.276 ************************************ 00:06:05.276 END TEST nvmf_delete_subsystem 00:06:05.276 ************************************ 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@21 -- # run_test nvmf_host_management /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/host_management.sh --transport=rdma 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:06:05.276 ************************************ 00:06:05.276 START TEST nvmf_host_management 00:06:05.276 ************************************ 00:06:05.276 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/host_management.sh --transport=rdma 00:06:05.536 * Looking for test storage... 
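The delete_subsystem suite closes out in roughly 21 seconds of wall time, and autotest immediately starts the next suite through the same run_test wrapper, which times the child script and prints the START TEST / END TEST banners visible in the log. A rough Bash sketch of that wrapper pattern, reconstructed only from the banners and timing output above rather than copied from autotest_common.sh:

    # run_test NAME CMD [ARGS...] -- banner, run and time the suite, banner again.
    run_test() {
        local name=$1; shift
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"                 # e.g. host_management.sh --transport=rdma
        local rc=$?
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
        return $rc
    }
    # Usage: run_test nvmf_host_management ./host_management.sh --transport=rdma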
00:06:05.536 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1681 -- # lcov --version 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@336 -- # IFS=.-: 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@336 -- # read -ra ver1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@337 -- # IFS=.-: 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@337 -- # read -ra ver2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@338 -- # local 'op=<' 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@340 -- # ver1_l=2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@341 -- # ver2_l=1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@344 -- # case "$op" in 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@345 -- # : 1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@365 -- # decimal 1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@353 -- # local d=1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@355 -- # echo 1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@365 -- # ver1[v]=1 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@366 -- # decimal 2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@353 -- # local d=2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@355 -- # echo 2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@366 -- # ver2[v]=2 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@368 -- # return 0 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:05.536 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.536 --rc genhtml_branch_coverage=1 00:06:05.536 --rc genhtml_function_coverage=1 00:06:05.536 --rc genhtml_legend=1 00:06:05.536 --rc geninfo_all_blocks=1 00:06:05.536 --rc geninfo_unexecuted_blocks=1 00:06:05.536 00:06:05.536 ' 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:05.536 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.536 --rc genhtml_branch_coverage=1 00:06:05.536 --rc genhtml_function_coverage=1 00:06:05.536 --rc genhtml_legend=1 00:06:05.536 --rc geninfo_all_blocks=1 00:06:05.536 --rc geninfo_unexecuted_blocks=1 00:06:05.536 00:06:05.536 ' 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:05.536 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.536 --rc genhtml_branch_coverage=1 00:06:05.536 --rc genhtml_function_coverage=1 00:06:05.536 --rc genhtml_legend=1 00:06:05.536 --rc geninfo_all_blocks=1 00:06:05.536 --rc geninfo_unexecuted_blocks=1 00:06:05.536 00:06:05.536 ' 00:06:05.536 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:05.536 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:05.536 --rc genhtml_branch_coverage=1 00:06:05.536 --rc genhtml_function_coverage=1 00:06:05.536 --rc genhtml_legend=1 00:06:05.536 --rc geninfo_all_blocks=1 00:06:05.536 --rc geninfo_unexecuted_blocks=1 00:06:05.536 00:06:05.536 ' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@9 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@7 -- # uname -s 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@15 -- # shopt -s extglob 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- paths/export.sh@5 -- # export PATH 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@50 -- # : 0 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:06:05.537 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer 
expression expected 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@54 -- # have_pci_nics=0 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@11 -- # MALLOC_BDEV_SIZE=64 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@105 -- # nvmftestinit 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@292 -- # prepare_net_devs 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@254 -- # local -g is_hw=no 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@256 -- # remove_target_ns 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@22 -- # _remove_target_ns 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@125 -- # xtrace_disable 00:06:05.537 15:11:07 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@131 -- # pci_devs=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@131 -- # local -a pci_devs 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@132 -- # pci_net_devs=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@133 -- # pci_drivers=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@133 -- # local -A pci_drivers 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@135 -- # net_devs=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@135 -- # local -ga net_devs 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@136 -- # e810=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@136 -- # local -ga e810 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- 
nvmf/common.sh@137 -- # x722=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@137 -- # local -ga x722 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@138 -- # mlx=() 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@138 -- # local -ga mlx 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:06:13.659 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 
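The block above is the NIC-discovery half of nvmftestinit: the pci_devs list is narrowed to the Mellanox (mlx) IDs because the mlx5 match succeeded, and each 0x15b3 function is checked against the known device IDs (0x1015 here is a ConnectX-4 Lx) before its kernel net device is read out of sysfs, producing the "Found net devices under ..." messages that follow. A small illustrative sketch of that sysfs lookup, standalone rather than the common.sh implementation:

    # List kernel netdevs behind Mellanox 0x15b3:0x1015 PCI functions,
    # mirroring the "Found net devices under ..." lines in this log.
    for pci in /sys/bus/pci/devices/*; do
        [[ $(cat "$pci/vendor") == 0x15b3 ]] || continue
        [[ $(cat "$pci/device") == 0x1015 ]] || continue
        for net in "$pci"/net/*; do
            [[ -e $net ]] && echo "Found net devices under ${pci##*/}: ${net##*/}"
        done
    done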
00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:06:13.659 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:06:13.659 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:06:13.659 Found net devices under 0000:18:00.0: mlx_0_0 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:06:13.660 Found net devices under 0000:18:00.1: mlx_0_1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- 
nvmf/common.sh@244 -- # (( 2 == 0 )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@249 -- # get_rdma_if_list 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@75 -- # rdma_devs=() 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@89 -- # continue 2 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@89 -- # continue 2 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@258 -- # is_hw=yes 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- 
nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@61 -- # uname 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@65 -- # modprobe ib_cm 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@66 -- # modprobe ib_core 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@67 -- # modprobe ib_umad 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@69 -- # modprobe iw_cm 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@27 -- # local -gA dev_map 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@28 -- # local -g _dev 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@44 -- # ips=() 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@58 -- # key_initiator=target1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:06:13.660 15:11:14 
nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@11 -- # local val=167772161 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:06:13.660 10.0.0.1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@11 -- # local val=167772162 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:06:13.660 10.0.0.2 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- 
nvmf/setup.sh@215 -- # [[ -n '' ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:06:13.660 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@38 -- # ping_ips 1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@107 -- # local dev=target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:13.661 
15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:06:13.661 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:06:13.661 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.036 ms 00:06:13.661 00:06:13.661 --- 10.0.0.2 ping statistics --- 00:06:13.661 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:06:13.661 rtt min/avg/max/mdev = 0.036/0.036/0.036/0.000 ms 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@107 -- # local dev=target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@175 -- # 
echo 10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:06:13.661 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:06:13.661 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.028 ms 00:06:13.661 00:06:13.661 --- 10.0.0.2 ping statistics --- 00:06:13.661 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:06:13.661 rtt min/avg/max/mdev = 0.028/0.028/0.028/0.000 ms 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@98 -- # (( pair++ )) 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@266 -- # return 0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@107 -- # local dev=target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:13.661 15:11:14 
nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # get_net_dev target1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@107 -- # local dev=target1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@107 -- # local dev=target0 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:13.661 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:13.662 15:11:14 
nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # get_net_dev target1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@107 -- # local dev=target1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@311 -- # '[' rdma == rdma 
']' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@107 -- # nvmf_host_management 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@69 -- # starttarget 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@16 -- # nvmfappstart -m 0x1E 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@324 -- # nvmfpid=1677234 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@325 -- # waitforlisten 1677234 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@831 -- # '[' -z 1677234 ']' 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.662 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:13.662 15:11:14 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.662 [2024-09-27 15:11:14.465744] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:06:13.662 [2024-09-27 15:11:14.465802] [ DPDK EAL parameters: nvmf -c 0x1E --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:06:13.662 [2024-09-27 15:11:14.551619] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:13.662 [2024-09-27 15:11:14.637913] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:06:13.662 [2024-09-27 15:11:14.637955] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:06:13.662 [2024-09-27 15:11:14.637964] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:13.662 [2024-09-27 15:11:14.637988] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:13.662 [2024-09-27 15:11:14.637996] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
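The setup.sh trace above reduces to a simple pattern: each test interface keeps its test address in its ifalias, so the helpers bring the link up, read /sys/class/net/<dev>/ifalias back, ping the address once, and then export the legacy NVMF_* variables from the result. Below is a minimal standalone sketch of that pattern for the two mlx devices in this run; it is a simplified stand-in, not the real setup.sh helpers, and it assumes the ifalias values were already written by an earlier setup step.

```bash
#!/usr/bin/env bash
# Minimal sketch of the address lookup traced above (not the real setup.sh).
# Assumes mlx_0_0/mlx_0_1 already carry their test IPs in their ifalias.
set -euo pipefail

get_ip_from_ifalias() {
    # setup.sh keeps each interface's test address in its ifalias; read it back.
    cat "/sys/class/net/$1/ifalias"
}

ping_ip() {
    # A single probe is enough to confirm the address answers, as in the log.
    ping -c 1 "$1" > /dev/null
}

for dev in mlx_0_0 mlx_0_1; do
    ip link set "$dev" up
done

NVMF_FIRST_TARGET_IP=$(get_ip_from_ifalias mlx_0_1)    # 10.0.0.2 in this run
NVMF_SECOND_TARGET_IP=$(get_ip_from_ifalias mlx_0_0)   # 10.0.0.1 in this run

ping_ip "$NVMF_FIRST_TARGET_IP"
ping_ip "$NVMF_SECOND_TARGET_IP"

export NVMF_FIRST_TARGET_IP NVMF_SECOND_TARGET_IP
export NVMF_TARGET_INTERFACE=mlx_0_1 NVMF_TARGET_INTERFACE2=mlx_0_0
```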
00:06:13.662 [2024-09-27 15:11:14.638108] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:13.662 [2024-09-27 15:11:14.638212] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:13.662 [2024-09-27 15:11:14.638311] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:13.662 [2024-09-27 15:11:14.638313] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@864 -- # return 0 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@18 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.662 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.662 [2024-09-27 15:11:15.387288] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0xc897a0/0xc8dc90) succeed. 00:06:13.662 [2024-09-27 15:11:15.397780] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0xc8ade0/0xccf330) succeed. 
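At this point nvmf_tgt is running on cores 1-4 and the RDMA transport has been created, which is what triggers the two create_ib_device notices for mlx5_0 and mlx5_1. Outside the autotest wrappers, the same bring-up can be driven directly with scripts/rpc.py; the polling loop below is a simplified stand-in for waitforlisten, and SPDK_DIR is assumed to point at the build tree used in this job.

```bash
#!/usr/bin/env bash
# Sketch of the target bring-up traced above: start nvmf_tgt with the same
# core mask, wait for its RPC socket, then create the RDMA transport with the
# options recorded in NVMF_TRANSPORT_OPTS. Not the autotest helpers themselves.
set -euo pipefail

SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/nvmf-phy-autotest/spdk}
RPC_SOCK=/var/tmp/spdk.sock

"$SPDK_DIR/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x1E &
nvmfpid=$!

# Simplified waitforlisten: poll until the app answers on its UNIX socket.
for _ in $(seq 1 100); do
    if "$SPDK_DIR/scripts/rpc.py" -s "$RPC_SOCK" rpc_get_methods &> /dev/null; then
        break
    fi
    sleep 0.1
done

# '-t rdma --num-shared-buffers 1024 -u 8192', exactly as in the trace.
"$SPDK_DIR/scripts/rpc.py" -s "$RPC_SOCK" nvmf_create_transport \
    -t rdma --num-shared-buffers 1024 -u 8192

echo "nvmf_tgt running as pid $nvmfpid"
```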
00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@20 -- # timing_enter create_subsystem 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@22 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@23 -- # cat 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@30 -- # rpc_cmd 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:13.921 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.921 Malloc0 00:06:13.922 [2024-09-27 15:11:15.590119] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@31 -- # timing_exit create_subsystems 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@73 -- # perfpid=1677428 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@74 -- # waitforlisten 1677428 /var/tmp/bdevperf.sock 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@831 -- # '[' -z 1677428 ']' 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@72 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bdevperf.sock --json /dev/fd/63 -q 64 -o 65536 -w verify -t 10 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@72 -- # gen_nvmf_target_json 0 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:06:13.922 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 
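The Malloc0 bdev and the "NVMe/RDMA Target Listening on 10.0.0.2 port 4420" notice above come from the RPC batch host_management.sh assembles in rpcs.txt and replays through rpc_cmd. Issued one call at a time, an equivalent configuration looks roughly like the sketch below; the NQNs, address and port match this run, while the malloc size, block size and serial number are illustrative values that do not appear in the trace.

```bash
#!/usr/bin/env bash
# Rough, call-by-call equivalent of the batched rpcs.txt the test feeds to
# rpc_cmd: back the subsystem with a malloc bdev, listen on the RDMA address
# discovered earlier, and allow exactly one host NQN. Sizes/serial are made up.
set -euo pipefail

SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/nvmf-phy-autotest/spdk}
rpc() { "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk.sock "$@"; }

NQN=nqn.2016-06.io.spdk:cnode0
HOST_NQN=nqn.2016-06.io.spdk:host0
TARGET_IP=${NVMF_FIRST_TARGET_IP:-10.0.0.2}

rpc bdev_malloc_create -b Malloc0 64 512                     # 64 MiB, 512 B blocks (assumed)
rpc nvmf_create_subsystem "$NQN" -s SPDK00000000000001 -m 1  # serial number is illustrative
rpc nvmf_subsystem_add_ns "$NQN" Malloc0
rpc nvmf_subsystem_add_listener "$NQN" -t rdma -f ipv4 -a "$TARGET_IP" -s 4420
rpc nvmf_subsystem_add_host "$NQN" "$HOST_NQN"
```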
00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@368 -- # config=() 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@368 -- # local subsystem config 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:06:13.922 { 00:06:13.922 "params": { 00:06:13.922 "name": "Nvme$subsystem", 00:06:13.922 "trtype": "$TEST_TRANSPORT", 00:06:13.922 "traddr": "$NVMF_FIRST_TARGET_IP", 00:06:13.922 "adrfam": "ipv4", 00:06:13.922 "trsvcid": "$NVMF_PORT", 00:06:13.922 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:06:13.922 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:06:13.922 "hdgst": ${hdgst:-false}, 00:06:13.922 "ddgst": ${ddgst:-false} 00:06:13.922 }, 00:06:13.922 "method": "bdev_nvme_attach_controller" 00:06:13.922 } 00:06:13.922 EOF 00:06:13.922 )") 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@390 -- # cat 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@392 -- # jq . 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@393 -- # IFS=, 00:06:13.922 15:11:15 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:06:13.922 "params": { 00:06:13.922 "name": "Nvme0", 00:06:13.922 "trtype": "rdma", 00:06:13.922 "traddr": "10.0.0.2", 00:06:13.922 "adrfam": "ipv4", 00:06:13.922 "trsvcid": "4420", 00:06:13.922 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:06:13.922 "hostnqn": "nqn.2016-06.io.spdk:host0", 00:06:13.922 "hdgst": false, 00:06:13.922 "ddgst": false 00:06:13.922 }, 00:06:13.922 "method": "bdev_nvme_attach_controller" 00:06:13.922 }' 00:06:13.922 [2024-09-27 15:11:15.701140] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:06:13.922 [2024-09-27 15:11:15.701204] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1677428 ] 00:06:14.181 [2024-09-27 15:11:15.788675] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.181 [2024-09-27 15:11:15.869438] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.441 Running I/O for 10 seconds... 
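gen_nvmf_target_json, traced above, simply emits a bdev subsystem config that attaches one NVMe-oF controller per argument and hands it to bdevperf through a /dev/fd process substitution. Written out to a file instead, the same run looks like the sketch below; the params block is copied from the JSON printed in the log, while the outer "subsystems" wrapper is the standard bdevperf --json layout and is assumed here rather than taken from the trace.

```bash
#!/usr/bin/env bash
# Sketch of what gen_nvmf_target_json + bdevperf boil down to in this run.
set -euo pipefail

SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/nvmf-phy-autotest/spdk}

cat > /tmp/bdevperf_nvme0.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_nvme_attach_controller",
          "params": {
            "name": "Nvme0",
            "trtype": "rdma",
            "traddr": "10.0.0.2",
            "adrfam": "ipv4",
            "trsvcid": "4420",
            "subnqn": "nqn.2016-06.io.spdk:cnode0",
            "hostnqn": "nqn.2016-06.io.spdk:host0",
            "hdgst": false,
            "ddgst": false
          }
        }
      ]
    }
  ]
}
EOF

# Same workload knobs as the trace: queue depth 64, 64 KiB I/O, verify, 10 s.
"$SPDK_DIR/build/examples/bdevperf" -r /var/tmp/bdevperf.sock \
    --json /tmp/bdevperf_nvme0.json -q 64 -o 65536 -w verify -t 10
```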
00:06:15.009 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:15.009 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@864 -- # return 0 00:06:15.009 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@75 -- # rpc_cmd -s /var/tmp/bdevperf.sock framework_wait_init 00:06:15.009 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.009 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:15.009 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@78 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; kill -9 $perfpid || true; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@80 -- # waitforio /var/tmp/bdevperf.sock Nvme0n1 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@45 -- # '[' -z /var/tmp/bdevperf.sock ']' 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@49 -- # '[' -z Nvme0n1 ']' 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@52 -- # local ret=1 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@53 -- # local i 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@54 -- # (( i = 10 )) 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@54 -- # (( i != 0 )) 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@55 -- # jq -r '.bdevs[0].num_read_ops' 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@55 -- # rpc_cmd -s /var/tmp/bdevperf.sock bdev_get_iostat -b Nvme0n1 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@55 -- # read_io_count=1580 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@58 -- # '[' 1580 -ge 100 ']' 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@59 -- # ret=0 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@60 -- # break 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@64 -- # return 0 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@84 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2016-06.io.spdk:cnode0 nqn.2016-06.io.spdk:host0 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 
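With I/O flowing, the waitforio helper above polls bdevperf's RPC socket until Nvme0n1 reports at least 100 completed reads (1580 here), and only then does the test yank the allowed host NQN out of the subsystem and immediately add it back. A hedged sketch of that poll-then-toggle step, using rpc.py directly instead of the rpc_cmd wrapper; the jq filter, threshold and NQNs come from the log, while the retry count and sleep interval are arbitrary choices.

```bash
#!/usr/bin/env bash
# Sketch of the waitforio + host-removal step traced above (not the helper itself).
set -euo pipefail

SPDK_DIR=${SPDK_DIR:-/var/jenkins/workspace/nvmf-phy-autotest/spdk}
rpc() { "$SPDK_DIR/scripts/rpc.py" "$@"; }

# Wait until the initiator-side bdev has actually completed some reads.
ok=1
for ((i = 0; i < 10; i++)); do
    reads=$(rpc -s /var/tmp/bdevperf.sock bdev_get_iostat -b Nvme0n1 \
        | jq -r '.bdevs[0].num_read_ops')
    if (( reads >= 100 )); then
        ok=0
        break
    fi
    sleep 0.25
done
(( ok == 0 )) || { echo "no I/O observed on Nvme0n1" >&2; exit 1; }

# Pull the host out from under the running workload, put it back, and give the
# initiator a moment to reconnect, as host_management.sh does next in the trace.
rpc nvmf_subsystem_remove_host nqn.2016-06.io.spdk:cnode0 nqn.2016-06.io.spdk:host0
rpc nvmf_subsystem_add_host nqn.2016-06.io.spdk:cnode0 nqn.2016-06.io.spdk:host0
sleep 1
```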
00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@85 -- # rpc_cmd nvmf_subsystem_add_host nqn.2016-06.io.spdk:cnode0 nqn.2016-06.io.spdk:host0 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.010 15:11:16 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@87 -- # sleep 1 00:06:15.836 1705.00 IOPS, 106.56 MiB/s [2024-09-27 15:11:17.640130] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:61 nsid:1 lba:87168 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001981f900 len:0x10000 key:0x1c0700 00:06:15.836 [2024-09-27 15:11:17.640171] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640192] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:54 nsid:1 lba:87296 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001980f880 len:0x10000 key:0x1c0700 00:06:15.836 [2024-09-27 15:11:17.640202] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640214] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:63 nsid:1 lba:87424 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000196eff80 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640223] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640234] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:56 nsid:1 lba:87552 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000196dff00 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640244] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640255] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:41 nsid:1 lba:87680 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000196cfe80 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640270] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640281] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:87808 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000196bfe00 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640290] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640301] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:87936 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000196afd80 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640311] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 
00:06:15.836 [2024-09-27 15:11:17.640321] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:39 nsid:1 lba:88064 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001969fd00 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640332] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640348] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:45 nsid:1 lba:88192 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001968fc80 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640359] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640369] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:62 nsid:1 lba:88320 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001967fc00 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640379] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640390] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:47 nsid:1 lba:88448 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001966fb80 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640399] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640410] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:88576 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001965fb00 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640419] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640430] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:49 nsid:1 lba:88704 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001964fa80 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640439] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640450] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:88832 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001963fa00 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640460] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640471] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:51 nsid:1 lba:88960 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001962f980 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640480] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640491] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:1 lba:89088 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001961f900 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640500] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 
15:11:17.640513] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:26 nsid:1 lba:89216 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001960f880 len:0x10000 key:0x1c0600 00:06:15.836 [2024-09-27 15:11:17.640522] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640534] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:5 nsid:1 lba:89344 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200003a4b040 len:0x10000 key:0x1c0100 00:06:15.836 [2024-09-27 15:11:17.640543] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640553] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:55 nsid:1 lba:89472 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200003a3afc0 len:0x10000 key:0x1c0100 00:06:15.836 [2024-09-27 15:11:17.640563] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640573] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:7 nsid:1 lba:89600 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200003a2af40 len:0x10000 key:0x1c0100 00:06:15.836 [2024-09-27 15:11:17.640583] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640593] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:57 nsid:1 lba:89728 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200003a1aec0 len:0x10000 key:0x1c0100 00:06:15.836 [2024-09-27 15:11:17.640603] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640614] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:9 nsid:1 lba:89856 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200003a0ae40 len:0x10000 key:0x1c0100 00:06:15.836 [2024-09-27 15:11:17.640623] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640634] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:59 nsid:1 lba:89984 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000138e4580 len:0x10000 key:0x1c0500 00:06:15.836 [2024-09-27 15:11:17.640643] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640654] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:30 nsid:1 lba:90112 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000138d4500 len:0x10000 key:0x1c0500 00:06:15.836 [2024-09-27 15:11:17.640663] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640674] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:90240 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000138c4480 len:0x10000 key:0x1c0500 00:06:15.836 [2024-09-27 15:11:17.640683] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.836 [2024-09-27 15:11:17.640694] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:32 nsid:1 lba:90368 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000138b4400 len:0x10000 key:0x1c0500 00:06:15.836 [2024-09-27 15:11:17.640703] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640714] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:90496 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000138a4380 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640723] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640735] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:34 nsid:1 lba:90624 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013894300 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640745] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640756] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:90752 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013884280 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640765] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640776] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:36 nsid:1 lba:90880 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013874200 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640785] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640795] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:17 nsid:1 lba:91008 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013864180 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640806] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640816] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:38 nsid:1 lba:91136 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013854100 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640825] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640836] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:19 nsid:1 lba:91264 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013844080 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640845] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640857] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:43 nsid:1 lba:91392 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013834000 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640866] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640877] nvme_qpair.c: 243:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:21 nsid:1 lba:91520 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013823f80 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640886] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640897] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:91648 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013813f00 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640906] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640917] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:23 nsid:1 lba:91776 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013803e80 len:0x10000 key:0x1c0500 00:06:15.837 [2024-09-27 15:11:17.640926] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640936] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:91904 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019ad1e80 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.640946] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640957] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:25 nsid:1 lba:92032 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019ac1e00 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.640968] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640978] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:92160 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019ab1d80 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.640987] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.640999] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:27 nsid:1 lba:92288 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019aa1d00 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641008] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641019] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:6 nsid:1 lba:92416 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a91c80 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641028] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641039] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:29 nsid:1 lba:92544 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a81c00 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641048] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641059] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:8 nsid:1 
lba:92672 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a71b80 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641068] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641079] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:31 nsid:1 lba:92800 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a61b00 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641088] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641099] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:10 nsid:1 lba:92928 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a51a80 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641108] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641118] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:33 nsid:1 lba:93056 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a41a00 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641127] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641138] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:12 nsid:1 lba:93184 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a31980 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641147] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641158] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:35 nsid:1 lba:93312 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a21900 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641168] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641178] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:53 nsid:1 lba:93440 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a11880 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641189] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641200] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:37 nsid:1 lba:93568 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200019a01800 len:0x10000 key:0x1c0800 00:06:15.837 [2024-09-27 15:11:17.641209] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641220] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:16 nsid:1 lba:93696 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000198eff80 len:0x10000 key:0x1c0700 00:06:15.837 [2024-09-27 15:11:17.641230] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641240] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:40 nsid:1 lba:93824 len:128 SGL KEYED DATA 
BLOCK ADDRESS 0x2000198dff00 len:0x10000 key:0x1c0700 00:06:15.837 [2024-09-27 15:11:17.641249] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641260] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:93952 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000198cfe80 len:0x10000 key:0x1c0700 00:06:15.837 [2024-09-27 15:11:17.641269] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641280] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:42 nsid:1 lba:94080 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000198bfe00 len:0x10000 key:0x1c0700 00:06:15.837 [2024-09-27 15:11:17.641289] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641300] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:20 nsid:1 lba:94208 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000198afd80 len:0x10000 key:0x1c0700 00:06:15.837 [2024-09-27 15:11:17.641309] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.837 [2024-09-27 15:11:17.641320] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:44 nsid:1 lba:94336 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001989fd00 len:0x10000 key:0x1c0700 00:06:15.837 [2024-09-27 15:11:17.641330] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641344] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:22 nsid:1 lba:94464 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001988fc80 len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641353] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641364] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:46 nsid:1 lba:94592 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001987fc00 len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641373] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641384] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:24 nsid:1 lba:94720 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001986fb80 len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641393] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641404] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:48 nsid:1 lba:94848 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001985fb00 len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641413] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641425] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:60 nsid:1 lba:94976 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001984fa80 
len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641435] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641445] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:50 nsid:1 lba:95104 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001983fa00 len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641454] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.641465] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:28 nsid:1 lba:95232 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001982f980 len:0x10000 key:0x1c0700 00:06:15.838 [2024-09-27 15:11:17.641474] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:43caf000 sqhd:7250 p:0 m:0 dnr:0 00:06:15.838 [2024-09-27 15:11:17.643530] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200019a01540 was disconnected and freed. reset controller. 00:06:15.838 [2024-09-27 15:11:17.644439] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode0] resetting controller 00:06:15.838 task offset: 87168 on job bdev=Nvme0n1 fails 00:06:15.838 00:06:15.838 Latency(us) 00:06:15.838 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:06:15.838 Job: Nvme0n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:06:15.838 Job: Nvme0n1 ended in about 1.59 seconds with error 00:06:15.838 Verification LBA range: start 0x0 length 0x400 00:06:15.838 Nvme0n1 : 1.59 1073.79 67.11 40.31 0.00 56920.29 2179.78 1028516.29 00:06:15.838 =================================================================================================================== 00:06:15.838 Total : 1073.79 67.11 40.31 0.00 56920.29 2179.78 1028516.29 00:06:15.838 [2024-09-27 15:11:17.646256] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@91 -- # kill -9 1677428 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@97 -- # rm -f /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 /var/tmp/spdk_cpu_lock_003 /var/tmp/spdk_cpu_lock_004 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@100 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -o 65536 -w verify -t 1 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@100 -- # gen_nvmf_target_json 0 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@368 -- # config=() 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@368 -- # local subsystem config 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:06:15.838 { 00:06:15.838 "params": { 00:06:15.838 "name": "Nvme$subsystem", 00:06:15.838 "trtype": "$TEST_TRANSPORT", 00:06:15.838 "traddr": "$NVMF_FIRST_TARGET_IP", 00:06:15.838 "adrfam": "ipv4", 00:06:15.838 "trsvcid": "$NVMF_PORT", 00:06:15.838 "subnqn": 
"nqn.2016-06.io.spdk:cnode$subsystem", 00:06:15.838 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:06:15.838 "hdgst": ${hdgst:-false}, 00:06:15.838 "ddgst": ${ddgst:-false} 00:06:15.838 }, 00:06:15.838 "method": "bdev_nvme_attach_controller" 00:06:15.838 } 00:06:15.838 EOF 00:06:15.838 )") 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@390 -- # cat 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@392 -- # jq . 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@393 -- # IFS=, 00:06:15.838 15:11:17 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:06:15.838 "params": { 00:06:15.838 "name": "Nvme0", 00:06:15.838 "trtype": "rdma", 00:06:15.838 "traddr": "10.0.0.2", 00:06:15.838 "adrfam": "ipv4", 00:06:15.838 "trsvcid": "4420", 00:06:15.838 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:06:15.838 "hostnqn": "nqn.2016-06.io.spdk:host0", 00:06:15.838 "hdgst": false, 00:06:15.838 "ddgst": false 00:06:15.838 }, 00:06:15.838 "method": "bdev_nvme_attach_controller" 00:06:15.838 }' 00:06:16.096 [2024-09-27 15:11:17.700535] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:06:16.096 [2024-09-27 15:11:17.700596] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1677665 ] 00:06:16.096 [2024-09-27 15:11:17.786007] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.096 [2024-09-27 15:11:17.870397] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.354 Running I/O for 1 seconds... 
00:06:17.289 3008.00 IOPS, 188.00 MiB/s 00:06:17.289 Latency(us) 00:06:17.289 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:06:17.289 Job: Nvme0n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:06:17.290 Verification LBA range: start 0x0 length 0x400 00:06:17.290 Nvme0n1 : 1.01 3042.17 190.14 0.00 0.00 20612.64 591.25 42398.94 00:06:17.290 =================================================================================================================== 00:06:17.290 Total : 3042.17 190.14 0.00 0.00 20612.64 591.25 42398.94 00:06:17.548 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/host_management.sh: line 68: 1677428 Killed $rootdir/build/examples/bdevperf -r /var/tmp/bdevperf.sock --json <(gen_nvmf_target_json "0") -q 64 -o 65536 -w verify -t 10 "${NO_HUGE[@]}" 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@102 -- # stoptarget 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@36 -- # rm -f ./local-job0-0-verify.state 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@37 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevperf.conf 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@38 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@40 -- # nvmftestfini 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@331 -- # nvmfcleanup 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@99 -- # sync 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@102 -- # set +e 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@103 -- # for i in {1..20} 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:06:17.548 rmmod nvme_rdma 00:06:17.548 rmmod nvme_fabrics 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@106 -- # set -e 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@107 -- # return 0 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@332 -- # '[' -n 1677234 ']' 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@333 -- # killprocess 1677234 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@950 -- # '[' -z 1677234 ']' 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@954 -- # kill -0 1677234 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@955 -- # uname 00:06:17.548 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:17.548 15:11:19 
nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1677234 00:06:17.807 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:06:17.807 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:06:17.807 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1677234' 00:06:17.807 killing process with pid 1677234 00:06:17.807 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@969 -- # kill 1677234 00:06:17.807 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@974 -- # wait 1677234 00:06:18.067 [2024-09-27 15:11:19.702038] app.c: 719:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 1, errno: 2 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@338 -- # nvmf_fini 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@264 -- # local dev 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@267 -- # remove_target_ns 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@22 -- # _remove_target_ns 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@268 -- # delete_main_bridge 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@130 -- # return 0 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- 
nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@41 -- # _dev=0 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@41 -- # dev_map=() 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/setup.sh@284 -- # iptr 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@538 -- # iptables-save 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- nvmf/common.sh@538 -- # iptables-restore 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- target/host_management.sh@109 -- # trap - SIGINT SIGTERM EXIT 00:06:18.067 00:06:18.067 real 0m12.646s 00:06:18.067 user 0m25.815s 00:06:18.067 sys 0m6.524s 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_host_management -- common/autotest_common.sh@10 -- # set +x 00:06:18.067 ************************************ 00:06:18.067 END TEST nvmf_host_management 00:06:18.067 ************************************ 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@22 -- # run_test nvmf_lvol /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_lvol.sh --transport=rdma 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:06:18.067 ************************************ 00:06:18.067 START TEST nvmf_lvol 00:06:18.067 ************************************ 00:06:18.067 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_lvol.sh --transport=rdma 00:06:18.327 * Looking for test storage... 
00:06:18.327 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:06:18.327 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:18.327 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1681 -- # lcov --version 00:06:18.327 15:11:19 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@336 -- # IFS=.-: 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@336 -- # read -ra ver1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@337 -- # IFS=.-: 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@337 -- # read -ra ver2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@338 -- # local 'op=<' 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@340 -- # ver1_l=2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@341 -- # ver2_l=1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@344 -- # case "$op" in 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@345 -- # : 1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@365 -- # decimal 1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@353 -- # local d=1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@355 -- # echo 1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@365 -- # ver1[v]=1 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@366 -- # decimal 2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@353 -- # local d=2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@355 -- # echo 2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@366 -- # ver2[v]=2 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@368 -- # return 0 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:18.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.327 --rc genhtml_branch_coverage=1 00:06:18.327 --rc genhtml_function_coverage=1 00:06:18.327 --rc genhtml_legend=1 00:06:18.327 --rc geninfo_all_blocks=1 00:06:18.327 --rc geninfo_unexecuted_blocks=1 00:06:18.327 00:06:18.327 ' 00:06:18.327 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:18.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.327 --rc genhtml_branch_coverage=1 00:06:18.327 --rc genhtml_function_coverage=1 00:06:18.327 --rc genhtml_legend=1 00:06:18.327 --rc geninfo_all_blocks=1 00:06:18.327 --rc geninfo_unexecuted_blocks=1 00:06:18.327 00:06:18.328 ' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:18.328 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.328 --rc genhtml_branch_coverage=1 00:06:18.328 --rc genhtml_function_coverage=1 00:06:18.328 --rc genhtml_legend=1 00:06:18.328 --rc geninfo_all_blocks=1 00:06:18.328 --rc geninfo_unexecuted_blocks=1 00:06:18.328 00:06:18.328 ' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:18.328 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.328 --rc genhtml_branch_coverage=1 00:06:18.328 --rc genhtml_function_coverage=1 00:06:18.328 --rc genhtml_legend=1 00:06:18.328 --rc geninfo_all_blocks=1 00:06:18.328 --rc geninfo_unexecuted_blocks=1 00:06:18.328 00:06:18.328 ' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@7 -- # uname -s 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@7 -- # [[ Linux 
== FreeBSD ]] 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@15 -- # shopt -s extglob 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:18.328 15:11:20 
nvmf_rdma.nvmf_target_core.nvmf_lvol -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- paths/export.sh@5 -- # export PATH 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@50 -- # : 0 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:06:18.328 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@54 -- # have_pci_nics=0 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@11 -- # MALLOC_BDEV_SIZE=64 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@13 -- # LVOL_BDEV_INIT_SIZE=20 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@14 -- # 
LVOL_BDEV_FINAL_SIZE=30 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@16 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@18 -- # nvmftestinit 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@292 -- # prepare_net_devs 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@254 -- # local -g is_hw=no 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@256 -- # remove_target_ns 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@22 -- # _remove_target_ns 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@125 -- # xtrace_disable 00:06:18.328 15:11:20 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@10 -- # set +x 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@131 -- # pci_devs=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@131 -- # local -a pci_devs 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@132 -- # pci_net_devs=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@133 -- # pci_drivers=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@133 -- # local -A pci_drivers 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@135 -- # net_devs=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@135 -- # local -ga net_devs 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@136 -- # e810=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@136 -- # local -ga e810 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@137 -- # x722=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@137 -- # local -ga x722 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@138 -- # mlx=() 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@138 -- # local -ga mlx 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 
00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:06:26.459 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:06:26.459 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:06:26.459 15:11:26 
nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:06:26.459 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:06:26.460 Found net devices under 0000:18:00.0: mlx_0_0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:06:26.460 Found net devices under 0000:18:00.1: mlx_0_1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@249 -- # get_rdma_if_list 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@75 -- # rdma_devs=() 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@87 -- # [[ mlx_0_0 == 
\m\l\x\_\0\_\0 ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@89 -- # continue 2 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@89 -- # continue 2 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@258 -- # is_hw=yes 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@61 -- # uname 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@65 -- # modprobe ib_cm 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@66 -- # modprobe ib_core 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@67 -- # modprobe ib_umad 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@69 -- # modprobe iw_cm 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@27 -- # local -gA dev_map 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@28 -- # local -g _dev 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- 
nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@44 -- # ips=() 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@58 -- # key_initiator=target1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@11 -- # local val=167772161 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:06:26.460 10.0.0.1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:06:26.460 15:11:26 
nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@11 -- # local val=167772162 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:06:26.460 10.0.0.2 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@38 -- # ping_ips 1 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- 
nvmf/setup.sh@200 -- # get_target_ip_address 0 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:26.460 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@107 -- # local dev=target0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:06:26.461 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:06:26.461 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.027 ms 00:06:26.461 00:06:26.461 --- 10.0.0.2 ping statistics --- 00:06:26.461 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:06:26.461 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@107 -- # local dev=target0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:06:26.461 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:06:26.461 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.027 ms 00:06:26.461 00:06:26.461 --- 10.0.0.2 ping statistics --- 00:06:26.461 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:06:26.461 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@98 -- # (( pair++ )) 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@266 -- # return 0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@107 -- # local dev=target0 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:26.461 15:11:26 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- 
nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # get_net_dev target1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@107 -- # local dev=target1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@107 -- # local dev=target0 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@165 -- # local 
dev=target1 in_ns= ip 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # get_net_dev target1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@107 -- # local dev=target1 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:06:26.461 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@19 -- # nvmfappstart -m 0x7 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@10 -- # set +x 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@324 -- # nvmfpid=1680953 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x7 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@325 -- # waitforlisten 1680953 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@831 -- # '[' -z 1680953 ']' 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:26.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:26.462 15:11:27 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@10 -- # set +x 00:06:26.462 [2024-09-27 15:11:27.161205] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:06:26.462 [2024-09-27 15:11:27.161266] [ DPDK EAL parameters: nvmf -c 0x7 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:06:26.462 [2024-09-27 15:11:27.245022] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:26.462 [2024-09-27 15:11:27.333280] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:06:26.462 [2024-09-27 15:11:27.333328] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:06:26.462 [2024-09-27 15:11:27.333337] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:26.462 [2024-09-27 15:11:27.333366] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:26.462 [2024-09-27 15:11:27.333374] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:06:26.462 [2024-09-27 15:11:27.333467] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:26.462 [2024-09-27 15:11:27.333580] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.462 [2024-09-27 15:11:27.333581] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@864 -- # return 0 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@10 -- # set +x 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:06:26.462 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:06:26.462 [2024-09-27 15:11:28.292063] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x7e7910/0x7ebe00) succeed. 00:06:26.462 [2024-09-27 15:11:28.302979] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x7e8eb0/0x82d4a0) succeed. 
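The trace above brings up nvmf_tgt on core mask 0x7 and creates the RDMA transport; the trace that follows builds the lvol stack (two malloc bdevs -> raid0 -> lvstore -> lvol), exports it as nqn.2016-06.io.spdk:cnode0 on 10.0.0.2:4420, and drives it with spdk_nvme_perf. As a rough standalone sketch of that same RPC sequence (not the test script itself): SPDK_DIR is a placeholder rather than a path from this log, the fixed sleep stands in for the harness's waitforlisten helper, and the remaining flags and values are taken verbatim from this run.

  SPDK_DIR=/path/to/spdk                       # placeholder, not a path from this log
  rpc="$SPDK_DIR/scripts/rpc.py"

  # Start the target and create the RDMA transport (same flags as this run)
  "$SPDK_DIR/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x7 &
  sleep 2                                      # the harness waits on /var/tmp/spdk.sock instead
  $rpc nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192

  # Two 64 MB malloc bdevs (512 B blocks) -> raid0 -> lvstore -> lvol (size 20, as in this run)
  $rpc bdev_malloc_create 64 512               # Malloc0
  $rpc bdev_malloc_create 64 512               # Malloc1
  $rpc bdev_raid_create -n raid0 -z 64 -r 0 -b 'Malloc0 Malloc1'
  lvs=$($rpc bdev_lvol_create_lvstore raid0 lvs)
  lvol=$($rpc bdev_lvol_create -u "$lvs" lvol 20)

  # Expose the lvol over NVMe-oF/RDMA on the target address used in this run
  $rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0
  $rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 "$lvol"
  $rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420

  # Initiator-side load, as launched by the test
  "$SPDK_DIR/build/bin/spdk_nvme_perf" -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' \
      -o 4096 -q 128 -s 512 -w randwrite -t 10 -c 0x18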
00:06:26.720 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:06:26.979 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@24 -- # base_bdevs='Malloc0 ' 00:06:26.979 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@25 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:06:27.238 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@25 -- # base_bdevs+=Malloc1 00:06:27.238 15:11:28 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@26 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_raid_create -n raid0 -z 64 -r 0 -b 'Malloc0 Malloc1' 00:06:27.238 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore raid0 lvs 00:06:27.496 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@29 -- # lvs=7fd642a6-ffad-424a-8aed-b8d65d006529 00:06:27.496 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@32 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -u 7fd642a6-ffad-424a-8aed-b8d65d006529 lvol 20 00:06:27.755 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@32 -- # lvol=2eba978a-665a-4830-a846-cf4d39414703 00:06:27.755 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@35 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:06:28.014 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 2eba978a-665a-4830-a846-cf4d39414703 00:06:28.272 15:11:29 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@37 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420 00:06:28.272 [2024-09-27 15:11:30.079472] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:06:28.272 15:11:30 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@38 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:06:28.530 15:11:30 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@42 -- # perf_pid=1681355 00:06:28.530 15:11:30 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@44 -- # sleep 1 00:06:28.530 15:11:30 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@41 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -o 4096 -q 128 -s 512 -w randwrite -t 10 -c 0x18 00:06:29.905 15:11:31 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@47 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_snapshot 2eba978a-665a-4830-a846-cf4d39414703 MY_SNAPSHOT 00:06:29.905 15:11:31 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@47 -- # snapshot=6956cb05-a4c7-4155-97fa-51cce81c529f 00:06:29.905 15:11:31 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_resize 2eba978a-665a-4830-a846-cf4d39414703 30 00:06:29.905 15:11:31 
nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@49 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_clone 6956cb05-a4c7-4155-97fa-51cce81c529f MY_CLONE 00:06:30.164 15:11:31 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@49 -- # clone=97150c73-f1ce-4a34-947b-8798c30e714c 00:06:30.164 15:11:31 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_inflate 97150c73-f1ce-4a34-947b-8798c30e714c 00:06:30.423 15:11:32 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@53 -- # wait 1681355 00:06:40.401 Initializing NVMe Controllers 00:06:40.401 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode0 00:06:40.401 Controller IO queue size 128, less than required. 00:06:40.401 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:06:40.401 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 3 00:06:40.401 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 with lcore 4 00:06:40.401 Initialization complete. Launching workers. 00:06:40.401 ======================================================== 00:06:40.401 Latency(us) 00:06:40.401 Device Information : IOPS MiB/s Average min max 00:06:40.401 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 3: 16462.00 64.30 7777.87 2170.31 41616.52 00:06:40.401 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode0) NSID 1 from core 4: 16405.20 64.08 7804.33 3719.23 51345.56 00:06:40.401 ======================================================== 00:06:40.401 Total : 32867.20 128.39 7791.08 2170.31 51345.56 00:06:40.401 00:06:40.401 15:11:41 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@56 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode0 00:06:40.401 15:11:41 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete 2eba978a-665a-4830-a846-cf4d39414703 00:06:40.401 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@58 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u 7fd642a6-ffad-424a-8aed-b8d65d006529 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@60 -- # rm -f 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@62 -- # trap - SIGINT SIGTERM EXIT 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- target/nvmf_lvol.sh@64 -- # nvmftestfini 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@331 -- # nvmfcleanup 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@99 -- # sync 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@102 -- # set +e 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@103 -- # for i in {1..20} 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:06:40.660 rmmod nvme_rdma 00:06:40.660 rmmod nvme_fabrics 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- 
nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@106 -- # set -e 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@107 -- # return 0 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@332 -- # '[' -n 1680953 ']' 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@333 -- # killprocess 1680953 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@950 -- # '[' -z 1680953 ']' 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@954 -- # kill -0 1680953 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@955 -- # uname 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1680953 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1680953' 00:06:40.660 killing process with pid 1680953 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@969 -- # kill 1680953 00:06:40.660 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@974 -- # wait 1680953 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@338 -- # nvmf_fini 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@264 -- # local dev 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@267 -- # remove_target_ns 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@22 -- # _remove_target_ns 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@268 -- # delete_main_bridge 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@130 -- # return 0 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:06:41.227 
15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@41 -- # _dev=0 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@41 -- # dev_map=() 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/setup.sh@284 -- # iptr 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@538 -- # iptables-save 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- nvmf/common.sh@538 -- # iptables-restore 00:06:41.227 00:06:41.227 real 0m22.976s 00:06:41.227 user 1m13.233s 00:06:41.227 sys 0m6.736s 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvol -- common/autotest_common.sh@10 -- # set +x 00:06:41.227 ************************************ 00:06:41.227 END TEST nvmf_lvol 00:06:41.227 ************************************ 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@23 -- # run_test nvmf_lvs_grow /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_lvs_grow.sh --transport=rdma 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:06:41.227 ************************************ 00:06:41.227 START TEST nvmf_lvs_grow 00:06:41.227 ************************************ 00:06:41.227 15:11:42 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_lvs_grow.sh --transport=rdma 00:06:41.227 * Looking for test storage... 
00:06:41.227 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:06:41.227 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:41.227 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1681 -- # lcov --version 00:06:41.227 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@336 -- # IFS=.-: 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@336 -- # read -ra ver1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@337 -- # IFS=.-: 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@337 -- # read -ra ver2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@338 -- # local 'op=<' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@340 -- # ver1_l=2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@341 -- # ver2_l=1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@344 -- # case "$op" in 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@345 -- # : 1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@365 -- # decimal 1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@353 -- # local d=1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@355 -- # echo 1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@365 -- # ver1[v]=1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@366 -- # decimal 2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@353 -- # local d=2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@355 -- # echo 2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@366 -- # ver2[v]=2 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@368 -- # return 0 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:41.487 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.487 --rc genhtml_branch_coverage=1 00:06:41.487 --rc genhtml_function_coverage=1 00:06:41.487 --rc genhtml_legend=1 00:06:41.487 --rc geninfo_all_blocks=1 00:06:41.487 --rc geninfo_unexecuted_blocks=1 00:06:41.487 00:06:41.487 ' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:41.487 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.487 --rc genhtml_branch_coverage=1 00:06:41.487 --rc genhtml_function_coverage=1 00:06:41.487 --rc genhtml_legend=1 00:06:41.487 --rc geninfo_all_blocks=1 00:06:41.487 --rc geninfo_unexecuted_blocks=1 00:06:41.487 00:06:41.487 ' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:41.487 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.487 --rc genhtml_branch_coverage=1 00:06:41.487 --rc genhtml_function_coverage=1 00:06:41.487 --rc genhtml_legend=1 00:06:41.487 --rc geninfo_all_blocks=1 00:06:41.487 --rc geninfo_unexecuted_blocks=1 00:06:41.487 00:06:41.487 ' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:41.487 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.487 --rc genhtml_branch_coverage=1 00:06:41.487 --rc genhtml_function_coverage=1 00:06:41.487 --rc genhtml_legend=1 00:06:41.487 --rc geninfo_all_blocks=1 00:06:41.487 --rc geninfo_unexecuted_blocks=1 00:06:41.487 00:06:41.487 ' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@7 -- # uname -s 
00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@15 -- # shopt -s extglob 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- paths/export.sh@5 -- # export PATH 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@50 -- # : 0 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:06:41.487 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:06:41.488 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- 
nvmf/common.sh@35 -- # '[' -n '' ']' 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@54 -- # have_pci_nics=0 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@11 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@12 -- # bdevperf_rpc_sock=/var/tmp/bdevperf.sock 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@98 -- # nvmftestinit 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@292 -- # prepare_net_devs 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@254 -- # local -g is_hw=no 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@256 -- # remove_target_ns 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@22 -- # _remove_target_ns 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@125 -- # xtrace_disable 00:06:41.488 15:11:43 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@131 -- # pci_devs=() 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@131 -- # local -a pci_devs 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@132 -- # pci_net_devs=() 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@133 -- # pci_drivers=() 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@133 -- # local -A pci_drivers 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@135 -- # net_devs=() 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@135 -- # local -ga net_devs 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@136 -- # e810=() 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@136 -- # local -ga e810 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@137 -- # x722=() 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@137 -- # local -ga x722 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@138 -- # mlx=() 00:06:48.059 
15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@138 -- # local -ga mlx 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:06:48.059 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@181 -- # 
echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:06:48.059 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:06:48.059 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:06:48.060 Found net devices under 0000:18:00.0: mlx_0_0 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:06:48.060 Found net devices under 0000:18:00.1: mlx_0_1 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@249 -- # get_rdma_if_list 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@75 -- # rdma_devs=() 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:06:48.060 15:11:49 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@89 -- # continue 2 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@89 -- # continue 2 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@258 -- # is_hw=yes 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@61 -- # uname 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@65 -- # modprobe ib_cm 00:06:48.060 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@66 -- # modprobe ib_core 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@67 -- # modprobe ib_umad 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@69 -- # modprobe iw_cm 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow 
-- nvmf/common.sh@70 -- # modprobe rdma_cm 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@27 -- # local -gA dev_map 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@28 -- # local -g _dev 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@44 -- # ips=() 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@58 -- # key_initiator=target1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@11 -- # local val=167772161 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@208 
-- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:06:48.321 10.0.0.1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@11 -- # local val=167772162 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:06:48.321 10.0.0.2 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:06:48.321 15:11:49 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 
00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@38 -- # ping_ips 1 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@107 -- # local dev=target0 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:06:48.321 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:06:48.321 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.036 ms 00:06:48.321 00:06:48.321 --- 10.0.0.2 ping statistics --- 00:06:48.321 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:06:48.321 rtt min/avg/max/mdev = 0.036/0.036/0.036/0.000 ms 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:48.321 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@107 -- # local dev=target0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:06:48.322 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:06:48.322 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.029 ms 00:06:48.322 00:06:48.322 --- 10.0.0.2 ping statistics --- 00:06:48.322 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:06:48.322 rtt min/avg/max/mdev = 0.029/0.029/0.029/0.000 ms 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@98 -- # (( pair++ )) 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@266 -- # return 0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@107 -- # local dev=target0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:06:48.322 15:11:50 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # get_net_dev target1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@107 -- # local dev=target1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # get_net_dev target0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@107 -- # local dev=target0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:06:48.322 15:11:50 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # get_net_dev target1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@107 -- # local dev=target1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:06:48.322 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@99 -- # nvmfappstart -m 0x1 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@324 -- # nvmfpid=1685933 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@325 -- # waitforlisten 1685933 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@831 -- # '[' -z 1685933 ']' 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk.sock 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:48.583 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:48.583 15:11:50 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:06:48.583 [2024-09-27 15:11:50.229138] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:06:48.583 [2024-09-27 15:11:50.229193] [ DPDK EAL parameters: nvmf -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:06:48.583 [2024-09-27 15:11:50.315935] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.583 [2024-09-27 15:11:50.404382] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:06:48.583 [2024-09-27 15:11:50.404444] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:06:48.583 [2024-09-27 15:11:50.404453] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:48.583 [2024-09-27 15:11:50.404462] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:48.583 [2024-09-27 15:11:50.404469] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:06:48.583 [2024-09-27 15:11:50.404500] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@864 -- # return 0 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:06:49.521 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@100 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:06:49.521 [2024-09-27 15:11:51.326823] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x14e82a0/0x14ec790) succeed. 00:06:49.521 [2024-09-27 15:11:51.335987] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x14e97a0/0x152de30) succeed. 
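What the setup.sh helpers above boil down to: each RDMA netdev gets an address from an integer IP pool, the address is mirrored into the interface's ifalias (which is what get_ip_address later reads back), and the link is brought up before the target is started and the rdma transport is created. A minimal sketch of that per-device step, assuming the octet arithmetic behind val_to_ip (the trace only shows the already-expanded printf) and using short relative paths in place of the full workspace paths:

    # val_to_ip: turn a pool value such as 167772162 into dotted-quad 10.0.0.2
    val_to_ip() {
      local val=$1
      printf '%u.%u.%u.%u\n' \
        $(( (val >> 24) & 0xff )) $(( (val >> 16) & 0xff )) \
        $(( (val >> 8)  & 0xff )) $((  val        & 0xff ))
    }

    dev=mlx_0_1                        # device name from this run
    ip=$(val_to_ip 167772162)          # -> 10.0.0.2
    ip addr add "$ip/24" dev "$dev"    # requires root, as in the CI job
    echo "$ip" | tee "/sys/class/net/$dev/ifalias"   # recorded for later lookups
    ip link set "$dev" up
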
00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@102 -- # run_test lvs_grow_clean lvs_grow 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:06:49.780 ************************************ 00:06:49.780 START TEST lvs_grow_clean 00:06:49.780 ************************************ 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@1125 -- # lvs_grow 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@15 -- # local aio_bdev lvs lvol 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@16 -- # local data_clusters free_clusters 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@17 -- # local bdevperf_pid run_test_pid 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@18 -- # local aio_init_size_mb=200 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@19 -- # local aio_final_size_mb=400 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@20 -- # local lvol_bdev_size_mb=150 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@23 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@24 -- # truncate -s 200M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:06:49.780 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@25 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev aio_bdev 4096 00:06:50.040 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@25 -- # aio_bdev=aio_bdev 00:06:50.040 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --cluster-sz 4194304 --md-pages-per-cluster-ratio 300 aio_bdev lvs 00:06:50.040 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@28 -- # lvs=76c82256-e9a4-4f66-bcc9-5e666f957679 00:06:50.040 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:06:50.040 15:11:51 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@29 -- # jq -r '.[0].total_data_clusters' 00:06:50.299 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@29 -- # data_clusters=49 00:06:50.299 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@30 -- # (( data_clusters == 49 )) 00:06:50.299 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- 
target/nvmf_lvs_grow.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -u 76c82256-e9a4-4f66-bcc9-5e666f957679 lvol 150 00:06:50.627 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@33 -- # lvol=5a868cf2-3e5a-4f96-835e-b1b14b31c4cc 00:06:50.627 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@36 -- # truncate -s 400M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:06:50.627 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@37 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_rescan aio_bdev 00:06:50.627 [2024-09-27 15:11:52.455316] bdev_aio.c:1044:bdev_aio_rescan: *NOTICE*: AIO device is resized: bdev name /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev, old block count 51200, new block count 102400 00:06:50.627 [2024-09-27 15:11:52.455402] vbdev_lvol.c: 165:vbdev_lvs_base_bdev_event_cb: *NOTICE*: Unsupported bdev event: type 1 00:06:50.627 true 00:06:50.928 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@38 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:06:50.928 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@38 -- # jq -r '.[0].total_data_clusters' 00:06:50.928 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@38 -- # (( data_clusters == 49 )) 00:06:50.928 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@41 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:06:51.200 15:11:52 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@42 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 5a868cf2-3e5a-4f96-835e-b1b14b31c4cc 00:06:51.200 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@43 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420 00:06:51.459 [2024-09-27 15:11:53.197630] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:06:51.459 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@44 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@48 -- # bdevperf_pid=1686514 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@47 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bdevperf.sock -m 0x2 -o 4096 -q 128 -w randwrite -t 10 -S 1 -z 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@49 -- # trap 'killprocess $bdevperf_pid; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@50 -- # waitforlisten 1686514 /var/tmp/bdevperf.sock 00:06:51.719 15:11:53 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@831 -- # '[' -z 1686514 ']' 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:06:51.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:51.719 15:11:53 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@10 -- # set +x 00:06:51.719 [2024-09-27 15:11:53.451755] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:06:51.719 [2024-09-27 15:11:53.451814] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1686514 ] 00:06:51.719 [2024-09-27 15:11:53.534128] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.978 [2024-09-27 15:11:53.615032] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:52.547 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:52.547 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@864 -- # return 0 00:06:52.547 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@52 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode0 00:06:52.807 Nvme0n1 00:06:52.807 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@53 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_get_bdevs -b Nvme0n1 -t 3000 00:06:53.066 [ 00:06:53.066 { 00:06:53.066 "name": "Nvme0n1", 00:06:53.066 "aliases": [ 00:06:53.066 "5a868cf2-3e5a-4f96-835e-b1b14b31c4cc" 00:06:53.066 ], 00:06:53.066 "product_name": "NVMe disk", 00:06:53.066 "block_size": 4096, 00:06:53.066 "num_blocks": 38912, 00:06:53.066 "uuid": "5a868cf2-3e5a-4f96-835e-b1b14b31c4cc", 00:06:53.066 "numa_id": 0, 00:06:53.066 "assigned_rate_limits": { 00:06:53.066 "rw_ios_per_sec": 0, 00:06:53.067 "rw_mbytes_per_sec": 0, 00:06:53.067 "r_mbytes_per_sec": 0, 00:06:53.067 "w_mbytes_per_sec": 0 00:06:53.067 }, 00:06:53.067 "claimed": false, 00:06:53.067 "zoned": false, 00:06:53.067 "supported_io_types": { 00:06:53.067 "read": true, 00:06:53.067 "write": true, 00:06:53.067 "unmap": true, 00:06:53.067 "flush": true, 00:06:53.067 "reset": true, 00:06:53.067 "nvme_admin": true, 00:06:53.067 "nvme_io": true, 00:06:53.067 "nvme_io_md": false, 00:06:53.067 "write_zeroes": true, 00:06:53.067 "zcopy": false, 00:06:53.067 "get_zone_info": false, 00:06:53.067 "zone_management": false, 00:06:53.067 "zone_append": false, 00:06:53.067 "compare": true, 00:06:53.067 "compare_and_write": true, 
00:06:53.067 "abort": true, 00:06:53.067 "seek_hole": false, 00:06:53.067 "seek_data": false, 00:06:53.067 "copy": true, 00:06:53.067 "nvme_iov_md": false 00:06:53.067 }, 00:06:53.067 "memory_domains": [ 00:06:53.067 { 00:06:53.067 "dma_device_id": "SPDK_RDMA_DMA_DEVICE", 00:06:53.067 "dma_device_type": 0 00:06:53.067 } 00:06:53.067 ], 00:06:53.067 "driver_specific": { 00:06:53.067 "nvme": [ 00:06:53.067 { 00:06:53.067 "trid": { 00:06:53.067 "trtype": "RDMA", 00:06:53.067 "adrfam": "IPv4", 00:06:53.067 "traddr": "10.0.0.2", 00:06:53.067 "trsvcid": "4420", 00:06:53.067 "subnqn": "nqn.2016-06.io.spdk:cnode0" 00:06:53.067 }, 00:06:53.067 "ctrlr_data": { 00:06:53.067 "cntlid": 1, 00:06:53.067 "vendor_id": "0x8086", 00:06:53.067 "model_number": "SPDK bdev Controller", 00:06:53.067 "serial_number": "SPDK0", 00:06:53.067 "firmware_revision": "25.01", 00:06:53.067 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:06:53.067 "oacs": { 00:06:53.067 "security": 0, 00:06:53.067 "format": 0, 00:06:53.067 "firmware": 0, 00:06:53.067 "ns_manage": 0 00:06:53.067 }, 00:06:53.067 "multi_ctrlr": true, 00:06:53.067 "ana_reporting": false 00:06:53.067 }, 00:06:53.067 "vs": { 00:06:53.067 "nvme_version": "1.3" 00:06:53.067 }, 00:06:53.067 "ns_data": { 00:06:53.067 "id": 1, 00:06:53.067 "can_share": true 00:06:53.067 } 00:06:53.067 } 00:06:53.067 ], 00:06:53.067 "mp_policy": "active_passive" 00:06:53.067 } 00:06:53.067 } 00:06:53.067 ] 00:06:53.067 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@56 -- # run_test_pid=1686703 00:06:53.067 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@57 -- # sleep 2 00:06:53.067 15:11:54 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@55 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests 00:06:53.067 Running I/O for 10 seconds... 
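At this point the lvol has been exposed over NVMe-oF/RDMA and a separate bdevperf process is attached to it as an initiator; the JSON dump above is bdevperf's view of that namespace (Nvme0n1, 38912 4 KiB blocks, i.e. the 150 MiB lvol rounded up to 38 whole 4 MiB clusters). A condensed sketch of that wiring, with the long workspace paths shortened to relative ones and the backgrounding standing in for the harness's waitforlisten handling:

    rpc=scripts/rpc.py
    # export the lvol (UUID from this run) through an RDMA listener on the target IP
    $rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0
    $rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 5a868cf2-3e5a-4f96-835e-b1b14b31c4cc
    $rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420

    # bdevperf: core 1 (-m 0x2), 4 KiB random writes, queue depth 128, 10 seconds
    build/examples/bdevperf -r /var/tmp/bdevperf.sock -m 0x2 -o 4096 -q 128 \
        -w randwrite -t 10 -S 1 -z &
    $rpc -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma \
        -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode0
    examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests
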
00:06:54.446 Latency(us) 00:06:54.446 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:06:54.446 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:06:54.446 Nvme0n1 : 1.00 33859.00 132.26 0.00 0.00 0.00 0.00 0.00 00:06:54.446 =================================================================================================================== 00:06:54.446 Total : 33859.00 132.26 0.00 0.00 0.00 0.00 0.00 00:06:54.446 00:06:55.016 15:11:56 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_grow_lvstore -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:06:55.275 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:06:55.275 Nvme0n1 : 2.00 34113.00 133.25 0.00 0.00 0.00 0.00 0.00 00:06:55.275 =================================================================================================================== 00:06:55.275 Total : 34113.00 133.25 0.00 0.00 0.00 0.00 0.00 00:06:55.275 00:06:55.275 true 00:06:55.275 15:11:57 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@61 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:06:55.275 15:11:57 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@61 -- # jq -r '.[0].total_data_clusters' 00:06:55.535 15:11:57 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@61 -- # data_clusters=99 00:06:55.535 15:11:57 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@62 -- # (( data_clusters == 99 )) 00:06:55.535 15:11:57 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@65 -- # wait 1686703 00:06:56.104 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:06:56.104 Nvme0n1 : 3.00 34208.33 133.63 0.00 0.00 0.00 0.00 0.00 00:06:56.104 =================================================================================================================== 00:06:56.104 Total : 34208.33 133.63 0.00 0.00 0.00 0.00 0.00 00:06:56.104 00:06:57.043 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:06:57.043 Nvme0n1 : 4.00 34304.00 134.00 0.00 0.00 0.00 0.00 0.00 00:06:57.043 =================================================================================================================== 00:06:57.043 Total : 34304.00 134.00 0.00 0.00 0.00 0.00 0.00 00:06:57.043 00:06:58.423 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:06:58.423 Nvme0n1 : 5.00 34362.00 134.23 0.00 0.00 0.00 0.00 0.00 00:06:58.424 =================================================================================================================== 00:06:58.424 Total : 34362.00 134.23 0.00 0.00 0.00 0.00 0.00 00:06:58.424 00:06:59.362 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:06:59.362 Nvme0n1 : 6.00 34240.17 133.75 0.00 0.00 0.00 0.00 0.00 00:06:59.362 =================================================================================================================== 00:06:59.362 Total : 34240.17 133.75 0.00 0.00 0.00 0.00 0.00 00:06:59.362 00:07:00.299 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:00.299 Nvme0n1 : 7.00 34263.14 133.84 0.00 0.00 0.00 0.00 0.00 00:07:00.299 
=================================================================================================================== 00:07:00.299 Total : 34263.14 133.84 0.00 0.00 0.00 0.00 0.00 00:07:00.299 00:07:01.237 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:01.237 Nvme0n1 : 8.00 34251.25 133.79 0.00 0.00 0.00 0.00 0.00 00:07:01.237 =================================================================================================================== 00:07:01.237 Total : 34251.25 133.79 0.00 0.00 0.00 0.00 0.00 00:07:01.237 00:07:02.176 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:02.176 Nvme0n1 : 9.00 34221.67 133.68 0.00 0.00 0.00 0.00 0.00 00:07:02.176 =================================================================================================================== 00:07:02.176 Total : 34221.67 133.68 0.00 0.00 0.00 0.00 0.00 00:07:02.176 00:07:03.115 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:03.115 Nvme0n1 : 10.00 34230.90 133.71 0.00 0.00 0.00 0.00 0.00 00:07:03.115 =================================================================================================================== 00:07:03.115 Total : 34230.90 133.71 0.00 0.00 0.00 0.00 0.00 00:07:03.115 00:07:03.115 00:07:03.115 Latency(us) 00:07:03.115 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:03.115 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:03.115 Nvme0n1 : 10.00 34230.74 133.71 0.00 0.00 3736.42 2393.49 12366.36 00:07:03.115 =================================================================================================================== 00:07:03.115 Total : 34230.74 133.71 0.00 0.00 3736.42 2393.49 12366.36 00:07:03.115 { 00:07:03.115 "results": [ 00:07:03.115 { 00:07:03.115 "job": "Nvme0n1", 00:07:03.115 "core_mask": "0x2", 00:07:03.115 "workload": "randwrite", 00:07:03.115 "status": "finished", 00:07:03.115 "queue_depth": 128, 00:07:03.115 "io_size": 4096, 00:07:03.115 "runtime": 10.003436, 00:07:03.115 "iops": 34230.73831831383, 00:07:03.115 "mibps": 133.7138215559134, 00:07:03.115 "io_failed": 0, 00:07:03.115 "io_timeout": 0, 00:07:03.115 "avg_latency_us": 3736.423293392714, 00:07:03.115 "min_latency_us": 2393.488695652174, 00:07:03.115 "max_latency_us": 12366.358260869565 00:07:03.115 } 00:07:03.115 ], 00:07:03.115 "core_count": 1 00:07:03.115 } 00:07:03.115 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@66 -- # killprocess 1686514 00:07:03.115 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@950 -- # '[' -z 1686514 ']' 00:07:03.115 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@954 -- # kill -0 1686514 00:07:03.115 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@955 -- # uname 00:07:03.115 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:03.115 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1686514 00:07:03.376 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:07:03.376 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:07:03.376 15:12:04 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1686514' 00:07:03.376 killing process with pid 1686514 00:07:03.376 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@969 -- # kill 1686514 00:07:03.376 Received shutdown signal, test time was about 10.000000 seconds 00:07:03.376 00:07:03.376 Latency(us) 00:07:03.376 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:03.376 =================================================================================================================== 00:07:03.376 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:07:03.376 15:12:04 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@974 -- # wait 1686514 00:07:03.376 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@68 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:07:03.635 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@69 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode0 00:07:03.894 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@70 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:03.894 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@70 -- # jq -r '.[0].free_clusters' 00:07:04.155 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@70 -- # free_clusters=61 00:07:04.155 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@72 -- # [[ '' == \d\i\r\t\y ]] 00:07:04.155 15:12:05 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@84 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_delete aio_bdev 00:07:04.416 [2024-09-27 15:12:06.061187] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev aio_bdev being removed: closing lvstore lvs 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@85 -- # NOT /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@650 -- # local es=0 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" 
in 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py ]] 00:07:04.416 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:04.676 request: 00:07:04.676 { 00:07:04.676 "uuid": "76c82256-e9a4-4f66-bcc9-5e666f957679", 00:07:04.676 "method": "bdev_lvol_get_lvstores", 00:07:04.676 "req_id": 1 00:07:04.676 } 00:07:04.676 Got JSON-RPC error response 00:07:04.676 response: 00:07:04.676 { 00:07:04.676 "code": -19, 00:07:04.676 "message": "No such device" 00:07:04.676 } 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@653 -- # es=1 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@86 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev aio_bdev 4096 00:07:04.676 aio_bdev 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@87 -- # waitforbdev 5a868cf2-3e5a-4f96-835e-b1b14b31c4cc 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@899 -- # local bdev_name=5a868cf2-3e5a-4f96-835e-b1b14b31c4cc 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@901 -- # local i 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:07:04.676 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:07:04.935 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b 5a868cf2-3e5a-4f96-835e-b1b14b31c4cc -t 2000 00:07:05.196 [ 00:07:05.196 { 00:07:05.196 "name": "5a868cf2-3e5a-4f96-835e-b1b14b31c4cc", 00:07:05.196 "aliases": [ 00:07:05.196 "lvs/lvol" 00:07:05.196 ], 
00:07:05.196 "product_name": "Logical Volume", 00:07:05.196 "block_size": 4096, 00:07:05.196 "num_blocks": 38912, 00:07:05.196 "uuid": "5a868cf2-3e5a-4f96-835e-b1b14b31c4cc", 00:07:05.196 "assigned_rate_limits": { 00:07:05.196 "rw_ios_per_sec": 0, 00:07:05.196 "rw_mbytes_per_sec": 0, 00:07:05.196 "r_mbytes_per_sec": 0, 00:07:05.196 "w_mbytes_per_sec": 0 00:07:05.196 }, 00:07:05.196 "claimed": false, 00:07:05.196 "zoned": false, 00:07:05.196 "supported_io_types": { 00:07:05.196 "read": true, 00:07:05.196 "write": true, 00:07:05.196 "unmap": true, 00:07:05.196 "flush": false, 00:07:05.196 "reset": true, 00:07:05.196 "nvme_admin": false, 00:07:05.196 "nvme_io": false, 00:07:05.196 "nvme_io_md": false, 00:07:05.196 "write_zeroes": true, 00:07:05.196 "zcopy": false, 00:07:05.196 "get_zone_info": false, 00:07:05.196 "zone_management": false, 00:07:05.196 "zone_append": false, 00:07:05.196 "compare": false, 00:07:05.196 "compare_and_write": false, 00:07:05.196 "abort": false, 00:07:05.196 "seek_hole": true, 00:07:05.196 "seek_data": true, 00:07:05.196 "copy": false, 00:07:05.196 "nvme_iov_md": false 00:07:05.196 }, 00:07:05.196 "driver_specific": { 00:07:05.196 "lvol": { 00:07:05.196 "lvol_store_uuid": "76c82256-e9a4-4f66-bcc9-5e666f957679", 00:07:05.196 "base_bdev": "aio_bdev", 00:07:05.196 "thin_provision": false, 00:07:05.196 "num_allocated_clusters": 38, 00:07:05.196 "snapshot": false, 00:07:05.196 "clone": false, 00:07:05.196 "esnap_clone": false 00:07:05.196 } 00:07:05.196 } 00:07:05.196 } 00:07:05.196 ] 00:07:05.196 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@907 -- # return 0 00:07:05.196 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@88 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:05.196 15:12:06 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@88 -- # jq -r '.[0].free_clusters' 00:07:05.460 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@88 -- # (( free_clusters == 61 )) 00:07:05.460 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@89 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:05.460 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@89 -- # jq -r '.[0].total_data_clusters' 00:07:05.460 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@89 -- # (( data_clusters == 99 )) 00:07:05.460 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@92 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete 5a868cf2-3e5a-4f96-835e-b1b14b31c4cc 00:07:05.720 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@93 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u 76c82256-e9a4-4f66-bcc9-5e666f957679 00:07:05.980 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@94 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_delete aio_bdev 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- target/nvmf_lvs_grow.sh@95 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 
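The tail of the clean test above is the persistence check: after the listener and subsystem are removed, the AIO base bdev is deleted (which closes the lvstore), bdev_lvol_get_lvstores is expected to fail with -19, and re-creating the AIO bdev must bring the grown lvstore back with 99 total and 61 free data clusters (99 minus the 38 clusters allocated to the lvol). A condensed sketch of that sequence, with paths shortened and UUIDs taken from this run:

    rpc=scripts/rpc.py
    lvs=76c82256-e9a4-4f66-bcc9-5e666f957679
    lvol=5a868cf2-3e5a-4f96-835e-b1b14b31c4cc

    $rpc bdev_aio_delete aio_bdev                        # closes lvstore lvs
    $rpc bdev_lvol_get_lvstores -u "$lvs" && exit 1      # must fail: -19, No such device
    $rpc bdev_aio_create test/nvmf/target/aio_bdev aio_bdev 4096
    $rpc bdev_get_bdevs -b "$lvol" -t 2000               # lvol reappears after examine
    $rpc bdev_lvol_get_lvstores -u "$lvs" | jq -r '.[0].free_clusters'   # expect 61
    $rpc bdev_lvol_delete "$lvol"
    $rpc bdev_lvol_delete_lvstore -u "$lvs"
    $rpc bdev_aio_delete aio_bdev
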
00:07:06.240 00:07:06.240 real 0m16.441s 00:07:06.240 user 0m16.397s 00:07:06.240 sys 0m1.264s 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_clean -- common/autotest_common.sh@10 -- # set +x 00:07:06.240 ************************************ 00:07:06.240 END TEST lvs_grow_clean 00:07:06.240 ************************************ 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@103 -- # run_test lvs_grow_dirty lvs_grow dirty 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:07:06.240 ************************************ 00:07:06.240 START TEST lvs_grow_dirty 00:07:06.240 ************************************ 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@1125 -- # lvs_grow dirty 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@15 -- # local aio_bdev lvs lvol 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@16 -- # local data_clusters free_clusters 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@17 -- # local bdevperf_pid run_test_pid 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@18 -- # local aio_init_size_mb=200 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@19 -- # local aio_final_size_mb=400 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@20 -- # local lvol_bdev_size_mb=150 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@23 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@24 -- # truncate -s 200M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:07:06.240 15:12:07 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@25 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev aio_bdev 4096 00:07:06.501 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@25 -- # aio_bdev=aio_bdev 00:07:06.501 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --cluster-sz 4194304 --md-pages-per-cluster-ratio 300 aio_bdev lvs 00:07:06.762 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@28 -- # lvs=ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:06.762 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 
00:07:06.762 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@29 -- # jq -r '.[0].total_data_clusters' 00:07:06.762 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@29 -- # data_clusters=49 00:07:06.762 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@30 -- # (( data_clusters == 49 )) 00:07:06.762 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b lvol 150 00:07:07.021 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@33 -- # lvol=c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:07.021 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@36 -- # truncate -s 400M /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:07:07.021 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@37 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_rescan aio_bdev 00:07:07.282 [2024-09-27 15:12:08.956371] bdev_aio.c:1044:bdev_aio_rescan: *NOTICE*: AIO device is resized: bdev name /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev, old block count 51200, new block count 102400 00:07:07.282 [2024-09-27 15:12:08.956444] vbdev_lvol.c: 165:vbdev_lvs_base_bdev_event_cb: *NOTICE*: Unsupported bdev event: type 1 00:07:07.282 true 00:07:07.282 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@38 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:07.282 15:12:08 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@38 -- # jq -r '.[0].total_data_clusters' 00:07:07.541 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@38 -- # (( data_clusters == 49 )) 00:07:07.541 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@41 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK0 00:07:07.541 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@42 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:07.801 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@43 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420 00:07:08.061 [2024-09-27 15:12:09.714802] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:07:08.061 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@44 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@48 -- # bdevperf_pid=1689229 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@49 -- # trap 'killprocess $bdevperf_pid; nvmftestfini; exit 1' 
SIGINT SIGTERM EXIT 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@47 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bdevperf.sock -m 0x2 -o 4096 -q 128 -w randwrite -t 10 -S 1 -z 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@50 -- # waitforlisten 1689229 /var/tmp/bdevperf.sock 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@831 -- # '[' -z 1689229 ']' 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:07:08.320 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:08.320 15:12:09 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@10 -- # set +x 00:07:08.320 [2024-09-27 15:12:09.956489] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:08.320 [2024-09-27 15:12:09.956549] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1689229 ] 00:07:08.321 [2024-09-27 15:12:10.043026] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.321 [2024-09-27 15:12:10.134629] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.258 15:12:10 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:09.258 15:12:10 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@864 -- # return 0 00:07:09.258 15:12:10 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@52 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode0 00:07:09.258 Nvme0n1 00:07:09.258 15:12:11 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@53 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_get_bdevs -b Nvme0n1 -t 3000 00:07:09.517 [ 00:07:09.517 { 00:07:09.517 "name": "Nvme0n1", 00:07:09.517 "aliases": [ 00:07:09.517 "c0c36789-27e9-455d-9034-de1ac59a99a3" 00:07:09.517 ], 00:07:09.517 "product_name": "NVMe disk", 00:07:09.517 "block_size": 4096, 00:07:09.517 "num_blocks": 38912, 00:07:09.517 "uuid": "c0c36789-27e9-455d-9034-de1ac59a99a3", 00:07:09.517 "numa_id": 0, 00:07:09.517 "assigned_rate_limits": { 00:07:09.517 "rw_ios_per_sec": 0, 00:07:09.517 "rw_mbytes_per_sec": 0, 00:07:09.517 "r_mbytes_per_sec": 0, 00:07:09.517 "w_mbytes_per_sec": 0 00:07:09.517 }, 00:07:09.517 "claimed": false, 00:07:09.517 "zoned": false, 00:07:09.517 "supported_io_types": { 00:07:09.517 "read": true, 
00:07:09.517 "write": true, 00:07:09.517 "unmap": true, 00:07:09.517 "flush": true, 00:07:09.517 "reset": true, 00:07:09.517 "nvme_admin": true, 00:07:09.517 "nvme_io": true, 00:07:09.517 "nvme_io_md": false, 00:07:09.517 "write_zeroes": true, 00:07:09.517 "zcopy": false, 00:07:09.517 "get_zone_info": false, 00:07:09.517 "zone_management": false, 00:07:09.517 "zone_append": false, 00:07:09.517 "compare": true, 00:07:09.517 "compare_and_write": true, 00:07:09.517 "abort": true, 00:07:09.517 "seek_hole": false, 00:07:09.517 "seek_data": false, 00:07:09.517 "copy": true, 00:07:09.517 "nvme_iov_md": false 00:07:09.517 }, 00:07:09.517 "memory_domains": [ 00:07:09.517 { 00:07:09.517 "dma_device_id": "SPDK_RDMA_DMA_DEVICE", 00:07:09.517 "dma_device_type": 0 00:07:09.517 } 00:07:09.517 ], 00:07:09.517 "driver_specific": { 00:07:09.517 "nvme": [ 00:07:09.517 { 00:07:09.517 "trid": { 00:07:09.517 "trtype": "RDMA", 00:07:09.517 "adrfam": "IPv4", 00:07:09.517 "traddr": "10.0.0.2", 00:07:09.517 "trsvcid": "4420", 00:07:09.517 "subnqn": "nqn.2016-06.io.spdk:cnode0" 00:07:09.517 }, 00:07:09.517 "ctrlr_data": { 00:07:09.517 "cntlid": 1, 00:07:09.517 "vendor_id": "0x8086", 00:07:09.517 "model_number": "SPDK bdev Controller", 00:07:09.517 "serial_number": "SPDK0", 00:07:09.517 "firmware_revision": "25.01", 00:07:09.517 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:07:09.517 "oacs": { 00:07:09.517 "security": 0, 00:07:09.517 "format": 0, 00:07:09.517 "firmware": 0, 00:07:09.517 "ns_manage": 0 00:07:09.517 }, 00:07:09.517 "multi_ctrlr": true, 00:07:09.517 "ana_reporting": false 00:07:09.517 }, 00:07:09.517 "vs": { 00:07:09.517 "nvme_version": "1.3" 00:07:09.517 }, 00:07:09.517 "ns_data": { 00:07:09.517 "id": 1, 00:07:09.517 "can_share": true 00:07:09.517 } 00:07:09.517 } 00:07:09.517 ], 00:07:09.517 "mp_policy": "active_passive" 00:07:09.517 } 00:07:09.517 } 00:07:09.517 ] 00:07:09.517 15:12:11 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@56 -- # run_test_pid=1689406 00:07:09.517 15:12:11 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@57 -- # sleep 2 00:07:09.517 15:12:11 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@55 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests 00:07:09.777 Running I/O for 10 seconds... 
00:07:10.714 Latency(us) 00:07:10.714 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:10.714 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:10.714 Nvme0n1 : 1.00 33697.00 131.63 0.00 0.00 0.00 0.00 0.00 00:07:10.714 =================================================================================================================== 00:07:10.714 Total : 33697.00 131.63 0.00 0.00 0.00 0.00 0.00 00:07:10.714 00:07:11.652 15:12:13 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_grow_lvstore -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:11.652 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:11.652 Nvme0n1 : 2.00 33986.00 132.76 0.00 0.00 0.00 0.00 0.00 00:07:11.652 =================================================================================================================== 00:07:11.652 Total : 33986.00 132.76 0.00 0.00 0.00 0.00 0.00 00:07:11.652 00:07:11.652 true 00:07:11.652 15:12:13 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@61 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:11.652 15:12:13 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@61 -- # jq -r '.[0].total_data_clusters' 00:07:11.910 15:12:13 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@61 -- # data_clusters=99 00:07:11.910 15:12:13 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@62 -- # (( data_clusters == 99 )) 00:07:11.910 15:12:13 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@65 -- # wait 1689406 00:07:12.846 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:12.846 Nvme0n1 : 3.00 34155.00 133.42 0.00 0.00 0.00 0.00 0.00 00:07:12.846 =================================================================================================================== 00:07:12.846 Total : 34155.00 133.42 0.00 0.00 0.00 0.00 0.00 00:07:12.846 00:07:13.781 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:13.781 Nvme0n1 : 4.00 34201.75 133.60 0.00 0.00 0.00 0.00 0.00 00:07:13.781 =================================================================================================================== 00:07:13.781 Total : 34201.75 133.60 0.00 0.00 0.00 0.00 0.00 00:07:13.781 00:07:14.717 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:14.717 Nvme0n1 : 5.00 34311.40 134.03 0.00 0.00 0.00 0.00 0.00 00:07:14.717 =================================================================================================================== 00:07:14.717 Total : 34311.40 134.03 0.00 0.00 0.00 0.00 0.00 00:07:14.717 00:07:15.653 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:15.653 Nvme0n1 : 6.00 34383.67 134.31 0.00 0.00 0.00 0.00 0.00 00:07:15.653 =================================================================================================================== 00:07:15.653 Total : 34383.67 134.31 0.00 0.00 0.00 0.00 0.00 00:07:15.653 00:07:16.673 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:16.673 Nvme0n1 : 7.00 34423.71 134.47 0.00 0.00 0.00 0.00 0.00 00:07:16.673 
=================================================================================================================== 00:07:16.673 Total : 34423.71 134.47 0.00 0.00 0.00 0.00 0.00 00:07:16.673 00:07:17.611 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:17.611 Nvme0n1 : 8.00 34468.50 134.64 0.00 0.00 0.00 0.00 0.00 00:07:17.611 =================================================================================================================== 00:07:17.611 Total : 34468.50 134.64 0.00 0.00 0.00 0.00 0.00 00:07:17.611 00:07:18.991 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:18.991 Nvme0n1 : 9.00 34467.44 134.64 0.00 0.00 0.00 0.00 0.00 00:07:18.991 =================================================================================================================== 00:07:18.991 Total : 34467.44 134.64 0.00 0.00 0.00 0.00 0.00 00:07:18.991 00:07:19.561 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:19.561 Nvme0n1 : 10.00 34493.00 134.74 0.00 0.00 0.00 0.00 0.00 00:07:19.561 =================================================================================================================== 00:07:19.561 Total : 34493.00 134.74 0.00 0.00 0.00 0.00 0.00 00:07:19.561 00:07:19.821 00:07:19.821 Latency(us) 00:07:19.821 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:19.821 Job: Nvme0n1 (Core Mask 0x2, workload: randwrite, depth: 128, IO size: 4096) 00:07:19.821 Nvme0n1 : 10.00 34492.49 134.74 0.00 0.00 3708.38 2649.93 12708.29 00:07:19.821 =================================================================================================================== 00:07:19.821 Total : 34492.49 134.74 0.00 0.00 3708.38 2649.93 12708.29 00:07:19.821 { 00:07:19.821 "results": [ 00:07:19.821 { 00:07:19.821 "job": "Nvme0n1", 00:07:19.821 "core_mask": "0x2", 00:07:19.821 "workload": "randwrite", 00:07:19.821 "status": "finished", 00:07:19.821 "queue_depth": 128, 00:07:19.821 "io_size": 4096, 00:07:19.821 "runtime": 10.003772, 00:07:19.821 "iops": 34492.48943298588, 00:07:19.821 "mibps": 134.7362868476011, 00:07:19.821 "io_failed": 0, 00:07:19.821 "io_timeout": 0, 00:07:19.821 "avg_latency_us": 3708.378445447575, 00:07:19.821 "min_latency_us": 2649.9339130434782, 00:07:19.821 "max_latency_us": 12708.285217391305 00:07:19.821 } 00:07:19.821 ], 00:07:19.821 "core_count": 1 00:07:19.821 } 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@66 -- # killprocess 1689229 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@950 -- # '[' -z 1689229 ']' 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@954 -- # kill -0 1689229 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@955 -- # uname 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1689229 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:07:19.821 15:12:21 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1689229' 00:07:19.821 killing process with pid 1689229 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@969 -- # kill 1689229 00:07:19.821 Received shutdown signal, test time was about 10.000000 seconds 00:07:19.821 00:07:19.821 Latency(us) 00:07:19.821 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:19.821 =================================================================================================================== 00:07:19.821 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:07:19.821 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@974 -- # wait 1689229 00:07:20.081 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@68 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:07:20.081 15:12:21 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@69 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode0 00:07:20.341 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@70 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:20.341 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@70 -- # jq -r '.[0].free_clusters' 00:07:20.600 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@70 -- # free_clusters=61 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@72 -- # [[ dirty == \d\i\r\t\y ]] 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@74 -- # kill -9 1685933 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@75 -- # wait 1685933 00:07:20.601 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_lvs_grow.sh: line 75: 1685933 Killed "${NVMF_APP[@]}" "$@" 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@75 -- # true 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@76 -- # nvmfappstart -m 0x1 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@10 -- # set +x 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- nvmf/common.sh@324 -- # nvmfpid=1690867 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- nvmf/common.sh@325 -- # waitforlisten 1690867 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@831 -- # '[' -z 1690867 ']' 00:07:20.601 
15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:20.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:20.601 15:12:22 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@10 -- # set +x 00:07:20.601 [2024-09-27 15:12:22.438552] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:20.601 [2024-09-27 15:12:22.438618] [ DPDK EAL parameters: nvmf -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:20.861 [2024-09-27 15:12:22.520132] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.861 [2024-09-27 15:12:22.609116] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:07:20.861 [2024-09-27 15:12:22.609156] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:07:20.861 [2024-09-27 15:12:22.609166] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:20.861 [2024-09-27 15:12:22.609191] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:20.861 [2024-09-27 15:12:22.609198] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
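At this point the original nvmf target (pid 1685933) has been killed with -9 while the grown lvstore was still dirty, and a fresh nvmf_tgt (pid 1690867) has been started in its place. What follows is the recovery check: the AIO backing file is re-registered, the blobstore replays its metadata ("Performing recovery on blobstore"), and the test verifies that both the logical volume and the grown cluster counts survived. A minimal sketch of that sequence, using the same rpc.py calls and the same UUIDs that appear in the trace ($TESTDIR stands in for /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target):

# re-register the backing file; lvstore examine/recovery runs as part of this
scripts/rpc.py bdev_aio_create $TESTDIR/aio_bdev aio_bdev 4096
scripts/rpc.py bdev_wait_for_examine

# the logical volume should reappear with its allocated clusters intact
scripts/rpc.py bdev_get_bdevs -b c0c36789-27e9-455d-9034-de1ac59a99a3 -t 2000

# and the lvstore should still report the post-grow geometry (99 data clusters, 61 free)
scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b | jq -r '.[0].free_clusters'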
00:07:20.861 [2024-09-27 15:12:22.609219] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.430 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:21.430 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@864 -- # return 0 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@10 -- # set +x 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@77 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev aio_bdev 4096 00:07:21.690 [2024-09-27 15:12:23.497470] blobstore.c:4875:bs_recover: *NOTICE*: Performing recovery on blobstore 00:07:21.690 [2024-09-27 15:12:23.497572] blobstore.c:4822:bs_load_replay_md_cpl: *NOTICE*: Recover: blob 0x0 00:07:21.690 [2024-09-27 15:12:23.497599] blobstore.c:4822:bs_load_replay_md_cpl: *NOTICE*: Recover: blob 0x1 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@77 -- # aio_bdev=aio_bdev 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@78 -- # waitforbdev c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@899 -- # local bdev_name=c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@901 -- # local i 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:07:21.690 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:07:21.949 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b c0c36789-27e9-455d-9034-de1ac59a99a3 -t 2000 00:07:22.209 [ 00:07:22.209 { 00:07:22.209 "name": "c0c36789-27e9-455d-9034-de1ac59a99a3", 00:07:22.209 "aliases": [ 00:07:22.209 "lvs/lvol" 00:07:22.209 ], 00:07:22.209 "product_name": "Logical Volume", 00:07:22.209 "block_size": 4096, 00:07:22.209 "num_blocks": 38912, 00:07:22.209 "uuid": "c0c36789-27e9-455d-9034-de1ac59a99a3", 00:07:22.209 "assigned_rate_limits": { 00:07:22.209 "rw_ios_per_sec": 0, 00:07:22.209 "rw_mbytes_per_sec": 0, 00:07:22.209 "r_mbytes_per_sec": 0, 00:07:22.209 "w_mbytes_per_sec": 0 00:07:22.209 }, 00:07:22.209 "claimed": false, 00:07:22.209 "zoned": false, 
00:07:22.209 "supported_io_types": { 00:07:22.209 "read": true, 00:07:22.209 "write": true, 00:07:22.209 "unmap": true, 00:07:22.209 "flush": false, 00:07:22.209 "reset": true, 00:07:22.209 "nvme_admin": false, 00:07:22.209 "nvme_io": false, 00:07:22.209 "nvme_io_md": false, 00:07:22.209 "write_zeroes": true, 00:07:22.209 "zcopy": false, 00:07:22.209 "get_zone_info": false, 00:07:22.209 "zone_management": false, 00:07:22.209 "zone_append": false, 00:07:22.209 "compare": false, 00:07:22.209 "compare_and_write": false, 00:07:22.209 "abort": false, 00:07:22.209 "seek_hole": true, 00:07:22.209 "seek_data": true, 00:07:22.209 "copy": false, 00:07:22.209 "nvme_iov_md": false 00:07:22.209 }, 00:07:22.209 "driver_specific": { 00:07:22.209 "lvol": { 00:07:22.209 "lvol_store_uuid": "ad51237c-8ca3-4a53-aeb8-42cc7bb2636b", 00:07:22.209 "base_bdev": "aio_bdev", 00:07:22.209 "thin_provision": false, 00:07:22.209 "num_allocated_clusters": 38, 00:07:22.209 "snapshot": false, 00:07:22.209 "clone": false, 00:07:22.209 "esnap_clone": false 00:07:22.209 } 00:07:22.209 } 00:07:22.209 } 00:07:22.209 ] 00:07:22.209 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@907 -- # return 0 00:07:22.209 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@79 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:22.209 15:12:23 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@79 -- # jq -r '.[0].free_clusters' 00:07:22.468 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@79 -- # (( free_clusters == 61 )) 00:07:22.468 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@80 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:22.468 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@80 -- # jq -r '.[0].total_data_clusters' 00:07:22.728 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@80 -- # (( data_clusters == 99 )) 00:07:22.728 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@84 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_delete aio_bdev 00:07:22.728 [2024-09-27 15:12:24.494258] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev aio_bdev being removed: closing lvstore lvs 00:07:22.728 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@85 -- # NOT /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:22.728 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@650 -- # local es=0 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" 
in 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py ]] 00:07:22.729 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:22.989 request: 00:07:22.989 { 00:07:22.989 "uuid": "ad51237c-8ca3-4a53-aeb8-42cc7bb2636b", 00:07:22.989 "method": "bdev_lvol_get_lvstores", 00:07:22.989 "req_id": 1 00:07:22.989 } 00:07:22.989 Got JSON-RPC error response 00:07:22.989 response: 00:07:22.989 { 00:07:22.989 "code": -19, 00:07:22.989 "message": "No such device" 00:07:22.989 } 00:07:22.989 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@653 -- # es=1 00:07:22.989 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:22.989 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:22.989 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:22.989 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@86 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev aio_bdev 4096 00:07:23.248 aio_bdev 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@87 -- # waitforbdev c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@899 -- # local bdev_name=c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@901 -- # local i 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:07:23.248 15:12:24 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:07:23.508 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty 
-- common/autotest_common.sh@906 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b c0c36789-27e9-455d-9034-de1ac59a99a3 -t 2000 00:07:23.508 [ 00:07:23.508 { 00:07:23.508 "name": "c0c36789-27e9-455d-9034-de1ac59a99a3", 00:07:23.508 "aliases": [ 00:07:23.508 "lvs/lvol" 00:07:23.508 ], 00:07:23.508 "product_name": "Logical Volume", 00:07:23.508 "block_size": 4096, 00:07:23.508 "num_blocks": 38912, 00:07:23.508 "uuid": "c0c36789-27e9-455d-9034-de1ac59a99a3", 00:07:23.508 "assigned_rate_limits": { 00:07:23.508 "rw_ios_per_sec": 0, 00:07:23.508 "rw_mbytes_per_sec": 0, 00:07:23.508 "r_mbytes_per_sec": 0, 00:07:23.508 "w_mbytes_per_sec": 0 00:07:23.508 }, 00:07:23.508 "claimed": false, 00:07:23.508 "zoned": false, 00:07:23.508 "supported_io_types": { 00:07:23.508 "read": true, 00:07:23.508 "write": true, 00:07:23.508 "unmap": true, 00:07:23.508 "flush": false, 00:07:23.508 "reset": true, 00:07:23.508 "nvme_admin": false, 00:07:23.508 "nvme_io": false, 00:07:23.508 "nvme_io_md": false, 00:07:23.508 "write_zeroes": true, 00:07:23.508 "zcopy": false, 00:07:23.508 "get_zone_info": false, 00:07:23.508 "zone_management": false, 00:07:23.508 "zone_append": false, 00:07:23.508 "compare": false, 00:07:23.508 "compare_and_write": false, 00:07:23.508 "abort": false, 00:07:23.508 "seek_hole": true, 00:07:23.508 "seek_data": true, 00:07:23.508 "copy": false, 00:07:23.508 "nvme_iov_md": false 00:07:23.508 }, 00:07:23.508 "driver_specific": { 00:07:23.508 "lvol": { 00:07:23.508 "lvol_store_uuid": "ad51237c-8ca3-4a53-aeb8-42cc7bb2636b", 00:07:23.508 "base_bdev": "aio_bdev", 00:07:23.508 "thin_provision": false, 00:07:23.508 "num_allocated_clusters": 38, 00:07:23.508 "snapshot": false, 00:07:23.508 "clone": false, 00:07:23.508 "esnap_clone": false 00:07:23.508 } 00:07:23.508 } 00:07:23.508 } 00:07:23.508 ] 00:07:23.508 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@907 -- # return 0 00:07:23.508 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@88 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:23.508 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@88 -- # jq -r '.[0].free_clusters' 00:07:23.768 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@88 -- # (( free_clusters == 61 )) 00:07:23.768 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@89 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_get_lvstores -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:23.768 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@89 -- # jq -r '.[0].total_data_clusters' 00:07:24.027 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@89 -- # (( data_clusters == 99 )) 00:07:24.027 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@92 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete c0c36789-27e9-455d-9034-de1ac59a99a3 00:07:24.286 15:12:25 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@93 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u ad51237c-8ca3-4a53-aeb8-42cc7bb2636b 00:07:24.286 15:12:26 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@94 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_aio_delete aio_bdev 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- target/nvmf_lvs_grow.sh@95 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/aio_bdev 00:07:24.545 00:07:24.545 real 0m18.363s 00:07:24.545 user 0m47.337s 00:07:24.545 sys 0m3.594s 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow.lvs_grow_dirty -- common/autotest_common.sh@10 -- # set +x 00:07:24.545 ************************************ 00:07:24.545 END TEST lvs_grow_dirty 00:07:24.545 ************************************ 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@1 -- # process_shm --id 0 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@808 -- # type=--id 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@809 -- # id=0 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@810 -- # '[' --id = --pid ']' 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@814 -- # find /dev/shm -name '*.0' -printf '%f\n' 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@814 -- # shm_files=nvmf_trace.0 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@816 -- # [[ -z nvmf_trace.0 ]] 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@820 -- # for n in $shm_files 00:07:24.545 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@821 -- # tar -C /dev/shm/ -cvzf /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/nvmf_trace.0_shm.tar.gz nvmf_trace.0 00:07:24.545 nvmf_trace.0 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@823 -- # return 0 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- target/nvmf_lvs_grow.sh@1 -- # nvmftestfini 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@331 -- # nvmfcleanup 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@99 -- # sync 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@102 -- # set +e 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@103 -- # for i in {1..20} 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:07:24.805 rmmod nvme_rdma 00:07:24.805 rmmod nvme_fabrics 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@106 -- # set -e 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@107 -- # return 0 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@332 -- # '[' -n 1690867 ']' 00:07:24.805 15:12:26 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@333 -- # killprocess 1690867 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@950 -- # '[' -z 1690867 ']' 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@954 -- # kill -0 1690867 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@955 -- # uname 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1690867 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1690867' 00:07:24.805 killing process with pid 1690867 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@969 -- # kill 1690867 00:07:24.805 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@974 -- # wait 1690867 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@338 -- # nvmf_fini 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@264 -- # local dev 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@267 -- # remove_target_ns 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@22 -- # _remove_target_ns 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@268 -- # delete_main_bridge 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@130 -- # return 0 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:07:25.065 15:12:26 
nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@41 -- # _dev=0 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@41 -- # dev_map=() 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/setup.sh@284 -- # iptr 00:07:25.065 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@538 -- # iptables-save 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- nvmf/common.sh@538 -- # iptables-restore 00:07:25.066 00:07:25.066 real 0m43.863s 00:07:25.066 user 1m10.581s 00:07:25.066 sys 0m10.773s 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_lvs_grow -- common/autotest_common.sh@10 -- # set +x 00:07:25.066 ************************************ 00:07:25.066 END TEST nvmf_lvs_grow 00:07:25.066 ************************************ 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@24 -- # run_test nvmf_bdev_io_wait /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdev_io_wait.sh --transport=rdma 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:07:25.066 ************************************ 00:07:25.066 START TEST nvmf_bdev_io_wait 00:07:25.066 ************************************ 00:07:25.066 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdev_io_wait.sh --transport=rdma 00:07:25.326 * Looking for test storage... 
00:07:25.326 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:07:25.326 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:25.326 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1681 -- # lcov --version 00:07:25.326 15:12:26 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@336 -- # IFS=.-: 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@336 -- # read -ra ver1 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@337 -- # IFS=.-: 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@337 -- # read -ra ver2 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@338 -- # local 'op=<' 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@340 -- # ver1_l=2 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@341 -- # ver2_l=1 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@344 -- # case "$op" in 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@345 -- # : 1 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@365 -- # decimal 1 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@353 -- # local d=1 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@355 -- # echo 1 00:07:25.326 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@365 -- # ver1[v]=1 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@366 -- # decimal 2 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@353 -- # local d=2 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@355 -- # echo 2 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@366 -- # ver2[v]=2 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@368 -- # return 0 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:25.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:25.327 --rc genhtml_branch_coverage=1 00:07:25.327 --rc genhtml_function_coverage=1 00:07:25.327 --rc genhtml_legend=1 00:07:25.327 --rc geninfo_all_blocks=1 00:07:25.327 --rc geninfo_unexecuted_blocks=1 00:07:25.327 00:07:25.327 ' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:25.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:25.327 --rc genhtml_branch_coverage=1 00:07:25.327 --rc genhtml_function_coverage=1 00:07:25.327 --rc genhtml_legend=1 00:07:25.327 --rc geninfo_all_blocks=1 00:07:25.327 --rc geninfo_unexecuted_blocks=1 00:07:25.327 00:07:25.327 ' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:25.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:25.327 --rc genhtml_branch_coverage=1 00:07:25.327 --rc genhtml_function_coverage=1 00:07:25.327 --rc genhtml_legend=1 00:07:25.327 --rc geninfo_all_blocks=1 00:07:25.327 --rc geninfo_unexecuted_blocks=1 00:07:25.327 00:07:25.327 ' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:25.327 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:25.327 --rc genhtml_branch_coverage=1 00:07:25.327 --rc genhtml_function_coverage=1 00:07:25.327 --rc genhtml_legend=1 00:07:25.327 --rc geninfo_all_blocks=1 00:07:25.327 --rc geninfo_unexecuted_blocks=1 00:07:25.327 00:07:25.327 ' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:07:25.327 15:12:27 
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@7 -- # uname -s 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@15 -- # shopt -s extglob 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- paths/export.sh@5 -- # export PATH 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@50 -- # : 0 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:07:25.327 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:07:25.327 15:12:27 
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@54 -- # have_pci_nics=0 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@11 -- # MALLOC_BDEV_SIZE=64 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@14 -- # nvmftestinit 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@292 -- # prepare_net_devs 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@254 -- # local -g is_hw=no 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@256 -- # remove_target_ns 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@22 -- # _remove_target_ns 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@125 -- # xtrace_disable 00:07:25.327 15:12:27 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@131 -- # pci_devs=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@131 -- # local -a pci_devs 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@132 -- # pci_net_devs=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@133 -- # pci_drivers=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@133 -- # local -A pci_drivers 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@135 -- # net_devs=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@135 -- # local -ga net_devs 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@136 -- # e810=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@136 -- # local -ga e810 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@137 -- # x722=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@137 -- # local -ga x722 
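The device discovery that follows is noisy in xtrace form, but it reduces to: build a list of supported Intel and Mellanox PCI device IDs, keep only the Mellanox entries (the run uses mlx5 NICs, hence the "[[ mlx5 == mlx5 ]]" branch), and map each matching PCI function to its kernel netdev through sysfs. A rough sketch of that sysfs lookup for the two 0x15b3:0x1015 functions found on this node, not the helper's exact code:

# the functions reported below are 0000:18:00.0 and 0000:18:00.1;
# their netdev names come straight from sysfs, as in pci_net_devs=(".../net/"*)
for pci in 0000:18:00.0 0000:18:00.1; do
    ls "/sys/bus/pci/devices/$pci/net/"     # -> mlx_0_0, mlx_0_1
done

Those two interfaces, mlx_0_0 and mlx_0_1, are the RDMA test ports used throughout the run.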
00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@138 -- # mlx=() 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@138 -- # local -ga mlx 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:07:33.455 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme 
connect -i 15' 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:07:33.455 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:07:33.455 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:07:33.456 Found net devices under 0000:18:00.0: mlx_0_0 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:07:33.456 Found net devices under 0000:18:00.1: mlx_0_1 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@249 -- # get_rdma_if_list 00:07:33.456 15:12:33 
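The helper traced above classifies NICs by PCI vendor/device ID (Intel e810/x722 vs. Mellanox mlx) and, because SPDK_TEST_NVMF_NICS=mlx5, keeps only the two Mellanox ports before resolving each PCI address to its kernel net device through sysfs. A minimal stand-alone sketch of that lookup, not the helper itself (the lspci call is an assumption for illustration; the real helper works from a pre-built PCI cache, and the mlx_0_0/mlx_0_1 names come from this run):

  # Sketch only: keep Mellanox NICs, then map PCI address -> netdev via sysfs
  mellanox=15b3
  pci_devs=($(lspci -Dn -d "${mellanox}:" | awk '{print $1}'))   # e.g. 0000:18:00.0 0000:18:00.1
  net_devs=()
  for pci in "${pci_devs[@]}"; do
      for path in "/sys/bus/pci/devices/$pci/net/"*; do
          [[ -e $path ]] && net_devs+=("${path##*/}")            # e.g. mlx_0_0, mlx_0_1
      done
  done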
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@75 -- # rdma_devs=() 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@89 -- # continue 2 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@89 -- # continue 2 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@258 -- # is_hw=yes 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@61 -- # uname 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@65 -- # modprobe ib_cm 00:07:33.456 
15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@66 -- # modprobe ib_core 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@67 -- # modprobe ib_umad 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@69 -- # modprobe iw_cm 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@27 -- # local -gA dev_map 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@28 -- # local -g _dev 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@44 -- # ips=() 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@58 -- # key_initiator=target1 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:07:33.456 15:12:33 
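Before any addresses are assigned, load_ib_rdma_modules pulls in the kernel RDMA stack, and setup_interface_pair then maps the two ports to roles for this run (mlx_0_0 on the initiator side, mlx_0_1 as the target side). Condensed from the modprobe calls traced above; the module list is exactly what the trace shows, nothing added:

  for mod in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm; do
      modprobe "$mod"          # same modules, same order as the trace
  done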
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@11 -- # local val=167772161 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:07:33.456 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:07:33.457 10.0.0.1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@11 -- # local val=167772162 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:07:33.457 10.0.0.2 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait 
-- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@38 -- # ping_ips 1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@107 -- # local dev=target0 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:33.457 15:12:33 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- 
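The set_ip/set_up helpers write each address twice: once to the interface and once to the sysfs ifalias, which later lookups read back instead of parsing `ip addr` output. The integer 167772161 is simply 10.0.0.1 packed into 32 bits. A condensed replay of the commands above, using the values from this run:

  # 167772161 == 0x0A000001; the val_to_ip helper unpacks it to a dotted quad
  val=167772161
  printf '%u.%u.%u.%u\n' $((val >> 24)) $((val >> 16 & 255)) $((val >> 8 & 255)) $((val & 255))   # 10.0.0.1

  ip addr add 10.0.0.1/24 dev mlx_0_0
  echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias
  ip addr add 10.0.0.2/24 dev mlx_0_1
  echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
  ip link set mlx_0_0 up
  ip link set mlx_0_1 up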
nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:07:33.457 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:07:33.457 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:07:33.457 00:07:33.457 --- 10.0.0.2 ping statistics --- 00:07:33.457 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:07:33.457 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@107 -- # local dev=target0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:07:33.457 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:07:33.457 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.028 ms 00:07:33.457 00:07:33.457 --- 10.0.0.2 ping statistics --- 00:07:33.457 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:07:33.457 rtt min/avg/max/mdev = 0.028/0.028/0.028/0.000 ms 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@98 -- # (( pair++ )) 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@266 -- # return 0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@107 -- # local dev=target0 00:07:33.457 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:07:33.458 15:12:34 
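ping_ips then verifies reachability by reading the peer address back from the ifalias file and sending a single ICMP echo; with physical ports and no separate target namespace both checks resolve to the same 10.0.0.2, which is why the target address is pinged twice above. A minimal equivalent of one round, with the device name from this run:

  ip=$(cat /sys/class/net/mlx_0_1/ifalias)   # 10.0.0.2 in this run
  ping -c 1 "$ip"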
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # get_net_dev target1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@107 -- # local dev=target1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@107 -- # local dev=target0 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:33.458 15:12:34 
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # get_net_dev target1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@107 -- # local dev=target1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@15 -- # nvmfappstart -m 0xF --wait-for-rpc 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@324 -- # nvmfpid=1694500 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 
0xFFFF -m 0xF --wait-for-rpc 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@325 -- # waitforlisten 1694500 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@831 -- # '[' -z 1694500 ']' 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:33.458 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:33.458 15:12:34 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.458 [2024-09-27 15:12:34.212856] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:33.458 [2024-09-27 15:12:34.212921] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:33.458 [2024-09-27 15:12:34.298491] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:33.458 [2024-09-27 15:12:34.391276] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:07:33.458 [2024-09-27 15:12:34.391320] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:07:33.458 [2024-09-27 15:12:34.391330] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:33.458 [2024-09-27 15:12:34.391338] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:33.458 [2024-09-27 15:12:34.391349] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
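nvmf_legacy_env condenses the interface map into the variables the test scripts consume, and nvmfappstart then launches the target and blocks until its RPC socket answers. The values below are the ones resolved in this run; the polling loop is only a sketch of what waitforlisten does (the rpc_get_methods probe is an assumption, not copied from the trace):

  # Resolved in this run (read back from the ifalias files):
  NVMF_TARGET_INTERFACE=mlx_0_1    NVMF_FIRST_TARGET_IP=10.0.0.2     NVMF_SECOND_TARGET_IP=10.0.0.1
  NVMF_TARGET_INTERFACE2=mlx_0_0   NVMF_FIRST_INITIATOR_IP=10.0.0.2  NVMF_SECOND_INITIATOR_IP=10.0.0.1
  NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024'

  # Target launch as traced above (pid 1694500 in this run)
  build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF --wait-for-rpc &
  nvmfpid=$!
  until scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.1                                # roughly what waitforlisten does
  done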
00:07:33.458 [2024-09-27 15:12:34.391415] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.459 [2024-09-27 15:12:34.391523] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.459 [2024-09-27 15:12:34.391626] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.459 [2024-09-27 15:12:34.391627] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@864 -- # return 0 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@18 -- # rpc_cmd bdev_set_options -p 5 -c 1 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@19 -- # rpc_cmd framework_start_init 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@20 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.459 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.459 [2024-09-27 15:12:35.222813] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x6404f0/0x6449e0) succeed. 00:07:33.459 [2024-09-27 15:12:35.232882] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x641b30/0x686080) succeed. 
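With the app up in --wait-for-rpc mode, the test drives it entirely over JSON-RPC: bdev options first, then framework_start_init, then the RDMA transport (which triggers the two create_ib_device notices above); the Malloc0 bdev, subsystem, namespace and listener calls follow in the trace just below. rpc_cmd is the suite's wrapper around scripts/rpc.py, so the same sequence issued directly would look roughly like:

  scripts/rpc.py bdev_set_options -p 5 -c 1
  scripts/rpc.py framework_start_init
  scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  scripts/rpc.py bdev_malloc_create 64 512 -b Malloc0
  scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
  scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
  scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420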
00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@22 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.720 Malloc0 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@24 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@25 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:33.720 [2024-09-27 15:12:35.424368] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@28 -- # WRITE_PID=1694707 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@27 -- # gen_nvmf_target_json 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -m 0x10 -i 1 --json /dev/fd/63 -q 128 -o 4096 -w write -t 1 -s 256 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@30 -- # READ_PID=1694709 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # config=() 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # local subsystem config 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:07:33.720 { 00:07:33.720 "params": { 00:07:33.720 "name": "Nvme$subsystem", 00:07:33.720 "trtype": "$TEST_TRANSPORT", 00:07:33.720 
"traddr": "$NVMF_FIRST_TARGET_IP", 00:07:33.720 "adrfam": "ipv4", 00:07:33.720 "trsvcid": "$NVMF_PORT", 00:07:33.720 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:07:33.720 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:07:33.720 "hdgst": ${hdgst:-false}, 00:07:33.720 "ddgst": ${ddgst:-false} 00:07:33.720 }, 00:07:33.720 "method": "bdev_nvme_attach_controller" 00:07:33.720 } 00:07:33.720 EOF 00:07:33.720 )") 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -m 0x20 -i 2 --json /dev/fd/63 -q 128 -o 4096 -w read -t 1 -s 256 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@29 -- # gen_nvmf_target_json 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@32 -- # FLUSH_PID=1694711 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # config=() 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # local subsystem config 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:07:33.720 { 00:07:33.720 "params": { 00:07:33.720 "name": "Nvme$subsystem", 00:07:33.720 "trtype": "$TEST_TRANSPORT", 00:07:33.720 "traddr": "$NVMF_FIRST_TARGET_IP", 00:07:33.720 "adrfam": "ipv4", 00:07:33.720 "trsvcid": "$NVMF_PORT", 00:07:33.720 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:07:33.720 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:07:33.720 "hdgst": ${hdgst:-false}, 00:07:33.720 "ddgst": ${ddgst:-false} 00:07:33.720 }, 00:07:33.720 "method": "bdev_nvme_attach_controller" 00:07:33.720 } 00:07:33.720 EOF 00:07:33.720 )") 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -m 0x40 -i 3 --json /dev/fd/63 -q 128 -o 4096 -w flush -t 1 -s 256 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@31 -- # gen_nvmf_target_json 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@34 -- # UNMAP_PID=1694714 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # cat 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@35 -- # sync 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # config=() 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # local subsystem config 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:07:33.720 { 00:07:33.720 "params": { 00:07:33.720 "name": "Nvme$subsystem", 00:07:33.720 "trtype": "$TEST_TRANSPORT", 00:07:33.720 "traddr": "$NVMF_FIRST_TARGET_IP", 00:07:33.720 "adrfam": "ipv4", 00:07:33.720 "trsvcid": "$NVMF_PORT", 00:07:33.720 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:07:33.720 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:07:33.720 "hdgst": ${hdgst:-false}, 00:07:33.720 "ddgst": ${ddgst:-false} 00:07:33.720 }, 00:07:33.720 "method": 
"bdev_nvme_attach_controller" 00:07:33.720 } 00:07:33.720 EOF 00:07:33.720 )") 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -m 0x80 -i 4 --json /dev/fd/63 -q 128 -o 4096 -w unmap -t 1 -s 256 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@33 -- # gen_nvmf_target_json 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # cat 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # config=() 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@368 -- # local subsystem config 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:07:33.720 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:07:33.720 { 00:07:33.720 "params": { 00:07:33.720 "name": "Nvme$subsystem", 00:07:33.720 "trtype": "$TEST_TRANSPORT", 00:07:33.720 "traddr": "$NVMF_FIRST_TARGET_IP", 00:07:33.720 "adrfam": "ipv4", 00:07:33.720 "trsvcid": "$NVMF_PORT", 00:07:33.720 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:07:33.720 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:07:33.720 "hdgst": ${hdgst:-false}, 00:07:33.721 "ddgst": ${ddgst:-false} 00:07:33.721 }, 00:07:33.721 "method": "bdev_nvme_attach_controller" 00:07:33.721 } 00:07:33.721 EOF 00:07:33.721 )") 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # cat 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@37 -- # wait 1694707 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@390 -- # cat 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@392 -- # jq . 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@392 -- # jq . 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@392 -- # jq . 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@393 -- # IFS=, 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:07:33.721 "params": { 00:07:33.721 "name": "Nvme1", 00:07:33.721 "trtype": "rdma", 00:07:33.721 "traddr": "10.0.0.2", 00:07:33.721 "adrfam": "ipv4", 00:07:33.721 "trsvcid": "4420", 00:07:33.721 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:07:33.721 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:07:33.721 "hdgst": false, 00:07:33.721 "ddgst": false 00:07:33.721 }, 00:07:33.721 "method": "bdev_nvme_attach_controller" 00:07:33.721 }' 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@393 -- # IFS=, 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:07:33.721 "params": { 00:07:33.721 "name": "Nvme1", 00:07:33.721 "trtype": "rdma", 00:07:33.721 "traddr": "10.0.0.2", 00:07:33.721 "adrfam": "ipv4", 00:07:33.721 "trsvcid": "4420", 00:07:33.721 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:07:33.721 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:07:33.721 "hdgst": false, 00:07:33.721 "ddgst": false 00:07:33.721 }, 00:07:33.721 "method": "bdev_nvme_attach_controller" 00:07:33.721 }' 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@392 -- # jq . 
00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@393 -- # IFS=, 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:07:33.721 "params": { 00:07:33.721 "name": "Nvme1", 00:07:33.721 "trtype": "rdma", 00:07:33.721 "traddr": "10.0.0.2", 00:07:33.721 "adrfam": "ipv4", 00:07:33.721 "trsvcid": "4420", 00:07:33.721 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:07:33.721 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:07:33.721 "hdgst": false, 00:07:33.721 "ddgst": false 00:07:33.721 }, 00:07:33.721 "method": "bdev_nvme_attach_controller" 00:07:33.721 }' 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@393 -- # IFS=, 00:07:33.721 15:12:35 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:07:33.721 "params": { 00:07:33.721 "name": "Nvme1", 00:07:33.721 "trtype": "rdma", 00:07:33.721 "traddr": "10.0.0.2", 00:07:33.721 "adrfam": "ipv4", 00:07:33.721 "trsvcid": "4420", 00:07:33.721 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:07:33.721 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:07:33.721 "hdgst": false, 00:07:33.721 "ddgst": false 00:07:33.721 }, 00:07:33.721 "method": "bdev_nvme_attach_controller" 00:07:33.721 }' 00:07:33.721 [2024-09-27 15:12:35.478352] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:33.721 [2024-09-27 15:12:35.478354] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:33.721 [2024-09-27 15:12:35.478420] [ DPDK EAL parameters: bdevperf -c 0x10 -m 256 --no-telemetry --log-level=lib.eal:6 --log-level=lib[2024-09-27 15:12:35.478420] [ DPDK EAL parameters: bdevperf -c 0x20 -m 256 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk1 .cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk2 --proc-type=auto ] 00:07:33.721 --proc-type=auto ] 00:07:33.721 [2024-09-27 15:12:35.480321] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:33.721 [2024-09-27 15:12:35.480379] [ DPDK EAL parameters: bdevperf -c 0x40 -m 256 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk3 --proc-type=auto ] 00:07:33.721 [2024-09-27 15:12:35.485536] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:07:33.721 [2024-09-27 15:12:35.485589] [ DPDK EAL parameters: bdevperf -c 0x80 -m 256 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk4 --proc-type=auto ] 00:07:33.981 [2024-09-27 15:12:35.679024] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.981 [2024-09-27 15:12:35.760532] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 5 00:07:33.981 [2024-09-27 15:12:35.774543] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.240 [2024-09-27 15:12:35.855091] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 6 00:07:34.240 [2024-09-27 15:12:35.871954] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.240 [2024-09-27 15:12:35.920378] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.240 [2024-09-27 15:12:35.966713] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:07:34.240 [2024-09-27 15:12:36.000820] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 7 00:07:34.498 Running I/O for 1 seconds... 00:07:34.758 Running I/O for 1 seconds... 00:07:34.758 Running I/O for 1 seconds... 00:07:34.758 Running I/O for 1 seconds... 00:07:35.327 21234.00 IOPS, 82.95 MiB/s 00:07:35.327 Latency(us) 00:07:35.327 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:35.327 Job: Nvme1n1 (Core Mask 0x20, workload: read, depth: 128, IO size: 4096) 00:07:35.327 Nvme1n1 : 1.01 21241.28 82.97 0.00 0.00 6007.16 4416.56 12594.31 00:07:35.327 =================================================================================================================== 00:07:35.327 Total : 21241.28 82.97 0.00 0.00 6007.16 4416.56 12594.31 00:07:35.586 255552.00 IOPS, 998.25 MiB/s 00:07:35.586 Latency(us) 00:07:35.586 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:35.586 Job: Nvme1n1 (Core Mask 0x40, workload: flush, depth: 128, IO size: 4096) 00:07:35.586 Nvme1n1 : 1.00 255161.69 996.73 0.00 0.00 499.25 231.51 2251.02 00:07:35.586 =================================================================================================================== 00:07:35.586 Total : 255161.69 996.73 0.00 0.00 499.25 231.51 2251.02 00:07:35.846 16447.00 IOPS, 64.25 MiB/s 00:07:35.846 Latency(us) 00:07:35.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:35.846 Job: Nvme1n1 (Core Mask 0x10, workload: write, depth: 128, IO size: 4096) 00:07:35.846 Nvme1n1 : 1.01 16496.05 64.44 0.00 0.00 7735.64 4331.07 16982.37 00:07:35.846 =================================================================================================================== 00:07:35.846 Total : 16496.05 64.44 0.00 0.00 7735.64 4331.07 16982.37 00:07:35.846 18429.00 IOPS, 71.99 MiB/s 00:07:35.846 Latency(us) 00:07:35.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:35.846 Job: Nvme1n1 (Core Mask 0x80, workload: unmap, depth: 128, IO size: 4096) 00:07:35.846 Nvme1n1 : 1.01 18521.26 72.35 0.00 0.00 6895.70 2735.42 16868.40 00:07:35.846 =================================================================================================================== 00:07:35.846 Total : 18521.26 72.35 0.00 0.00 6895.70 2735.42 16868.40 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@38 -- # wait 1694709 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- 
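The MiB/s column in the latency tables above is just IOPS multiplied by the 4096-byte IO size; for the read job, for example:

  # 21241.28 IOPS x 4096 B ~= 87,004,283 B/s ~= 82.97 MiB/s, matching the table
  echo '21241.28 * 4096 / (1024 * 1024)' | bc -l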
target/bdev_io_wait.sh@39 -- # wait 1694711 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@40 -- # wait 1694714 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@42 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@44 -- # trap - SIGINT SIGTERM EXIT 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- target/bdev_io_wait.sh@46 -- # nvmftestfini 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@331 -- # nvmfcleanup 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@99 -- # sync 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@102 -- # set +e 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@103 -- # for i in {1..20} 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:07:36.106 rmmod nvme_rdma 00:07:36.106 rmmod nvme_fabrics 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@106 -- # set -e 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@107 -- # return 0 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@332 -- # '[' -n 1694500 ']' 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@333 -- # killprocess 1694500 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@950 -- # '[' -z 1694500 ']' 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@954 -- # kill -0 1694500 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@955 -- # uname 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1694500 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1694500' 00:07:36.106 killing process with pid 1694500 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@969 -- # kill 1694500 00:07:36.106 15:12:37 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- 
common/autotest_common.sh@974 -- # wait 1694500 00:07:36.365 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:07:36.365 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@338 -- # nvmf_fini 00:07:36.365 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@264 -- # local dev 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@267 -- # remove_target_ns 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@22 -- # _remove_target_ns 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@268 -- # delete_main_bridge 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@130 -- # return 0 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:07:36.625 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@41 -- # _dev=0 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@41 -- # dev_map=() 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/setup.sh@284 -- # iptr 00:07:36.626 15:12:38 
nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@538 -- # iptables-save 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- nvmf/common.sh@538 -- # iptables-restore 00:07:36.626 00:07:36.626 real 0m11.390s 00:07:36.626 user 0m23.068s 00:07:36.626 sys 0m7.196s 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_bdev_io_wait -- common/autotest_common.sh@10 -- # set +x 00:07:36.626 ************************************ 00:07:36.626 END TEST nvmf_bdev_io_wait 00:07:36.626 ************************************ 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@25 -- # run_test nvmf_queue_depth /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/queue_depth.sh --transport=rdma 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:07:36.626 ************************************ 00:07:36.626 START TEST nvmf_queue_depth 00:07:36.626 ************************************ 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/queue_depth.sh --transport=rdma 00:07:36.626 * Looking for test storage... 00:07:36.626 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1681 -- # lcov --version 00:07:36.626 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@336 -- # IFS=.-: 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@336 -- # read -ra ver1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@337 -- # IFS=.-: 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@337 -- # read -ra ver2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@338 -- # local 'op=<' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@340 -- # ver1_l=2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@341 -- # ver2_l=1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@344 -- # 
case "$op" in 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@345 -- # : 1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@365 -- # decimal 1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@353 -- # local d=1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@355 -- # echo 1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@365 -- # ver1[v]=1 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@366 -- # decimal 2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@353 -- # local d=2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@355 -- # echo 2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@366 -- # ver2[v]=2 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@368 -- # return 0 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:36.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.886 --rc genhtml_branch_coverage=1 00:07:36.886 --rc genhtml_function_coverage=1 00:07:36.886 --rc genhtml_legend=1 00:07:36.886 --rc geninfo_all_blocks=1 00:07:36.886 --rc geninfo_unexecuted_blocks=1 00:07:36.886 00:07:36.886 ' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:36.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.886 --rc genhtml_branch_coverage=1 00:07:36.886 --rc genhtml_function_coverage=1 00:07:36.886 --rc genhtml_legend=1 00:07:36.886 --rc geninfo_all_blocks=1 00:07:36.886 --rc geninfo_unexecuted_blocks=1 00:07:36.886 00:07:36.886 ' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:36.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.886 --rc genhtml_branch_coverage=1 00:07:36.886 --rc genhtml_function_coverage=1 00:07:36.886 --rc genhtml_legend=1 00:07:36.886 --rc geninfo_all_blocks=1 00:07:36.886 --rc geninfo_unexecuted_blocks=1 00:07:36.886 00:07:36.886 ' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:36.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:36.886 --rc genhtml_branch_coverage=1 00:07:36.886 --rc genhtml_function_coverage=1 00:07:36.886 --rc 
genhtml_legend=1 00:07:36.886 --rc geninfo_all_blocks=1 00:07:36.886 --rc geninfo_unexecuted_blocks=1 00:07:36.886 00:07:36.886 ' 00:07:36.886 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@12 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@7 -- # uname -s 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@15 -- # shopt -s extglob 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- 
paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- paths/export.sh@5 -- # export PATH 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@50 -- # : 0 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:07:36.887 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:07:36.887 
15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@54 -- # have_pci_nics=0 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@14 -- # MALLOC_BDEV_SIZE=64 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@15 -- # MALLOC_BLOCK_SIZE=512 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@17 -- # bdevperf_rpc_sock=/var/tmp/bdevperf.sock 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@19 -- # nvmftestinit 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@292 -- # prepare_net_devs 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@254 -- # local -g is_hw=no 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@256 -- # remove_target_ns 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@22 -- # _remove_target_ns 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@125 -- # xtrace_disable 00:07:36.887 15:12:38 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@131 -- # pci_devs=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@131 -- # local -a pci_devs 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@132 -- # pci_net_devs=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@133 -- # pci_drivers=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@133 -- # local -A pci_drivers 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@135 -- # net_devs=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@135 -- # local -ga net_devs 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@136 -- # e810=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@136 -- # local -ga e810 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@137 -- # x722=() 
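The queue_depth trace above runs scripts/common.sh's lt/cmp_versions helpers to check whether the installed lcov (1.15) is older than 2 by comparing dot-separated fields numerically. A minimal stand-alone sketch of that comparison idiom (hypothetical function name, not the SPDK helper itself):

    # version_lt A B: succeed (exit 0) when dotted version A is numerically lower than B
    version_lt() {
        local -a a b
        IFS=. read -ra a <<< "$1"
        IFS=. read -ra b <<< "$2"
        local i max=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < max; i++ )); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # first lower field decides A < B
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1   # first higher field decides A >= B
        done
        return 1                                        # equal versions are not less-than
    }
    version_lt 1.15 2 && echo 'lcov 1.15 is older than 2'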
00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@137 -- # local -ga x722 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@138 -- # mlx=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@138 -- # local -ga mlx 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:07:43.464 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:07:43.464 15:12:45 
nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:07:43.464 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:07:43.464 Found net devices under 0000:18:00.0: mlx_0_0 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:07:43.464 Found net devices under 0000:18:00.1: mlx_0_1 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@249 -- # 
get_rdma_if_list 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@75 -- # rdma_devs=() 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@89 -- # continue 2 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@89 -- # continue 2 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@258 -- # is_hw=yes 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@61 -- # uname 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:07:43.464 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@65 -- # modprobe ib_cm 
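Above, gather_supported_nvmf_pci_devs filters the PCI bus for Mellanox mlx5 functions (vendor 0x15b3) and reports the net devices found under 0000:18:00.0 and 0000:18:00.1 (mlx_0_0 and mlx_0_1), which then feed get_rdma_if_list before the IB/RDMA kernel modules are loaded. A rough equivalent of that enumeration, assuming the same sysfs layout (an illustration, not the test script):

    # List net devices that sit under Mellanox (vendor 0x15b3) PCI functions
    for pci in /sys/bus/pci/devices/*; do
        [[ $(cat "$pci/vendor" 2>/dev/null) == 0x15b3 ]] || continue
        for netdir in "$pci"/net/*; do
            [[ -e $netdir ]] && echo "Found ${netdir##*/} under ${pci##*/}"   # e.g. mlx_0_0 under 0000:18:00.0
        done
    done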
00:07:43.465 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@66 -- # modprobe ib_core 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@67 -- # modprobe ib_umad 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@69 -- # modprobe iw_cm 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@27 -- # local -gA dev_map 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@28 -- # local -g _dev 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@44 -- # ips=() 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@58 -- # key_initiator=target1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:07:43.725 15:12:45 
nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@11 -- # local val=167772161 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:07:43.725 10.0.0.1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@11 -- # local val=167772162 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:07:43.725 10.0.0.2 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@215 -- # [[ 
-n '' ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@38 -- # ping_ips 1 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:43.725 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@107 -- # local dev=target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:07:43.726 
15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:07:43.726 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:07:43.726 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:07:43.726 00:07:43.726 --- 10.0.0.2 ping statistics --- 00:07:43.726 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:07:43.726 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@107 -- # local dev=target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:07:43.726 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:07:43.726 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:07:43.726 00:07:43.726 --- 10.0.0.2 ping statistics --- 00:07:43.726 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:07:43.726 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@98 -- # (( pair++ )) 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@266 -- # return 0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@107 -- # local dev=target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@179 -- 
# get_ip_address target1 '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # get_net_dev target1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@107 -- # local dev=target1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # get_net_dev target0 00:07:43.726 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@107 -- # local dev=target0 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:07:43.727 
15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # get_net_dev target1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@107 -- # local dev=target1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:07:43.727 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@21 -- # nvmfappstart -m 0x2 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@324 -- # nvmfpid=1698023 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@325 -- # waitforlisten 1698023 00:07:43.986 15:12:45 
nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@831 -- # '[' -z 1698023 ']' 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:43.986 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:43.986 15:12:45 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:43.986 [2024-09-27 15:12:45.633150] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:43.986 [2024-09-27 15:12:45.633214] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:43.986 [2024-09-27 15:12:45.722848] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.986 [2024-09-27 15:12:45.808505] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:07:43.986 [2024-09-27 15:12:45.808552] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:07:43.986 [2024-09-27 15:12:45.808561] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:07:43.986 [2024-09-27 15:12:45.808570] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:07:43.986 [2024-09-27 15:12:45.808578] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:07:43.986 [2024-09-27 15:12:45.808604] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@864 -- # return 0 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@23 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.953 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.954 [2024-09-27 15:12:46.565003] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x249b540/0x249fa30) succeed. 
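At this point nvmfappstart has launched a dedicated nvmf_tgt on core mask 0x2, waited for its /var/tmp/spdk.sock RPC socket, and created the RDMA transport (hence the create_ib_device notices above). A stand-alone sketch of that target bring-up, with the transport parameters copied from the trace and paths assumed relative to an SPDK checkout; the real harness polls the RPC socket rather than sleeping:

    # Start the SPDK NVMe-oF target on core 1 (mask 0x2) in the background
    ./build/bin/nvmf_tgt -m 0x2 &
    sleep 2   # crude stand-in for waitforlisten on /var/tmp/spdk.sock
    # Create the RDMA transport with the same shared-buffer count and IO unit size as the trace
    ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192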
00:07:44.954 [2024-09-27 15:12:46.574233] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x249ca40/0x24e10d0) succeed. 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.954 Malloc0 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@25 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@26 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@27 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.954 [2024-09-27 15:12:46.675772] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@30 -- # bdevperf_pid=1698226 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -z -r /var/tmp/bdevperf.sock -q 1024 -o 4096 -w verify -t 10 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@32 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; killprocess $bdevperf_pid; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@33 -- # waitforlisten 1698226 /var/tmp/bdevperf.sock 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@831 -- # '[' -z 1698226 ']' 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@836 -- # local 
max_retries=100 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:07:44.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:44.954 15:12:46 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:44.954 [2024-09-27 15:12:46.728675] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:07:44.954 [2024-09-27 15:12:46.728729] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1698226 ] 00:07:45.213 [2024-09-27 15:12:46.812325] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.213 [2024-09-27 15:12:46.900333] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.782 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:45.782 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@864 -- # return 0 00:07:45.782 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@34 -- # rpc_cmd -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:07:45.782 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:45.783 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:46.042 NVMe0n1 00:07:46.042 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.042 15:12:47 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@35 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests 00:07:46.042 Running I/O for 10 seconds... 
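For readers following the xtrace above, the queue_depth run reduces to a short target-side RPC sequence plus one bdevperf invocation. The sketch below restates those steps as standalone shell commands; it is an illustration only, not part of the captured log. It assumes the repository's scripts/rpc.py client (the rpc_cmd helper used by the test scripts is a thin wrapper around it) and reuses the 10.0.0.2:4420 RDMA listener address that the harness derived earlier in this trace.

  # Start the NVMe-oF target with the same core mask the test used (0x2)
  ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 &

  # Target-side setup, mirroring target/queue_depth.sh lines 23-27 seen above
  ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  ./scripts/rpc.py bdev_malloc_create 64 512 -b Malloc0
  ./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
  ./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
  ./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420

  # Initiator side: bdevperf at queue depth 1024, 4 KiB verify I/O for 10 seconds
  ./build/examples/bdevperf -z -r /var/tmp/bdevperf.sock -q 1024 -o 4096 -w verify -t 10 &
  ./scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller \
      -b NVMe0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1
  ./examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests

The per-second IOPS samples and the final latency summary that follow in the log are produced by that last perform_tests call.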
00:07:56.314 17105.00 IOPS, 66.82 MiB/s 17390.50 IOPS, 67.93 MiB/s 17290.67 IOPS, 67.54 MiB/s 17352.00 IOPS, 67.78 MiB/s 17408.00 IOPS, 68.00 MiB/s 17408.00 IOPS, 68.00 MiB/s 17453.14 IOPS, 68.18 MiB/s 17514.00 IOPS, 68.41 MiB/s 17521.78 IOPS, 68.44 MiB/s 17510.40 IOPS, 68.40 MiB/s 00:07:56.314 Latency(us) 00:07:56.314 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:56.314 Job: NVMe0n1 (Core Mask 0x1, workload: verify, depth: 1024, IO size: 4096) 00:07:56.314 Verification LBA range: start 0x0 length 0x4000 00:07:56.314 NVMe0n1 : 10.03 17559.43 68.59 0.00 0.00 58172.15 22909.11 37384.01 00:07:56.314 =================================================================================================================== 00:07:56.314 Total : 17559.43 68.59 0.00 0.00 58172.15 22909.11 37384.01 00:07:56.314 { 00:07:56.314 "results": [ 00:07:56.314 { 00:07:56.314 "job": "NVMe0n1", 00:07:56.314 "core_mask": "0x1", 00:07:56.314 "workload": "verify", 00:07:56.314 "status": "finished", 00:07:56.314 "verify_range": { 00:07:56.314 "start": 0, 00:07:56.314 "length": 16384 00:07:56.314 }, 00:07:56.314 "queue_depth": 1024, 00:07:56.314 "io_size": 4096, 00:07:56.314 "runtime": 10.030393, 00:07:56.314 "iops": 17559.431619478917, 00:07:56.314 "mibps": 68.59152976358952, 00:07:56.314 "io_failed": 0, 00:07:56.314 "io_timeout": 0, 00:07:56.314 "avg_latency_us": 58172.14673407482, 00:07:56.314 "min_latency_us": 22909.106086956523, 00:07:56.314 "max_latency_us": 37384.013913043476 00:07:56.314 } 00:07:56.314 ], 00:07:56.314 "core_count": 1 00:07:56.314 } 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@39 -- # killprocess 1698226 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@950 -- # '[' -z 1698226 ']' 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@954 -- # kill -0 1698226 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@955 -- # uname 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1698226 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1698226' 00:07:56.314 killing process with pid 1698226 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@969 -- # kill 1698226 00:07:56.314 Received shutdown signal, test time was about 10.000000 seconds 00:07:56.314 00:07:56.314 Latency(us) 00:07:56.314 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:07:56.314 =================================================================================================================== 00:07:56.314 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:07:56.314 15:12:57 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@974 -- # wait 1698226 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@41 -- # trap - SIGINT SIGTERM EXIT 00:07:56.314 15:12:58 
nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- target/queue_depth.sh@43 -- # nvmftestfini 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@331 -- # nvmfcleanup 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@99 -- # sync 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@102 -- # set +e 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@103 -- # for i in {1..20} 00:07:56.314 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:07:56.314 rmmod nvme_rdma 00:07:56.314 rmmod nvme_fabrics 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@106 -- # set -e 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@107 -- # return 0 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@332 -- # '[' -n 1698023 ']' 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@333 -- # killprocess 1698023 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@950 -- # '[' -z 1698023 ']' 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@954 -- # kill -0 1698023 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@955 -- # uname 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1698023 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1698023' 00:07:56.575 killing process with pid 1698023 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@969 -- # kill 1698023 00:07:56.575 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@974 -- # wait 1698023 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@338 -- # nvmf_fini 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@264 -- # local dev 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@267 -- # remove_target_ns 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@22 -- # 
_remove_target_ns 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@268 -- # delete_main_bridge 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@130 -- # return 0 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:07:56.835 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@41 -- # _dev=0 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@41 -- # dev_map=() 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/setup.sh@284 -- # iptr 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@538 -- # iptables-save 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- nvmf/common.sh@538 -- # iptables-restore 00:07:56.836 00:07:56.836 real 0m20.196s 00:07:56.836 user 0m26.692s 00:07:56.836 sys 0m6.124s 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_queue_depth -- common/autotest_common.sh@10 -- # set +x 00:07:56.836 ************************************ 00:07:56.836 END TEST nvmf_queue_depth 00:07:56.836 ************************************ 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@26 -- # run_test nvmf_nmic 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nmic.sh --transport=rdma 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:07:56.836 ************************************ 00:07:56.836 START TEST nvmf_nmic 00:07:56.836 ************************************ 00:07:56.836 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nmic.sh --transport=rdma 00:07:57.097 * Looking for test storage... 00:07:57.097 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1681 -- # lcov --version 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@336 -- # IFS=.-: 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@336 -- # read -ra ver1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@337 -- # IFS=.-: 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@337 -- # read -ra ver2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@338 -- # local 'op=<' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@340 -- # ver1_l=2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@341 -- # ver2_l=1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@344 -- # case "$op" in 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@345 -- # : 1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@365 -- # decimal 1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@353 -- # local d=1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@355 -- # echo 1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@365 -- # ver1[v]=1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@366 -- # decimal 2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@353 -- # local d=2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@355 -- # echo 2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@366 -- # ver2[v]=2 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@368 -- # return 0 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:57.097 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.097 --rc genhtml_branch_coverage=1 00:07:57.097 --rc genhtml_function_coverage=1 00:07:57.097 --rc genhtml_legend=1 00:07:57.097 --rc geninfo_all_blocks=1 00:07:57.097 --rc geninfo_unexecuted_blocks=1 00:07:57.097 00:07:57.097 ' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:57.097 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.097 --rc genhtml_branch_coverage=1 00:07:57.097 --rc genhtml_function_coverage=1 00:07:57.097 --rc genhtml_legend=1 00:07:57.097 --rc geninfo_all_blocks=1 00:07:57.097 --rc geninfo_unexecuted_blocks=1 00:07:57.097 00:07:57.097 ' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:57.097 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.097 --rc genhtml_branch_coverage=1 00:07:57.097 --rc genhtml_function_coverage=1 00:07:57.097 --rc genhtml_legend=1 00:07:57.097 --rc geninfo_all_blocks=1 00:07:57.097 --rc geninfo_unexecuted_blocks=1 00:07:57.097 00:07:57.097 ' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:57.097 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:57.097 --rc genhtml_branch_coverage=1 00:07:57.097 --rc genhtml_function_coverage=1 00:07:57.097 --rc genhtml_legend=1 00:07:57.097 --rc geninfo_all_blocks=1 00:07:57.097 --rc geninfo_unexecuted_blocks=1 00:07:57.097 00:07:57.097 ' 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@7 -- # uname -s 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@7 -- # [[ Linux == 
FreeBSD ]] 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:07:57.097 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@15 -- # shopt -s extglob 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:57.098 15:12:58 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- paths/export.sh@5 -- # export PATH 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@50 -- # : 0 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:07:57.098 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@54 -- # have_pci_nics=0 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@11 -- # MALLOC_BDEV_SIZE=64 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@14 -- # nvmftestinit 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:07:57.098 15:12:58 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@292 -- # prepare_net_devs 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@254 -- # local -g is_hw=no 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@256 -- # remove_target_ns 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@22 -- # _remove_target_ns 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@125 -- # xtrace_disable 00:07:57.098 15:12:58 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@131 -- # pci_devs=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@131 -- # local -a pci_devs 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@132 -- # pci_net_devs=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@133 -- # pci_drivers=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@133 -- # local -A pci_drivers 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@135 -- # net_devs=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@135 -- # local -ga net_devs 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@136 -- # e810=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@136 -- # local -ga e810 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@137 -- # x722=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@137 -- # local -ga x722 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@138 -- # mlx=() 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@138 -- # local -ga mlx 00:08:03.788 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:08:03.789 15:13:05 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:08:03.789 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:08:03.789 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:08:03.789 15:13:05 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:08:03.789 Found net devices under 0000:18:00.0: mlx_0_0 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:08:03.789 Found net devices under 0000:18:00.1: mlx_0_1 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@249 -- # get_rdma_if_list 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@75 -- # rdma_devs=() 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:08:03.789 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@89 -- # continue 2 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@86 -- # for 
rxe_net_dev in "${rxe_net_devs[@]}" 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@89 -- # continue 2 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@258 -- # is_hw=yes 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@61 -- # uname 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@65 -- # modprobe ib_cm 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@66 -- # modprobe ib_core 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@67 -- # modprobe ib_umad 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@69 -- # modprobe iw_cm 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@27 -- # local -gA dev_map 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@28 -- # local -g _dev 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@44 -- # ips=() 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- 
nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@58 -- # key_initiator=target1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@11 -- # local val=167772161 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:08:04.050 10.0.0.1 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:04.050 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@11 -- # local val=167772162 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:08:04.051 15:13:05 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:08:04.051 10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@38 -- # ping_ips 1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- 
nvmf/setup.sh@168 -- # get_net_dev target0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@107 -- # local dev=target0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:04.051 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:08:04.051 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:08:04.051 00:08:04.051 --- 10.0.0.2 ping statistics --- 00:08:04.051 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:04.051 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@107 -- # local dev=target0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:04.051 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:08:04.051 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:08:04.051 00:08:04.051 --- 10.0.0.2 ping statistics --- 00:08:04.051 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:04.051 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@98 -- # (( pair++ )) 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@266 -- # return 0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@107 -- # local dev=target0 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- 
nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@107 -- # local dev=target1 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:04.051 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@107 -- # local dev=target0 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@165 -- # local 
dev=target1 in_ns= ip 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@107 -- # local dev=target1 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:04.052 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@15 -- # nvmfappstart -m 0xF 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@324 -- # nvmfpid=1702735 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@325 -- # waitforlisten 1702735 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@831 -- # '[' -z 1702735 ']' 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:04.311 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
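The nvmfappstart and waitforlisten helpers traced above come from the suite's nvmf/common.sh and autotest_common.sh. For orientation only, a rough standalone sketch of what that startup step does, using the binary path and flags visible in the trace; the polling loop is a simplified stand-in for waitforlisten, not the actual helper:

  # load the RDMA host module and start the SPDK NVMe-oF target as traced above
  modprobe nvme-rdma
  /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
  nvmfpid=$!
  # simplified stand-in for waitforlisten: poll until the RPC socket answers
  until /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
  done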
00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:04.311 15:13:05 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:04.311 [2024-09-27 15:13:05.995321] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:08:04.312 [2024-09-27 15:13:05.995402] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:04.312 [2024-09-27 15:13:06.082829] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:04.571 [2024-09-27 15:13:06.166402] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:08:04.571 [2024-09-27 15:13:06.166450] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:08:04.571 [2024-09-27 15:13:06.166460] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:08:04.571 [2024-09-27 15:13:06.166485] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:08:04.571 [2024-09-27 15:13:06.166504] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:08:04.571 [2024-09-27 15:13:06.166575] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:04.571 [2024-09-27 15:13:06.166686] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:08:04.571 [2024-09-27 15:13:06.166787] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:04.571 [2024-09-27 15:13:06.166788] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@864 -- # return 0 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@17 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.141 15:13:06 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.141 [2024-09-27 15:13:06.915345] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0xf204a0/0xf24990) succeed. 00:08:05.141 [2024-09-27 15:13:06.925722] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0xf21ae0/0xf66030) succeed. 
00:08:05.401 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.401 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@20 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:08:05.401 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.401 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.401 Malloc0 00:08:05.401 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@21 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@22 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@23 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 [2024-09-27 15:13:07.101373] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@25 -- # echo 'test case1: single bdev can'\''t be used in multiple subsystems' 00:08:05.402 test case1: single bdev can't be used in multiple subsystems 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@26 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode2 -a -s SPDK2 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@27 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode2 -t rdma -a 10.0.0.2 -s 4420 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@28 -- # nmic_status=0 00:08:05.402 15:13:07 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@29 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode2 Malloc0 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 [2024-09-27 15:13:07.125238] bdev.c:8193:bdev_open: *ERROR*: bdev Malloc0 already claimed: type exclusive_write by module NVMe-oF Target 00:08:05.402 [2024-09-27 15:13:07.125262] subsystem.c:2157:spdk_nvmf_subsystem_add_ns_ext: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode2: bdev Malloc0 cannot be opened, error=-1 00:08:05.402 [2024-09-27 15:13:07.125273] nvmf_rpc.c:1517:nvmf_rpc_ns_paused: *ERROR*: Unable to add namespace 00:08:05.402 request: 00:08:05.402 { 00:08:05.402 "nqn": "nqn.2016-06.io.spdk:cnode2", 00:08:05.402 "namespace": { 00:08:05.402 "bdev_name": "Malloc0", 00:08:05.402 "no_auto_visible": false 00:08:05.402 }, 00:08:05.402 "method": "nvmf_subsystem_add_ns", 00:08:05.402 "req_id": 1 00:08:05.402 } 00:08:05.402 Got JSON-RPC error response 00:08:05.402 response: 00:08:05.402 { 00:08:05.402 "code": -32602, 00:08:05.402 "message": "Invalid parameters" 00:08:05.402 } 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@29 -- # nmic_status=1 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@31 -- # '[' 1 -eq 0 ']' 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@36 -- # echo ' Adding namespace failed - expected result.' 00:08:05.402 Adding namespace failed - expected result. 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@39 -- # echo 'test case2: host connect to nvmf target in multiple paths' 00:08:05.402 test case2: host connect to nvmf target in multiple paths 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@40 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:05.402 [2024-09-27 15:13:07.137305] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4421 *** 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:05.402 15:13:07 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@41 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:08:06.341 15:13:08 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@42 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4421 00:08:07.722 15:13:09 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@44 -- # waitforserial SPDKISFASTANDAWESOME 00:08:07.722 15:13:09 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1198 -- # local i=0 00:08:07.722 15:13:09 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 
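For readability, the RPC sequence that the nmic test drives above can be condensed as the sketch below. It calls scripts/rpc.py directly rather than the suite's rpc_cmd wrapper; subsystem names, bdev, address and ports are taken from the trace, and the --hostnqn/--hostid flags on the nvme connect calls are omitted here for brevity:

  # target side: transport, malloc bdev, first subsystem with namespace and RDMA listener
  ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  ./scripts/rpc.py bdev_malloc_create 64 512 -b Malloc0
  ./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME
  ./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
  ./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
  # test case1: a second subsystem may not claim the same bdev
  ./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode2 -a -s SPDK2
  ./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode2 -t rdma -a 10.0.0.2 -s 4420
  ./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode2 Malloc0 || echo "add_ns failed as expected"
  # test case2: expose a second path on port 4421 and connect the host over both
  ./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421
  nvme connect -i 15 -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420
  nvme connect -i 15 -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4421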
00:08:07.722 15:13:09 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:08:07.722 15:13:09 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1205 -- # sleep 2 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1208 -- # return 0 00:08:09.627 15:13:11 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 4096 -d 1 -t write -r 1 -v 00:08:09.627 [global] 00:08:09.627 thread=1 00:08:09.627 invalidate=1 00:08:09.627 rw=write 00:08:09.627 time_based=1 00:08:09.627 runtime=1 00:08:09.627 ioengine=libaio 00:08:09.627 direct=1 00:08:09.627 bs=4096 00:08:09.627 iodepth=1 00:08:09.627 norandommap=0 00:08:09.627 numjobs=1 00:08:09.627 00:08:09.627 verify_dump=1 00:08:09.627 verify_backlog=512 00:08:09.627 verify_state_save=0 00:08:09.627 do_verify=1 00:08:09.627 verify=crc32c-intel 00:08:09.627 [job0] 00:08:09.627 filename=/dev/nvme0n1 00:08:09.627 Could not set queue depth (nvme0n1) 00:08:09.627 job0: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:09.627 fio-3.35 00:08:09.627 Starting 1 thread 00:08:11.006 00:08:11.006 job0: (groupid=0, jobs=1): err= 0: pid=1703575: Fri Sep 27 15:13:12 2024 00:08:11.006 read: IOPS=6909, BW=27.0MiB/s (28.3MB/s)(27.0MiB/1001msec) 00:08:11.006 slat (nsec): min=8356, max=43173, avg=8928.19, stdev=954.54 00:08:11.006 clat (usec): min=48, max=117, avg=59.38, stdev= 3.88 00:08:11.006 lat (usec): min=58, max=143, avg=68.31, stdev= 4.02 00:08:11.006 clat percentiles (usec): 00:08:11.006 | 1.00th=[ 53], 5.00th=[ 55], 10.00th=[ 55], 20.00th=[ 57], 00:08:11.006 | 30.00th=[ 58], 40.00th=[ 59], 50.00th=[ 60], 60.00th=[ 61], 00:08:11.006 | 70.00th=[ 62], 80.00th=[ 63], 90.00th=[ 65], 95.00th=[ 67], 00:08:11.006 | 99.00th=[ 70], 99.50th=[ 72], 99.90th=[ 80], 99.95th=[ 86], 00:08:11.006 | 99.99th=[ 119] 00:08:11.006 write: IOPS=7160, BW=28.0MiB/s (29.3MB/s)(28.0MiB/1001msec); 0 zone resets 00:08:11.006 slat (nsec): min=10992, max=54079, avg=11764.57, stdev=1210.80 00:08:11.006 clat (usec): min=43, max=111, avg=56.56, stdev= 3.75 00:08:11.006 lat (usec): min=59, max=165, avg=68.32, stdev= 3.99 00:08:11.006 clat percentiles (usec): 00:08:11.006 | 1.00th=[ 50], 5.00th=[ 51], 10.00th=[ 52], 20.00th=[ 54], 00:08:11.006 | 30.00th=[ 55], 40.00th=[ 56], 50.00th=[ 57], 60.00th=[ 58], 00:08:11.006 | 70.00th=[ 59], 80.00th=[ 60], 90.00th=[ 62], 95.00th=[ 63], 00:08:11.006 | 99.00th=[ 67], 99.50th=[ 69], 99.90th=[ 74], 99.95th=[ 77], 00:08:11.006 | 99.99th=[ 112] 00:08:11.006 bw ( KiB/s): min=28672, max=28672, per=100.00%, avg=28672.00, stdev= 0.00, samples=1 00:08:11.006 iops : min= 7168, max= 7168, avg=7168.00, stdev= 0.00, samples=1 00:08:11.006 lat (usec) : 50=0.98%, 100=98.99%, 250=0.03% 00:08:11.006 cpu : usr=9.30%, sys=14.80%, ctx=14084, majf=0, minf=1 
00:08:11.006 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:11.006 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:11.006 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:11.006 issued rwts: total=6916,7168,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:11.006 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:11.006 00:08:11.006 Run status group 0 (all jobs): 00:08:11.006 READ: bw=27.0MiB/s (28.3MB/s), 27.0MiB/s-27.0MiB/s (28.3MB/s-28.3MB/s), io=27.0MiB (28.3MB), run=1001-1001msec 00:08:11.006 WRITE: bw=28.0MiB/s (29.3MB/s), 28.0MiB/s-28.0MiB/s (29.3MB/s-29.3MB/s), io=28.0MiB (29.4MB), run=1001-1001msec 00:08:11.006 00:08:11.006 Disk stats (read/write): 00:08:11.006 nvme0n1: ios=6194/6499, merge=0/0, ticks=333/319, in_queue=652, util=90.88% 00:08:11.006 15:13:12 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@48 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:08:12.912 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 2 controller(s) 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@49 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1219 -- # local i=0 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1231 -- # return 0 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@51 -- # trap - SIGINT SIGTERM EXIT 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- target/nmic.sh@53 -- # nvmftestfini 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@331 -- # nvmfcleanup 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@99 -- # sync 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@102 -- # set +e 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@103 -- # for i in {1..20} 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:08:12.912 rmmod nvme_rdma 00:08:12.912 rmmod nvme_fabrics 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@106 -- # set -e 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@107 -- # return 0 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@332 -- # '[' -n 1702735 ']' 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@333 -- # killprocess 1702735 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@950 -- # '[' -z 1702735 ']' 00:08:12.912 15:13:14 
nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@954 -- # kill -0 1702735 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@955 -- # uname 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1702735 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1702735' 00:08:12.912 killing process with pid 1702735 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@969 -- # kill 1702735 00:08:12.912 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@974 -- # wait 1702735 00:08:13.171 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:08:13.171 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@338 -- # nvmf_fini 00:08:13.171 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@264 -- # local dev 00:08:13.171 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@267 -- # remove_target_ns 00:08:13.171 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@22 -- # _remove_target_ns 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@268 -- # delete_main_bridge 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@130 -- # return 0 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:08:13.172 15:13:14 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic 
-- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@41 -- # _dev=0 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@41 -- # dev_map=() 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/setup.sh@284 -- # iptr 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@538 -- # iptables-save 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:08:13.172 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- nvmf/common.sh@538 -- # iptables-restore 00:08:13.430 00:08:13.430 real 0m16.409s 00:08:13.430 user 0m39.738s 00:08:13.430 sys 0m6.437s 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_nmic -- common/autotest_common.sh@10 -- # set +x 00:08:13.430 ************************************ 00:08:13.430 END TEST nvmf_nmic 00:08:13.430 ************************************ 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@27 -- # run_test nvmf_fio_target /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/fio.sh --transport=rdma 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:08:13.430 ************************************ 00:08:13.430 START TEST nvmf_fio_target 00:08:13.430 ************************************ 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/fio.sh --transport=rdma 00:08:13.430 * Looking for test storage... 
00:08:13.430 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1681 -- # lcov --version 00:08:13.430 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@336 -- # IFS=.-: 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@336 -- # read -ra ver1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@337 -- # IFS=.-: 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@337 -- # read -ra ver2 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@338 -- # local 'op=<' 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@340 -- # ver1_l=2 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@341 -- # ver2_l=1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@344 -- # case "$op" in 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@345 -- # : 1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@365 -- # decimal 1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@353 -- # local d=1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@355 -- # echo 1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@365 -- # ver1[v]=1 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@366 -- # decimal 2 00:08:13.690 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@353 -- # local d=2 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@355 -- # echo 2 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@366 -- # ver2[v]=2 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@368 -- # return 0 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:13.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.691 --rc genhtml_branch_coverage=1 00:08:13.691 --rc genhtml_function_coverage=1 00:08:13.691 --rc genhtml_legend=1 00:08:13.691 --rc geninfo_all_blocks=1 00:08:13.691 --rc geninfo_unexecuted_blocks=1 00:08:13.691 00:08:13.691 ' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:13.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.691 --rc genhtml_branch_coverage=1 00:08:13.691 --rc genhtml_function_coverage=1 00:08:13.691 --rc genhtml_legend=1 00:08:13.691 --rc geninfo_all_blocks=1 00:08:13.691 --rc geninfo_unexecuted_blocks=1 00:08:13.691 00:08:13.691 ' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:13.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.691 --rc genhtml_branch_coverage=1 00:08:13.691 --rc genhtml_function_coverage=1 00:08:13.691 --rc genhtml_legend=1 00:08:13.691 --rc geninfo_all_blocks=1 00:08:13.691 --rc geninfo_unexecuted_blocks=1 00:08:13.691 00:08:13.691 ' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:13.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:13.691 --rc genhtml_branch_coverage=1 00:08:13.691 --rc genhtml_function_coverage=1 00:08:13.691 --rc genhtml_legend=1 00:08:13.691 --rc geninfo_all_blocks=1 00:08:13.691 --rc geninfo_unexecuted_blocks=1 00:08:13.691 00:08:13.691 ' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- 
nvmf/common.sh@7 -- # uname -s 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@15 -- # shopt -s extglob 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- paths/export.sh@5 -- # export PATH 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@50 -- # : 0 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:08:13.691 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:08:13.691 15:13:15 
nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@54 -- # have_pci_nics=0 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@11 -- # MALLOC_BDEV_SIZE=64 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@14 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@16 -- # nvmftestinit 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@292 -- # prepare_net_devs 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@254 -- # local -g is_hw=no 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@256 -- # remove_target_ns 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@22 -- # _remove_target_ns 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@125 -- # xtrace_disable 00:08:13.691 15:13:15 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@10 -- # set +x 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@131 -- # pci_devs=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@131 -- # local -a pci_devs 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@132 -- # pci_net_devs=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@133 -- # pci_drivers=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@133 -- # local -A pci_drivers 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@135 -- # net_devs=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@135 -- # local -ga net_devs 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@136 -- # e810=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@136 -- # local -ga e810 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@137 -- # x722=() 00:08:20.263 15:13:22 
nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@137 -- # local -ga x722 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@138 -- # mlx=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@138 -- # local -ga mlx 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:08:20.263 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- 
nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:08:20.263 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:08:20.263 Found net devices under 0000:18:00.0: mlx_0_0 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:08:20.263 Found net devices under 0000:18:00.1: mlx_0_1 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@249 -- # get_rdma_if_list 00:08:20.263 15:13:22 
nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@75 -- # rdma_devs=() 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@89 -- # continue 2 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:20.263 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@89 -- # continue 2 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@258 -- # is_hw=yes 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@61 -- # uname 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@65 -- # modprobe ib_cm 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- 
nvmf/common.sh@66 -- # modprobe ib_core 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@67 -- # modprobe ib_umad 00:08:20.264 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@69 -- # modprobe iw_cm 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@27 -- # local -gA dev_map 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@28 -- # local -g _dev 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@44 -- # ips=() 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@58 -- # key_initiator=target1 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:20.524 15:13:22 
nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@11 -- # local val=167772161 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:08:20.524 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:08:20.525 10.0.0.1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@11 -- # local val=167772162 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:08:20.525 10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 
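The set_ip/set_up trace above reduces to a handful of iproute2 commands; the sketch below is a condensed, by-hand equivalent using the device names and addresses printed in this run (it is not a substitute for nvmf/setup.sh, which also records the pair in dev_map):

    # Assign the test subnet to the two Mellanox ports found earlier and
    # record each address in ifalias, which later lookups read back.
    ip addr add 10.0.0.1/24 dev mlx_0_0
    echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias
    ip addr add 10.0.0.2/24 dev mlx_0_1
    echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
    # Bring both links up before the connectivity checks that follow.
    ip link set mlx_0_0 up
    ip link set mlx_0_1 up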
00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@38 -- # ping_ips 1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@107 -- # local dev=target0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:08:20.525 15:13:22 
nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:20.525 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:08:20.525 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:08:20.525 00:08:20.525 --- 10.0.0.2 ping statistics --- 00:08:20.525 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:20.525 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@107 -- # local dev=target0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:20.525 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:08:20.525 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.031 ms 00:08:20.525 00:08:20.525 --- 10.0.0.2 ping statistics --- 00:08:20.525 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:20.525 rtt min/avg/max/mdev = 0.031/0.031/0.031/0.000 ms 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@98 -- # (( pair++ )) 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@266 -- # return 0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@107 -- # local dev=target0 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:20.525 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 
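The repeated get_ip_address traces in this part of the log all follow the same pattern: a logical endpoint name (target0, target1) is mapped to a physical device through dev_map, and the address is read back from the ifalias file written during set_ip. A minimal sketch, using the mapping from this run (target0 resolves to mlx_0_1):

    # Resolve the address of logical endpoint "target0" in this run.
    dev=mlx_0_1                                # dev_map[target0]
    ip=$(cat /sys/class/net/$dev/ifalias)      # yields 10.0.0.2 here
    echo "$ip"                                 # ends up in NVMF_FIRST_INITIATOR_IP / NVMF_FIRST_TARGET_IP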
00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@107 -- # local dev=target1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@107 -- # local dev=target0 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- 
nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@107 -- # local dev=target1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@17 -- # nvmfappstart -m 0xF 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:20.526 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@10 -- # set +x 00:08:20.785 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@324 -- # nvmfpid=1706955 00:08:20.785 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@325 -- # waitforlisten 1706955 00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@831 -- # '[' -z 1706955 ']' 
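nvmfappstart, traced above, launches the target application and then blocks in waitforlisten until the RPC socket answers. Stripped of the helper functions, the step looks roughly like this (binary path and flags as printed in this log):

    # Start nvmf_tgt on the 4 available cores with all tracepoint groups enabled.
    /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
    nvmfpid=$!
    # waitforlisten then polls until the app accepts RPCs on /var/tmp/spdk.sock,
    # which is what the "Waiting for process to start up..." message reports.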
00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:20.786 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:20.786 15:13:22 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@10 -- # set +x 00:08:20.786 [2024-09-27 15:13:22.424213] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:08:20.786 [2024-09-27 15:13:22.424276] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:20.786 [2024-09-27 15:13:22.509747] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:20.786 [2024-09-27 15:13:22.599612] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:08:20.786 [2024-09-27 15:13:22.599660] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:08:20.786 [2024-09-27 15:13:22.599670] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:08:20.786 [2024-09-27 15:13:22.599679] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:08:20.786 [2024-09-27 15:13:22.599687] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:08:20.786 [2024-09-27 15:13:22.599762] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:20.786 [2024-09-27 15:13:22.599863] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:08:20.786 [2024-09-27 15:13:22.599966] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.786 [2024-09-27 15:13:22.599968] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@864 -- # return 0 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@10 -- # set +x 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:08:21.724 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@19 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:08:21.724 [2024-09-27 15:13:23.504130] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1c5a4a0/0x1c5e990) succeed. 
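The Create IB device notices at this point are emitted while the RPC issued at target/fio.sh@19 is processed: creating the RDMA transport makes the target claim both Mellanox ports (mlx5_0 and mlx5_1) as IB devices. Issued by hand, the call is simply the following (rpc.py path shortened here for readability):

    # Create the NVMe-oF RDMA transport with the options assembled earlier
    # (NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024').
    scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192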
00:08:21.724 [2024-09-27 15:13:23.514678] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1c5bae0/0x1ca0030) succeed. 00:08:21.983 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:22.242 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@21 -- # malloc_bdevs='Malloc0 ' 00:08:22.242 15:13:23 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@22 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:22.502 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@22 -- # malloc_bdevs+=Malloc1 00:08:22.502 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:22.502 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@24 -- # raid_malloc_bdevs='Malloc2 ' 00:08:22.502 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@25 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:22.761 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@25 -- # raid_malloc_bdevs+=Malloc3 00:08:22.761 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@26 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_raid_create -n raid0 -z 64 -r 0 -b 'Malloc2 Malloc3' 00:08:23.020 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:23.279 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@29 -- # concat_malloc_bdevs='Malloc4 ' 00:08:23.279 15:13:24 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@30 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:23.539 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@30 -- # concat_malloc_bdevs+='Malloc5 ' 00:08:23.539 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:08:23.798 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@31 -- # concat_malloc_bdevs+=Malloc6 00:08:23.798 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@32 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_raid_create -n concat0 -r concat -z 64 -b 'Malloc4 Malloc5 Malloc6' 00:08:23.798 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:08:24.058 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@35 -- # for malloc_bdev in $malloc_bdevs 00:08:24.058 15:13:25 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:08:24.316 15:13:26 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@35 -- # for malloc_bdev in $malloc_bdevs 00:08:24.316 15:13:26 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:08:24.575 15:13:26 
nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@38 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:08:24.575 [2024-09-27 15:13:26.406252] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:08:24.836 15:13:26 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@41 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 raid0 00:08:24.836 15:13:26 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@44 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 concat0 00:08:25.096 15:13:26 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@46 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:08:26.474 15:13:27 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@48 -- # waitforserial SPDKISFASTANDAWESOME 4 00:08:26.474 15:13:27 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1198 -- # local i=0 00:08:26.474 15:13:27 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:08:26.474 15:13:27 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1200 -- # [[ -n 4 ]] 00:08:26.474 15:13:27 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1201 -- # nvme_device_counter=4 00:08:26.474 15:13:27 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1205 -- # sleep 2 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1207 -- # nvme_devices=4 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1208 -- # return 0 00:08:28.378 15:13:29 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 4096 -d 1 -t write -r 1 -v 00:08:28.378 [global] 00:08:28.378 thread=1 00:08:28.378 invalidate=1 00:08:28.378 rw=write 00:08:28.378 time_based=1 00:08:28.378 runtime=1 00:08:28.378 ioengine=libaio 00:08:28.378 direct=1 00:08:28.378 bs=4096 00:08:28.378 iodepth=1 00:08:28.378 norandommap=0 00:08:28.378 numjobs=1 00:08:28.378 00:08:28.378 verify_dump=1 00:08:28.378 verify_backlog=512 00:08:28.378 verify_state_save=0 00:08:28.378 do_verify=1 00:08:28.378 verify=crc32c-intel 00:08:28.378 [job0] 00:08:28.378 filename=/dev/nvme0n1 00:08:28.378 [job1] 00:08:28.378 filename=/dev/nvme0n2 00:08:28.378 [job2] 00:08:28.378 filename=/dev/nvme0n3 00:08:28.378 [job3] 00:08:28.378 filename=/dev/nvme0n4 00:08:28.378 Could not set queue depth (nvme0n1) 00:08:28.378 Could not set queue depth (nvme0n2) 00:08:28.378 Could not set queue 
depth (nvme0n3) 00:08:28.378 Could not set queue depth (nvme0n4) 00:08:28.637 job0: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:28.637 job1: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:28.637 job2: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:28.637 job3: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:28.637 fio-3.35 00:08:28.637 Starting 4 threads 00:08:30.014 00:08:30.014 job0: (groupid=0, jobs=1): err= 0: pid=1708223: Fri Sep 27 15:13:31 2024 00:08:30.014 read: IOPS=5114, BW=20.0MiB/s (20.9MB/s)(20.0MiB/1001msec) 00:08:30.014 slat (nsec): min=8326, max=40128, avg=9031.57, stdev=1082.87 00:08:30.014 clat (usec): min=63, max=273, avg=85.07, stdev=15.56 00:08:30.014 lat (usec): min=77, max=282, avg=94.10, stdev=15.62 00:08:30.014 clat percentiles (usec): 00:08:30.014 | 1.00th=[ 73], 5.00th=[ 75], 10.00th=[ 77], 20.00th=[ 79], 00:08:30.014 | 30.00th=[ 80], 40.00th=[ 82], 50.00th=[ 83], 60.00th=[ 84], 00:08:30.014 | 70.00th=[ 86], 80.00th=[ 88], 90.00th=[ 92], 95.00th=[ 98], 00:08:30.014 | 99.00th=[ 176], 99.50th=[ 196], 99.90th=[ 223], 99.95th=[ 265], 00:08:30.014 | 99.99th=[ 273] 00:08:30.014 write: IOPS=5233, BW=20.4MiB/s (21.4MB/s)(20.5MiB/1001msec); 0 zone resets 00:08:30.014 slat (nsec): min=10680, max=45741, avg=11932.57, stdev=1371.41 00:08:30.014 clat (usec): min=60, max=260, avg=81.62, stdev=15.11 00:08:30.014 lat (usec): min=76, max=272, avg=93.56, stdev=15.22 00:08:30.014 clat percentiles (usec): 00:08:30.014 | 1.00th=[ 70], 5.00th=[ 72], 10.00th=[ 74], 20.00th=[ 76], 00:08:30.014 | 30.00th=[ 77], 40.00th=[ 78], 50.00th=[ 80], 60.00th=[ 81], 00:08:30.014 | 70.00th=[ 82], 80.00th=[ 85], 90.00th=[ 89], 95.00th=[ 94], 00:08:30.014 | 99.00th=[ 169], 99.50th=[ 194], 99.90th=[ 223], 99.95th=[ 235], 00:08:30.014 | 99.99th=[ 262] 00:08:30.014 bw ( KiB/s): min=20640, max=20640, per=27.84%, avg=20640.00, stdev= 0.00, samples=1 00:08:30.014 iops : min= 5160, max= 5160, avg=5160.00, stdev= 0.00, samples=1 00:08:30.014 lat (usec) : 100=96.24%, 250=3.72%, 500=0.05% 00:08:30.014 cpu : usr=7.50%, sys=10.20%, ctx=10360, majf=0, minf=1 00:08:30.014 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:30.014 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.014 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.014 issued rwts: total=5120,5239,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:30.014 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:30.014 job1: (groupid=0, jobs=1): err= 0: pid=1708224: Fri Sep 27 15:13:31 2024 00:08:30.014 read: IOPS=4649, BW=18.2MiB/s (19.0MB/s)(18.2MiB/1001msec) 00:08:30.014 slat (nsec): min=6812, max=34492, avg=9103.49, stdev=1100.70 00:08:30.014 clat (usec): min=67, max=188, avg=92.37, stdev=16.45 00:08:30.014 lat (usec): min=76, max=197, avg=101.47, stdev=16.37 00:08:30.014 clat percentiles (usec): 00:08:30.014 | 1.00th=[ 74], 5.00th=[ 77], 10.00th=[ 79], 20.00th=[ 81], 00:08:30.014 | 30.00th=[ 83], 40.00th=[ 85], 50.00th=[ 88], 60.00th=[ 90], 00:08:30.014 | 70.00th=[ 93], 80.00th=[ 99], 90.00th=[ 123], 95.00th=[ 130], 00:08:30.014 | 99.00th=[ 141], 99.50th=[ 143], 99.90th=[ 178], 99.95th=[ 182], 00:08:30.014 | 99.99th=[ 190] 00:08:30.014 write: IOPS=5114, BW=20.0MiB/s (20.9MB/s)(20.0MiB/1001msec); 0 zone resets 00:08:30.014 slat 
(nsec): min=9640, max=40470, avg=11955.95, stdev=1226.02 00:08:30.014 clat (usec): min=65, max=174, avg=85.98, stdev=13.36 00:08:30.014 lat (usec): min=78, max=186, avg=97.94, stdev=13.38 00:08:30.014 clat percentiles (usec): 00:08:30.014 | 1.00th=[ 70], 5.00th=[ 73], 10.00th=[ 75], 20.00th=[ 77], 00:08:30.014 | 30.00th=[ 79], 40.00th=[ 81], 50.00th=[ 83], 60.00th=[ 85], 00:08:30.014 | 70.00th=[ 88], 80.00th=[ 91], 90.00th=[ 105], 95.00th=[ 120], 00:08:30.015 | 99.00th=[ 131], 99.50th=[ 135], 99.90th=[ 145], 99.95th=[ 159], 00:08:30.015 | 99.99th=[ 176] 00:08:30.015 bw ( KiB/s): min=20480, max=20480, per=27.63%, avg=20480.00, stdev= 0.00, samples=1 00:08:30.015 iops : min= 5120, max= 5120, avg=5120.00, stdev= 0.00, samples=1 00:08:30.015 lat (usec) : 100=84.84%, 250=15.16% 00:08:30.015 cpu : usr=6.10%, sys=10.80%, ctx=9774, majf=0, minf=2 00:08:30.015 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:30.015 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.015 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.015 issued rwts: total=4654,5120,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:30.015 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:30.015 job2: (groupid=0, jobs=1): err= 0: pid=1708225: Fri Sep 27 15:13:31 2024 00:08:30.015 read: IOPS=3770, BW=14.7MiB/s (15.4MB/s)(14.7MiB/1001msec) 00:08:30.015 slat (nsec): min=8472, max=46854, avg=9421.26, stdev=1143.18 00:08:30.015 clat (usec): min=76, max=390, avg=116.89, stdev=24.40 00:08:30.015 lat (usec): min=85, max=408, avg=126.32, stdev=24.46 00:08:30.015 clat percentiles (usec): 00:08:30.015 | 1.00th=[ 84], 5.00th=[ 88], 10.00th=[ 90], 20.00th=[ 93], 00:08:30.015 | 30.00th=[ 97], 40.00th=[ 102], 50.00th=[ 118], 60.00th=[ 127], 00:08:30.015 | 70.00th=[ 133], 80.00th=[ 137], 90.00th=[ 145], 95.00th=[ 155], 00:08:30.015 | 99.00th=[ 184], 99.50th=[ 188], 99.90th=[ 221], 99.95th=[ 229], 00:08:30.015 | 99.99th=[ 392] 00:08:30.015 write: IOPS=4091, BW=16.0MiB/s (16.8MB/s)(16.0MiB/1001msec); 0 zone resets 00:08:30.015 slat (nsec): min=10826, max=40450, avg=12163.78, stdev=1452.66 00:08:30.015 clat (usec): min=68, max=388, avg=110.66, stdev=24.81 00:08:30.015 lat (usec): min=79, max=400, avg=122.82, stdev=24.78 00:08:30.015 clat percentiles (usec): 00:08:30.015 | 1.00th=[ 79], 5.00th=[ 82], 10.00th=[ 85], 20.00th=[ 88], 00:08:30.015 | 30.00th=[ 91], 40.00th=[ 95], 50.00th=[ 105], 60.00th=[ 120], 00:08:30.015 | 70.00th=[ 127], 80.00th=[ 135], 90.00th=[ 141], 95.00th=[ 149], 00:08:30.015 | 99.00th=[ 180], 99.50th=[ 186], 99.90th=[ 194], 99.95th=[ 204], 00:08:30.015 | 99.99th=[ 388] 00:08:30.015 bw ( KiB/s): min=19424, max=19424, per=26.20%, avg=19424.00, stdev= 0.00, samples=1 00:08:30.015 iops : min= 4856, max= 4856, avg=4856.00, stdev= 0.00, samples=1 00:08:30.015 lat (usec) : 100=41.98%, 250=57.99%, 500=0.03% 00:08:30.015 cpu : usr=5.90%, sys=7.80%, ctx=7870, majf=0, minf=1 00:08:30.015 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:30.015 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.015 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.015 issued rwts: total=3774,4096,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:30.015 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:30.015 job3: (groupid=0, jobs=1): err= 0: pid=1708226: Fri Sep 27 15:13:31 2024 00:08:30.015 read: IOPS=4054, BW=15.8MiB/s (16.6MB/s)(15.9MiB/1001msec) 00:08:30.015 slat 
(nsec): min=8494, max=28344, avg=9188.16, stdev=1005.93 00:08:30.015 clat (usec): min=78, max=386, avg=111.46, stdev=24.22 00:08:30.015 lat (usec): min=86, max=395, avg=120.65, stdev=24.20 00:08:30.015 clat percentiles (usec): 00:08:30.015 | 1.00th=[ 83], 5.00th=[ 87], 10.00th=[ 89], 20.00th=[ 92], 00:08:30.015 | 30.00th=[ 95], 40.00th=[ 97], 50.00th=[ 101], 60.00th=[ 106], 00:08:30.015 | 70.00th=[ 128], 80.00th=[ 137], 90.00th=[ 145], 95.00th=[ 153], 00:08:30.015 | 99.00th=[ 180], 99.50th=[ 186], 99.90th=[ 200], 99.95th=[ 202], 00:08:30.015 | 99.99th=[ 388] 00:08:30.015 write: IOPS=4091, BW=16.0MiB/s (16.8MB/s)(16.0MiB/1001msec); 0 zone resets 00:08:30.015 slat (nsec): min=10645, max=40948, avg=11933.59, stdev=1309.39 00:08:30.015 clat (usec): min=74, max=294, avg=107.45, stdev=24.45 00:08:30.015 lat (usec): min=85, max=306, avg=119.38, stdev=24.40 00:08:30.015 clat percentiles (usec): 00:08:30.015 | 1.00th=[ 79], 5.00th=[ 82], 10.00th=[ 85], 20.00th=[ 88], 00:08:30.015 | 30.00th=[ 90], 40.00th=[ 93], 50.00th=[ 97], 60.00th=[ 104], 00:08:30.015 | 70.00th=[ 124], 80.00th=[ 133], 90.00th=[ 141], 95.00th=[ 149], 00:08:30.015 | 99.00th=[ 180], 99.50th=[ 186], 99.90th=[ 200], 99.95th=[ 204], 00:08:30.015 | 99.99th=[ 293] 00:08:30.015 bw ( KiB/s): min=20480, max=20480, per=27.63%, avg=20480.00, stdev= 0.00, samples=1 00:08:30.015 iops : min= 5120, max= 5120, avg=5120.00, stdev= 0.00, samples=1 00:08:30.015 lat (usec) : 100=52.27%, 250=47.70%, 500=0.02% 00:08:30.015 cpu : usr=4.70%, sys=9.50%, ctx=8155, majf=0, minf=1 00:08:30.015 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:30.015 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.015 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:30.015 issued rwts: total=4059,4096,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:30.015 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:30.015 00:08:30.015 Run status group 0 (all jobs): 00:08:30.015 READ: bw=68.7MiB/s (72.0MB/s), 14.7MiB/s-20.0MiB/s (15.4MB/s-20.9MB/s), io=68.8MiB (72.1MB), run=1001-1001msec 00:08:30.015 WRITE: bw=72.4MiB/s (75.9MB/s), 16.0MiB/s-20.4MiB/s (16.8MB/s-21.4MB/s), io=72.5MiB (76.0MB), run=1001-1001msec 00:08:30.015 00:08:30.015 Disk stats (read/write): 00:08:30.015 nvme0n1: ios=4242/4608, merge=0/0, ticks=338/333, in_queue=671, util=85.87% 00:08:30.015 nvme0n2: ios=4077/4096, merge=0/0, ticks=365/344, in_queue=709, util=86.66% 00:08:30.015 nvme0n3: ios=3277/3584, merge=0/0, ticks=348/366, in_queue=714, util=88.92% 00:08:30.015 nvme0n4: ios=3561/3584, merge=0/0, ticks=350/355, in_queue=705, util=89.67% 00:08:30.015 15:13:31 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@51 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 4096 -d 1 -t randwrite -r 1 -v 00:08:30.015 [global] 00:08:30.015 thread=1 00:08:30.015 invalidate=1 00:08:30.015 rw=randwrite 00:08:30.015 time_based=1 00:08:30.015 runtime=1 00:08:30.015 ioengine=libaio 00:08:30.015 direct=1 00:08:30.015 bs=4096 00:08:30.015 iodepth=1 00:08:30.015 norandommap=0 00:08:30.015 numjobs=1 00:08:30.015 00:08:30.015 verify_dump=1 00:08:30.015 verify_backlog=512 00:08:30.015 verify_state_save=0 00:08:30.015 do_verify=1 00:08:30.015 verify=crc32c-intel 00:08:30.015 [job0] 00:08:30.015 filename=/dev/nvme0n1 00:08:30.015 [job1] 00:08:30.015 filename=/dev/nvme0n2 00:08:30.015 [job2] 00:08:30.015 filename=/dev/nvme0n3 00:08:30.015 [job3] 00:08:30.015 filename=/dev/nvme0n4 00:08:30.015 Could not set 
queue depth (nvme0n1) 00:08:30.015 Could not set queue depth (nvme0n2) 00:08:30.015 Could not set queue depth (nvme0n3) 00:08:30.015 Could not set queue depth (nvme0n4) 00:08:30.274 job0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:30.274 job1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:30.274 job2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:30.274 job3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:30.274 fio-3.35 00:08:30.274 Starting 4 threads 00:08:31.652 00:08:31.652 job0: (groupid=0, jobs=1): err= 0: pid=1708523: Fri Sep 27 15:13:33 2024 00:08:31.652 read: IOPS=4603, BW=18.0MiB/s (18.9MB/s)(18.0MiB/1001msec) 00:08:31.652 slat (nsec): min=8362, max=36283, avg=9047.56, stdev=1040.11 00:08:31.652 clat (usec): min=66, max=171, avg=94.32, stdev=21.11 00:08:31.652 lat (usec): min=75, max=180, avg=103.37, stdev=21.23 00:08:31.652 clat percentiles (usec): 00:08:31.652 | 1.00th=[ 71], 5.00th=[ 74], 10.00th=[ 75], 20.00th=[ 78], 00:08:31.652 | 30.00th=[ 79], 40.00th=[ 82], 50.00th=[ 84], 60.00th=[ 88], 00:08:31.652 | 70.00th=[ 115], 80.00th=[ 122], 90.00th=[ 127], 95.00th=[ 130], 00:08:31.652 | 99.00th=[ 137], 99.50th=[ 141], 99.90th=[ 147], 99.95th=[ 147], 00:08:31.652 | 99.99th=[ 172] 00:08:31.652 write: IOPS=4868, BW=19.0MiB/s (19.9MB/s)(19.0MiB/1001msec); 0 zone resets 00:08:31.652 slat (nsec): min=10506, max=51256, avg=11601.90, stdev=1277.11 00:08:31.652 clat (usec): min=62, max=171, avg=90.75, stdev=19.09 00:08:31.652 lat (usec): min=73, max=219, avg=102.35, stdev=19.18 00:08:31.652 clat percentiles (usec): 00:08:31.652 | 1.00th=[ 67], 5.00th=[ 70], 10.00th=[ 72], 20.00th=[ 74], 00:08:31.652 | 30.00th=[ 76], 40.00th=[ 79], 50.00th=[ 82], 60.00th=[ 91], 00:08:31.652 | 70.00th=[ 110], 80.00th=[ 114], 90.00th=[ 118], 95.00th=[ 121], 00:08:31.652 | 99.00th=[ 127], 99.50th=[ 129], 99.90th=[ 135], 99.95th=[ 141], 00:08:31.652 | 99.99th=[ 172] 00:08:31.652 bw ( KiB/s): min=20439, max=20439, per=30.23%, avg=20439.00, stdev= 0.00, samples=1 00:08:31.652 iops : min= 5109, max= 5109, avg=5109.00, stdev= 0.00, samples=1 00:08:31.652 lat (usec) : 100=63.97%, 250=36.03% 00:08:31.652 cpu : usr=6.60%, sys=9.60%, ctx=9481, majf=0, minf=1 00:08:31.652 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:31.652 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.652 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.652 issued rwts: total=4608,4873,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:31.652 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:31.652 job1: (groupid=0, jobs=1): err= 0: pid=1708524: Fri Sep 27 15:13:33 2024 00:08:31.652 read: IOPS=3580, BW=14.0MiB/s (14.7MB/s)(14.0MiB/1001msec) 00:08:31.652 slat (nsec): min=8466, max=42006, avg=9787.67, stdev=2128.05 00:08:31.652 clat (usec): min=79, max=209, avg=126.93, stdev=12.51 00:08:31.652 lat (usec): min=88, max=219, avg=136.72, stdev=12.92 00:08:31.652 clat percentiles (usec): 00:08:31.652 | 1.00th=[ 101], 5.00th=[ 111], 10.00th=[ 114], 20.00th=[ 118], 00:08:31.652 | 30.00th=[ 121], 40.00th=[ 123], 50.00th=[ 126], 60.00th=[ 129], 00:08:31.652 | 70.00th=[ 133], 80.00th=[ 137], 90.00th=[ 143], 95.00th=[ 149], 00:08:31.652 | 99.00th=[ 163], 99.50th=[ 176], 99.90th=[ 194], 99.95th=[ 200], 00:08:31.652 
| 99.99th=[ 210] 00:08:31.652 write: IOPS=3739, BW=14.6MiB/s (15.3MB/s)(14.6MiB/1001msec); 0 zone resets 00:08:31.652 slat (nsec): min=8274, max=49356, avg=12165.86, stdev=2341.68 00:08:31.652 clat (usec): min=71, max=199, avg=118.79, stdev=12.23 00:08:31.652 lat (usec): min=83, max=235, avg=130.96, stdev=12.65 00:08:31.652 clat percentiles (usec): 00:08:31.652 | 1.00th=[ 95], 5.00th=[ 103], 10.00th=[ 106], 20.00th=[ 111], 00:08:31.652 | 30.00th=[ 113], 40.00th=[ 115], 50.00th=[ 117], 60.00th=[ 120], 00:08:31.652 | 70.00th=[ 123], 80.00th=[ 128], 90.00th=[ 135], 95.00th=[ 141], 00:08:31.652 | 99.00th=[ 153], 99.50th=[ 165], 99.90th=[ 190], 99.95th=[ 200], 00:08:31.652 | 99.99th=[ 200] 00:08:31.652 bw ( KiB/s): min=16384, max=16384, per=24.24%, avg=16384.00, stdev= 0.00, samples=1 00:08:31.652 iops : min= 4096, max= 4096, avg=4096.00, stdev= 0.00, samples=1 00:08:31.652 lat (usec) : 100=1.87%, 250=98.13% 00:08:31.652 cpu : usr=5.90%, sys=7.40%, ctx=7327, majf=0, minf=1 00:08:31.652 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:31.652 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.652 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.652 issued rwts: total=3584,3743,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:31.652 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:31.652 job2: (groupid=0, jobs=1): err= 0: pid=1708525: Fri Sep 27 15:13:33 2024 00:08:31.652 read: IOPS=4091, BW=16.0MiB/s (16.8MB/s)(16.0MiB/1001msec) 00:08:31.652 slat (nsec): min=8657, max=34873, avg=9248.39, stdev=1040.87 00:08:31.652 clat (usec): min=73, max=414, avg=108.02, stdev=16.63 00:08:31.652 lat (usec): min=82, max=430, avg=117.27, stdev=16.81 00:08:31.652 clat percentiles (usec): 00:08:31.652 | 1.00th=[ 83], 5.00th=[ 86], 10.00th=[ 88], 20.00th=[ 92], 00:08:31.652 | 30.00th=[ 95], 40.00th=[ 99], 50.00th=[ 109], 60.00th=[ 118], 00:08:31.652 | 70.00th=[ 121], 80.00th=[ 124], 90.00th=[ 128], 95.00th=[ 131], 00:08:31.652 | 99.00th=[ 139], 99.50th=[ 143], 99.90th=[ 161], 99.95th=[ 178], 00:08:31.652 | 99.99th=[ 416] 00:08:31.652 write: IOPS=4391, BW=17.2MiB/s (18.0MB/s)(17.2MiB/1001msec); 0 zone resets 00:08:31.652 slat (nsec): min=10512, max=79435, avg=11897.03, stdev=1858.93 00:08:31.652 clat (usec): min=72, max=168, avg=101.19, stdev=13.65 00:08:31.653 lat (usec): min=84, max=201, avg=113.09, stdev=13.92 00:08:31.653 clat percentiles (usec): 00:08:31.653 | 1.00th=[ 78], 5.00th=[ 81], 10.00th=[ 84], 20.00th=[ 87], 00:08:31.653 | 30.00th=[ 91], 40.00th=[ 95], 50.00th=[ 104], 60.00th=[ 110], 00:08:31.653 | 70.00th=[ 112], 80.00th=[ 115], 90.00th=[ 118], 95.00th=[ 121], 00:08:31.653 | 99.00th=[ 127], 99.50th=[ 131], 99.90th=[ 141], 99.95th=[ 143], 00:08:31.653 | 99.99th=[ 169] 00:08:31.653 bw ( KiB/s): min=17672, max=17672, per=26.14%, avg=17672.00, stdev= 0.00, samples=1 00:08:31.653 iops : min= 4418, max= 4418, avg=4418.00, stdev= 0.00, samples=1 00:08:31.653 lat (usec) : 100=44.21%, 250=55.78%, 500=0.01% 00:08:31.653 cpu : usr=6.60%, sys=8.20%, ctx=8495, majf=0, minf=1 00:08:31.653 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:31.653 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.653 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.653 issued rwts: total=4096,4396,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:31.653 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:31.653 job3: (groupid=0, jobs=1): err= 
0: pid=1708527: Fri Sep 27 15:13:33 2024 00:08:31.653 read: IOPS=3580, BW=14.0MiB/s (14.7MB/s)(14.0MiB/1001msec) 00:08:31.653 slat (nsec): min=8764, max=30001, avg=9324.75, stdev=965.75 00:08:31.653 clat (usec): min=86, max=191, avg=124.32, stdev=14.81 00:08:31.653 lat (usec): min=95, max=200, avg=133.64, stdev=14.81 00:08:31.653 clat percentiles (usec): 00:08:31.653 | 1.00th=[ 92], 5.00th=[ 98], 10.00th=[ 103], 20.00th=[ 115], 00:08:31.653 | 30.00th=[ 119], 40.00th=[ 122], 50.00th=[ 124], 60.00th=[ 128], 00:08:31.653 | 70.00th=[ 131], 80.00th=[ 137], 90.00th=[ 145], 95.00th=[ 149], 00:08:31.653 | 99.00th=[ 159], 99.50th=[ 165], 99.90th=[ 182], 99.95th=[ 190], 00:08:31.653 | 99.99th=[ 192] 00:08:31.653 write: IOPS=3902, BW=15.2MiB/s (16.0MB/s)(15.3MiB/1001msec); 0 zone resets 00:08:31.653 slat (nsec): min=10631, max=50814, avg=11793.04, stdev=1363.87 00:08:31.653 clat (usec): min=79, max=181, avg=116.80, stdev=14.54 00:08:31.653 lat (usec): min=90, max=220, avg=128.59, stdev=14.53 00:08:31.653 clat percentiles (usec): 00:08:31.653 | 1.00th=[ 86], 5.00th=[ 92], 10.00th=[ 98], 20.00th=[ 108], 00:08:31.653 | 30.00th=[ 112], 40.00th=[ 114], 50.00th=[ 116], 60.00th=[ 119], 00:08:31.653 | 70.00th=[ 123], 80.00th=[ 129], 90.00th=[ 137], 95.00th=[ 143], 00:08:31.653 | 99.00th=[ 155], 99.50th=[ 163], 99.90th=[ 178], 99.95th=[ 180], 00:08:31.653 | 99.99th=[ 182] 00:08:31.653 bw ( KiB/s): min=16351, max=16351, per=24.19%, avg=16351.00, stdev= 0.00, samples=1 00:08:31.653 iops : min= 4087, max= 4087, avg=4087.00, stdev= 0.00, samples=1 00:08:31.653 lat (usec) : 100=9.84%, 250=90.16% 00:08:31.653 cpu : usr=5.70%, sys=7.10%, ctx=7490, majf=0, minf=1 00:08:31.653 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:31.653 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.653 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:31.653 issued rwts: total=3584,3906,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:31.653 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:31.653 00:08:31.653 Run status group 0 (all jobs): 00:08:31.653 READ: bw=61.9MiB/s (64.9MB/s), 14.0MiB/s-18.0MiB/s (14.7MB/s-18.9MB/s), io=62.0MiB (65.0MB), run=1001-1001msec 00:08:31.653 WRITE: bw=66.0MiB/s (69.2MB/s), 14.6MiB/s-19.0MiB/s (15.3MB/s-19.9MB/s), io=66.1MiB (69.3MB), run=1001-1001msec 00:08:31.653 00:08:31.653 Disk stats (read/write): 00:08:31.653 nvme0n1: ios=4134/4096, merge=0/0, ticks=365/324, in_queue=689, util=84.17% 00:08:31.653 nvme0n2: ios=2961/3072, merge=0/0, ticks=348/340, in_queue=688, util=85.07% 00:08:31.653 nvme0n3: ios=3584/3607, merge=0/0, ticks=360/353, in_queue=713, util=88.32% 00:08:31.653 nvme0n4: ios=3072/3125, merge=0/0, ticks=365/347, in_queue=712, util=89.46% 00:08:31.653 15:13:33 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@52 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 4096 -d 128 -t write -r 1 -v 00:08:31.653 [global] 00:08:31.653 thread=1 00:08:31.653 invalidate=1 00:08:31.653 rw=write 00:08:31.653 time_based=1 00:08:31.653 runtime=1 00:08:31.653 ioengine=libaio 00:08:31.653 direct=1 00:08:31.653 bs=4096 00:08:31.653 iodepth=128 00:08:31.653 norandommap=0 00:08:31.653 numjobs=1 00:08:31.653 00:08:31.653 verify_dump=1 00:08:31.653 verify_backlog=512 00:08:31.653 verify_state_save=0 00:08:31.653 do_verify=1 00:08:31.653 verify=crc32c-intel 00:08:31.653 [job0] 00:08:31.653 filename=/dev/nvme0n1 00:08:31.653 [job1] 00:08:31.653 filename=/dev/nvme0n2 00:08:31.653 
[job2] 00:08:31.653 filename=/dev/nvme0n3 00:08:31.653 [job3] 00:08:31.653 filename=/dev/nvme0n4 00:08:31.653 Could not set queue depth (nvme0n1) 00:08:31.653 Could not set queue depth (nvme0n2) 00:08:31.653 Could not set queue depth (nvme0n3) 00:08:31.653 Could not set queue depth (nvme0n4) 00:08:31.653 job0: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:31.653 job1: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:31.653 job2: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:31.653 job3: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:31.653 fio-3.35 00:08:31.653 Starting 4 threads 00:08:33.033 00:08:33.033 job0: (groupid=0, jobs=1): err= 0: pid=1708833: Fri Sep 27 15:13:34 2024 00:08:33.033 read: IOPS=7742, BW=30.2MiB/s (31.7MB/s)(30.3MiB/1002msec) 00:08:33.033 slat (usec): min=2, max=5268, avg=59.46, stdev=279.40 00:08:33.033 clat (usec): min=461, max=19609, avg=8167.26, stdev=3329.86 00:08:33.033 lat (usec): min=481, max=19616, avg=8226.72, stdev=3347.72 00:08:33.033 clat percentiles (usec): 00:08:33.033 | 1.00th=[ 1893], 5.00th=[ 3687], 10.00th=[ 4948], 20.00th=[ 5669], 00:08:33.033 | 30.00th=[ 6259], 40.00th=[ 6652], 50.00th=[ 7111], 60.00th=[ 8094], 00:08:33.033 | 70.00th=[ 9634], 80.00th=[11076], 90.00th=[12518], 95.00th=[14746], 00:08:33.033 | 99.00th=[17957], 99.50th=[18482], 99.90th=[19530], 99.95th=[19530], 00:08:33.033 | 99.99th=[19530] 00:08:33.033 write: IOPS=8175, BW=31.9MiB/s (33.5MB/s)(32.0MiB/1002msec); 0 zone resets 00:08:33.033 slat (usec): min=2, max=4558, avg=57.34, stdev=247.09 00:08:33.033 clat (usec): min=992, max=20656, avg=7770.46, stdev=3206.46 00:08:33.033 lat (usec): min=1213, max=20680, avg=7827.80, stdev=3228.65 00:08:33.033 clat percentiles (usec): 00:08:33.033 | 1.00th=[ 2343], 5.00th=[ 4293], 10.00th=[ 4817], 20.00th=[ 5407], 00:08:33.033 | 30.00th=[ 5800], 40.00th=[ 6194], 50.00th=[ 6521], 60.00th=[ 7308], 00:08:33.033 | 70.00th=[ 8979], 80.00th=[10552], 90.00th=[11994], 95.00th=[13698], 00:08:33.033 | 99.00th=[18220], 99.50th=[19006], 99.90th=[20579], 99.95th=[20579], 00:08:33.033 | 99.99th=[20579] 00:08:33.033 bw ( KiB/s): min=36864, max=36864, per=36.78%, avg=36864.00, stdev= 0.00, samples=1 00:08:33.033 iops : min= 9216, max= 9216, avg=9216.00, stdev= 0.00, samples=1 00:08:33.033 lat (usec) : 500=0.01%, 750=0.01%, 1000=0.01% 00:08:33.033 lat (msec) : 2=0.79%, 4=4.03%, 10=70.31%, 20=24.75%, 50=0.09% 00:08:33.033 cpu : usr=5.09%, sys=7.89%, ctx=1206, majf=0, minf=1 00:08:33.033 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.2%, >=64=99.6% 00:08:33.033 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:33.033 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:33.033 issued rwts: total=7758,8192,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:33.033 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:33.033 job1: (groupid=0, jobs=1): err= 0: pid=1708834: Fri Sep 27 15:13:34 2024 00:08:33.033 read: IOPS=6131, BW=24.0MiB/s (25.1MB/s)(24.0MiB/1002msec) 00:08:33.033 slat (usec): min=2, max=4663, avg=82.82, stdev=368.81 00:08:33.033 clat (usec): min=3599, max=21508, avg=10988.70, stdev=3241.43 00:08:33.033 lat (usec): min=3604, max=21512, avg=11071.52, stdev=3258.89 00:08:33.033 clat percentiles (usec): 00:08:33.033 | 1.00th=[ 4359], 5.00th=[ 5932], 
10.00th=[ 6980], 20.00th=[ 8160], 00:08:33.033 | 30.00th=[ 9110], 40.00th=[10028], 50.00th=[11076], 60.00th=[11600], 00:08:33.033 | 70.00th=[12518], 80.00th=[13435], 90.00th=[15139], 95.00th=[17171], 00:08:33.033 | 99.00th=[19268], 99.50th=[19530], 99.90th=[19792], 99.95th=[21103], 00:08:33.033 | 99.99th=[21627] 00:08:33.033 write: IOPS=6177, BW=24.1MiB/s (25.3MB/s)(24.2MiB/1002msec); 0 zone resets 00:08:33.033 slat (usec): min=2, max=4204, avg=74.10, stdev=327.92 00:08:33.033 clat (usec): min=1214, max=18402, avg=9519.73, stdev=2683.63 00:08:33.033 lat (usec): min=2686, max=18422, avg=9593.83, stdev=2694.58 00:08:33.033 clat percentiles (usec): 00:08:33.033 | 1.00th=[ 4424], 5.00th=[ 5473], 10.00th=[ 6128], 20.00th=[ 6980], 00:08:33.033 | 30.00th=[ 7963], 40.00th=[ 8717], 50.00th=[ 9503], 60.00th=[10290], 00:08:33.033 | 70.00th=[10945], 80.00th=[11469], 90.00th=[13042], 95.00th=[14353], 00:08:33.033 | 99.00th=[16909], 99.50th=[17957], 99.90th=[18482], 99.95th=[18482], 00:08:33.033 | 99.99th=[18482] 00:08:33.033 bw ( KiB/s): min=26160, max=26160, per=26.10%, avg=26160.00, stdev= 0.00, samples=1 00:08:33.033 iops : min= 6540, max= 6540, avg=6540.00, stdev= 0.00, samples=1 00:08:33.033 lat (msec) : 2=0.01%, 4=0.41%, 10=47.43%, 20=52.12%, 50=0.03% 00:08:33.033 cpu : usr=3.10%, sys=7.39%, ctx=1064, majf=0, minf=1 00:08:33.033 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.3%, >=64=99.5% 00:08:33.033 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:33.033 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:33.033 issued rwts: total=6144,6190,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:33.033 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:33.033 job2: (groupid=0, jobs=1): err= 0: pid=1708835: Fri Sep 27 15:13:34 2024 00:08:33.033 read: IOPS=4653, BW=18.2MiB/s (19.1MB/s)(18.2MiB/1003msec) 00:08:33.033 slat (usec): min=2, max=4764, avg=98.56, stdev=434.01 00:08:33.033 clat (usec): min=1594, max=22090, avg=12936.34, stdev=3661.79 00:08:33.033 lat (usec): min=2375, max=22102, avg=13034.89, stdev=3670.70 00:08:33.033 clat percentiles (usec): 00:08:33.033 | 1.00th=[ 4948], 5.00th=[ 6915], 10.00th=[ 8225], 20.00th=[ 9503], 00:08:33.033 | 30.00th=[10814], 40.00th=[11994], 50.00th=[12780], 60.00th=[13698], 00:08:33.033 | 70.00th=[15008], 80.00th=[16450], 90.00th=[18220], 95.00th=[19006], 00:08:33.033 | 99.00th=[20055], 99.50th=[20317], 99.90th=[20841], 99.95th=[20841], 00:08:33.033 | 99.99th=[22152] 00:08:33.034 write: IOPS=5104, BW=19.9MiB/s (20.9MB/s)(20.0MiB/1003msec); 0 zone resets 00:08:33.034 slat (usec): min=2, max=4603, avg=100.29, stdev=385.22 00:08:33.034 clat (usec): min=4269, max=22430, avg=12964.41, stdev=3517.28 00:08:33.034 lat (usec): min=4272, max=25091, avg=13064.70, stdev=3534.70 00:08:33.034 clat percentiles (usec): 00:08:33.034 | 1.00th=[ 6521], 5.00th=[ 7635], 10.00th=[ 8094], 20.00th=[ 9372], 00:08:33.034 | 30.00th=[10683], 40.00th=[12256], 50.00th=[13042], 60.00th=[13829], 00:08:33.034 | 70.00th=[15008], 80.00th=[16057], 90.00th=[17695], 95.00th=[19006], 00:08:33.034 | 99.00th=[20579], 99.50th=[21365], 99.90th=[22414], 99.95th=[22414], 00:08:33.034 | 99.99th=[22414] 00:08:33.034 bw ( KiB/s): min=19856, max=20560, per=20.16%, avg=20208.00, stdev=497.80, samples=2 00:08:33.034 iops : min= 4964, max= 5140, avg=5052.00, stdev=124.45, samples=2 00:08:33.034 lat (msec) : 2=0.01%, 4=0.19%, 10=24.31%, 20=73.92%, 50=1.56% 00:08:33.034 cpu : usr=2.89%, sys=5.99%, ctx=911, majf=0, minf=1 00:08:33.034 IO depths 
: 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.2%, 32=0.3%, >=64=99.4% 00:08:33.034 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:33.034 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:33.034 issued rwts: total=4667,5120,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:33.034 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:33.034 job3: (groupid=0, jobs=1): err= 0: pid=1708836: Fri Sep 27 15:13:34 2024 00:08:33.034 read: IOPS=5587, BW=21.8MiB/s (22.9MB/s)(21.9MiB/1003msec) 00:08:33.034 slat (usec): min=2, max=5431, avg=88.35, stdev=376.02 00:08:33.034 clat (usec): min=1572, max=24741, avg=11385.45, stdev=4035.39 00:08:33.034 lat (usec): min=3424, max=24745, avg=11473.80, stdev=4056.53 00:08:33.034 clat percentiles (usec): 00:08:33.034 | 1.00th=[ 5080], 5.00th=[ 6390], 10.00th=[ 6783], 20.00th=[ 7504], 00:08:33.034 | 30.00th=[ 8291], 40.00th=[ 9634], 50.00th=[10945], 60.00th=[12649], 00:08:33.034 | 70.00th=[13304], 80.00th=[14615], 90.00th=[16909], 95.00th=[18744], 00:08:33.034 | 99.00th=[24249], 99.50th=[24511], 99.90th=[24773], 99.95th=[24773], 00:08:33.034 | 99.99th=[24773] 00:08:33.034 write: IOPS=5615, BW=21.9MiB/s (23.0MB/s)(22.0MiB/1003msec); 0 zone resets 00:08:33.034 slat (usec): min=2, max=6611, avg=84.39, stdev=352.79 00:08:33.034 clat (usec): min=3516, max=26613, avg=11211.01, stdev=4540.02 00:08:33.034 lat (usec): min=3526, max=26694, avg=11295.40, stdev=4569.72 00:08:33.034 clat percentiles (usec): 00:08:33.034 | 1.00th=[ 4817], 5.00th=[ 5735], 10.00th=[ 6390], 20.00th=[ 7308], 00:08:33.034 | 30.00th=[ 8160], 40.00th=[ 8455], 50.00th=[ 9503], 60.00th=[12256], 00:08:33.034 | 70.00th=[13173], 80.00th=[15795], 90.00th=[17957], 95.00th=[19006], 00:08:33.034 | 99.00th=[23987], 99.50th=[25297], 99.90th=[26084], 99.95th=[26608], 00:08:33.034 | 99.99th=[26608] 00:08:33.034 bw ( KiB/s): min=20480, max=24576, per=22.48%, avg=22528.00, stdev=2896.31, samples=2 00:08:33.034 iops : min= 5120, max= 6144, avg=5632.00, stdev=724.08, samples=2 00:08:33.034 lat (msec) : 2=0.01%, 4=0.34%, 10=47.88%, 20=49.06%, 50=2.71% 00:08:33.034 cpu : usr=3.59%, sys=6.39%, ctx=1064, majf=0, minf=1 00:08:33.034 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.3%, >=64=99.4% 00:08:33.034 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:33.034 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:33.034 issued rwts: total=5604,5632,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:33.034 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:33.034 00:08:33.034 Run status group 0 (all jobs): 00:08:33.034 READ: bw=94.1MiB/s (98.7MB/s), 18.2MiB/s-30.2MiB/s (19.1MB/s-31.7MB/s), io=94.4MiB (99.0MB), run=1002-1003msec 00:08:33.034 WRITE: bw=97.9MiB/s (103MB/s), 19.9MiB/s-31.9MiB/s (20.9MB/s-33.5MB/s), io=98.2MiB (103MB), run=1002-1003msec 00:08:33.034 00:08:33.034 Disk stats (read/write): 00:08:33.034 nvme0n1: ios=6750/7168, merge=0/0, ticks=20535/22427, in_queue=42962, util=85.87% 00:08:33.034 nvme0n2: ios=5120/5505, merge=0/0, ticks=18937/16602, in_queue=35539, util=86.25% 00:08:33.034 nvme0n3: ios=4096/4295, merge=0/0, ticks=14735/14593, in_queue=29328, util=88.82% 00:08:33.034 nvme0n4: ios=4729/5120, merge=0/0, ticks=15623/16739, in_queue=32362, util=89.36% 00:08:33.034 15:13:34 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@53 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 4096 -d 128 -t randwrite -r 1 -v 00:08:33.034 [global] 
00:08:33.034 thread=1 00:08:33.034 invalidate=1 00:08:33.034 rw=randwrite 00:08:33.034 time_based=1 00:08:33.034 runtime=1 00:08:33.034 ioengine=libaio 00:08:33.034 direct=1 00:08:33.034 bs=4096 00:08:33.034 iodepth=128 00:08:33.034 norandommap=0 00:08:33.034 numjobs=1 00:08:33.034 00:08:33.034 verify_dump=1 00:08:33.034 verify_backlog=512 00:08:33.034 verify_state_save=0 00:08:33.034 do_verify=1 00:08:33.034 verify=crc32c-intel 00:08:33.034 [job0] 00:08:33.034 filename=/dev/nvme0n1 00:08:33.034 [job1] 00:08:33.034 filename=/dev/nvme0n2 00:08:33.034 [job2] 00:08:33.034 filename=/dev/nvme0n3 00:08:33.034 [job3] 00:08:33.034 filename=/dev/nvme0n4 00:08:33.034 Could not set queue depth (nvme0n1) 00:08:33.034 Could not set queue depth (nvme0n2) 00:08:33.034 Could not set queue depth (nvme0n3) 00:08:33.034 Could not set queue depth (nvme0n4) 00:08:33.293 job0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:33.293 job1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:33.293 job2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:33.294 job3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:08:33.294 fio-3.35 00:08:33.294 Starting 4 threads 00:08:34.675 00:08:34.675 job0: (groupid=0, jobs=1): err= 0: pid=1709145: Fri Sep 27 15:13:36 2024 00:08:34.675 read: IOPS=5342, BW=20.9MiB/s (21.9MB/s)(20.9MiB/1002msec) 00:08:34.675 slat (usec): min=2, max=4878, avg=88.68, stdev=393.57 00:08:34.675 clat (usec): min=723, max=24623, avg=11797.71, stdev=4450.43 00:08:34.675 lat (usec): min=2360, max=27071, avg=11886.39, stdev=4473.40 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4752], 5.00th=[ 6194], 10.00th=[ 6915], 20.00th=[ 7570], 00:08:34.675 | 30.00th=[ 8717], 40.00th=[ 9765], 50.00th=[10945], 60.00th=[12387], 00:08:34.675 | 70.00th=[13829], 80.00th=[15926], 90.00th=[18220], 95.00th=[20317], 00:08:34.675 | 99.00th=[22938], 99.50th=[23200], 99.90th=[23987], 99.95th=[24511], 00:08:34.675 | 99.99th=[24511] 00:08:34.675 write: IOPS=5620, BW=22.0MiB/s (23.0MB/s)(22.0MiB/1002msec); 0 zone resets 00:08:34.675 slat (usec): min=2, max=5425, avg=88.12, stdev=379.27 00:08:34.675 clat (usec): min=2574, max=22081, avg=11275.97, stdev=4007.36 00:08:34.675 lat (usec): min=2595, max=22090, avg=11364.09, stdev=4028.01 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4555], 5.00th=[ 6063], 10.00th=[ 6587], 20.00th=[ 7177], 00:08:34.675 | 30.00th=[ 8455], 40.00th=[ 9503], 50.00th=[10552], 60.00th=[11863], 00:08:34.675 | 70.00th=[13698], 80.00th=[15664], 90.00th=[16450], 95.00th=[18482], 00:08:34.675 | 99.00th=[20317], 99.50th=[21365], 99.90th=[21890], 99.95th=[22152], 00:08:34.675 | 99.99th=[22152] 00:08:34.675 bw ( KiB/s): min=19064, max=25992, per=23.11%, avg=22528.00, stdev=4898.84, samples=2 00:08:34.675 iops : min= 4766, max= 6498, avg=5632.00, stdev=1224.71, samples=2 00:08:34.675 lat (usec) : 750=0.01% 00:08:34.675 lat (msec) : 4=0.37%, 10=42.42%, 20=53.85%, 50=3.35% 00:08:34.675 cpu : usr=3.50%, sys=6.49%, ctx=1172, majf=0, minf=1 00:08:34.675 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.3%, >=64=99.4% 00:08:34.675 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:34.675 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:34.675 issued rwts: total=5353,5632,0,0 short=0,0,0,0 
dropped=0,0,0,0 00:08:34.675 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:34.675 job1: (groupid=0, jobs=1): err= 0: pid=1709153: Fri Sep 27 15:13:36 2024 00:08:34.675 read: IOPS=8069, BW=31.5MiB/s (33.1MB/s)(31.6MiB/1003msec) 00:08:34.675 slat (usec): min=2, max=4993, avg=60.35, stdev=283.88 00:08:34.675 clat (usec): min=1060, max=19600, avg=7873.86, stdev=2355.81 00:08:34.675 lat (usec): min=2658, max=19611, avg=7934.21, stdev=2368.60 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4178], 5.00th=[ 5014], 10.00th=[ 5669], 20.00th=[ 6325], 00:08:34.675 | 30.00th=[ 6652], 40.00th=[ 6915], 50.00th=[ 7111], 60.00th=[ 7439], 00:08:34.675 | 70.00th=[ 8356], 80.00th=[ 9372], 90.00th=[11469], 95.00th=[12911], 00:08:34.675 | 99.00th=[15008], 99.50th=[16909], 99.90th=[17695], 99.95th=[17695], 00:08:34.675 | 99.99th=[19530] 00:08:34.675 write: IOPS=8167, BW=31.9MiB/s (33.5MB/s)(32.0MiB/1003msec); 0 zone resets 00:08:34.675 slat (usec): min=2, max=4866, avg=57.26, stdev=266.70 00:08:34.675 clat (usec): min=3382, max=17480, avg=7733.23, stdev=2675.76 00:08:34.675 lat (usec): min=3391, max=18952, avg=7790.49, stdev=2694.18 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4080], 5.00th=[ 4817], 10.00th=[ 5211], 20.00th=[ 5997], 00:08:34.675 | 30.00th=[ 6259], 40.00th=[ 6456], 50.00th=[ 6718], 60.00th=[ 7111], 00:08:34.675 | 70.00th=[ 8029], 80.00th=[ 9503], 90.00th=[12256], 95.00th=[13829], 00:08:34.675 | 99.00th=[15664], 99.50th=[16188], 99.90th=[16581], 99.95th=[16581], 00:08:34.675 | 99.99th=[17433] 00:08:34.675 bw ( KiB/s): min=32768, max=32768, per=33.61%, avg=32768.00, stdev= 0.00, samples=2 00:08:34.675 iops : min= 8192, max= 8192, avg=8192.00, stdev= 0.00, samples=2 00:08:34.675 lat (msec) : 2=0.01%, 4=0.63%, 10=81.94%, 20=17.42% 00:08:34.675 cpu : usr=5.29%, sys=8.98%, ctx=1419, majf=0, minf=2 00:08:34.675 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.2%, >=64=99.6% 00:08:34.675 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:34.675 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:34.675 issued rwts: total=8094,8192,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:34.675 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:34.675 job2: (groupid=0, jobs=1): err= 0: pid=1709173: Fri Sep 27 15:13:36 2024 00:08:34.675 read: IOPS=5204, BW=20.3MiB/s (21.3MB/s)(20.4MiB/1003msec) 00:08:34.675 slat (usec): min=2, max=5403, avg=91.83, stdev=406.87 00:08:34.675 clat (usec): min=2407, max=20542, avg=11850.35, stdev=3296.21 00:08:34.675 lat (usec): min=3153, max=20547, avg=11942.18, stdev=3310.95 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 6063], 5.00th=[ 7439], 10.00th=[ 7898], 20.00th=[ 8356], 00:08:34.675 | 30.00th=[ 9241], 40.00th=[10421], 50.00th=[11863], 60.00th=[13042], 00:08:34.675 | 70.00th=[14091], 80.00th=[15139], 90.00th=[16057], 95.00th=[16712], 00:08:34.675 | 99.00th=[19006], 99.50th=[19268], 99.90th=[20579], 99.95th=[20579], 00:08:34.675 | 99.99th=[20579] 00:08:34.675 write: IOPS=5615, BW=21.9MiB/s (23.0MB/s)(22.0MiB/1003msec); 0 zone resets 00:08:34.675 slat (usec): min=2, max=5751, avg=87.32, stdev=403.01 00:08:34.675 clat (usec): min=3900, max=20781, avg=11498.95, stdev=3626.10 00:08:34.675 lat (usec): min=4406, max=20785, avg=11586.26, stdev=3640.46 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4948], 5.00th=[ 6652], 10.00th=[ 7308], 20.00th=[ 7635], 00:08:34.675 | 30.00th=[ 8225], 40.00th=[10028], 50.00th=[11469], 60.00th=[12911], 
00:08:34.675 | 70.00th=[14222], 80.00th=[15139], 90.00th=[15926], 95.00th=[16712], 00:08:34.675 | 99.00th=[20055], 99.50th=[20055], 99.90th=[20841], 99.95th=[20841], 00:08:34.675 | 99.99th=[20841] 00:08:34.675 bw ( KiB/s): min=20392, max=24448, per=23.00%, avg=22420.00, stdev=2868.03, samples=2 00:08:34.675 iops : min= 5098, max= 6112, avg=5605.00, stdev=717.01, samples=2 00:08:34.675 lat (msec) : 4=0.14%, 10=37.85%, 20=61.36%, 50=0.65% 00:08:34.675 cpu : usr=3.39%, sys=6.29%, ctx=1078, majf=0, minf=1 00:08:34.675 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.3%, >=64=99.4% 00:08:34.675 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:34.675 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:34.675 issued rwts: total=5220,5632,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:34.675 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:34.675 job3: (groupid=0, jobs=1): err= 0: pid=1709181: Fri Sep 27 15:13:36 2024 00:08:34.675 read: IOPS=4594, BW=17.9MiB/s (18.8MB/s)(18.0MiB/1003msec) 00:08:34.675 slat (usec): min=2, max=5559, avg=100.37, stdev=473.80 00:08:34.675 clat (usec): min=3220, max=20359, avg=13036.88, stdev=3248.03 00:08:34.675 lat (usec): min=3228, max=20367, avg=13137.24, stdev=3251.78 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4948], 5.00th=[ 7177], 10.00th=[ 8717], 20.00th=[10028], 00:08:34.675 | 30.00th=[11076], 40.00th=[12518], 50.00th=[13698], 60.00th=[14353], 00:08:34.675 | 70.00th=[15139], 80.00th=[15795], 90.00th=[16450], 95.00th=[17957], 00:08:34.675 | 99.00th=[19792], 99.50th=[19792], 99.90th=[20317], 99.95th=[20317], 00:08:34.675 | 99.99th=[20317] 00:08:34.675 write: IOPS=4974, BW=19.4MiB/s (20.4MB/s)(19.5MiB/1003msec); 0 zone resets 00:08:34.675 slat (usec): min=2, max=7690, avg=102.37, stdev=467.51 00:08:34.675 clat (usec): min=686, max=22860, avg=13387.48, stdev=3559.77 00:08:34.675 lat (usec): min=3190, max=23144, avg=13489.85, stdev=3568.79 00:08:34.675 clat percentiles (usec): 00:08:34.675 | 1.00th=[ 4883], 5.00th=[ 6456], 10.00th=[ 7963], 20.00th=[10683], 00:08:34.675 | 30.00th=[11600], 40.00th=[13042], 50.00th=[13960], 60.00th=[15139], 00:08:34.675 | 70.00th=[15533], 80.00th=[16057], 90.00th=[17171], 95.00th=[18744], 00:08:34.675 | 99.00th=[21103], 99.50th=[21365], 99.90th=[21890], 99.95th=[22676], 00:08:34.675 | 99.99th=[22938] 00:08:34.675 bw ( KiB/s): min=18408, max=20480, per=19.95%, avg=19444.00, stdev=1465.13, samples=2 00:08:34.675 iops : min= 4602, max= 5120, avg=4861.00, stdev=366.28, samples=2 00:08:34.675 lat (usec) : 750=0.01% 00:08:34.675 lat (msec) : 4=0.54%, 10=17.58%, 20=80.76%, 50=1.10% 00:08:34.675 cpu : usr=3.19%, sys=5.99%, ctx=774, majf=0, minf=1 00:08:34.675 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.2%, 32=0.3%, >=64=99.3% 00:08:34.675 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:34.675 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:34.675 issued rwts: total=4608,4989,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:34.675 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:34.675 00:08:34.676 Run status group 0 (all jobs): 00:08:34.676 READ: bw=90.6MiB/s (95.0MB/s), 17.9MiB/s-31.5MiB/s (18.8MB/s-33.1MB/s), io=90.9MiB (95.3MB), run=1002-1003msec 00:08:34.676 WRITE: bw=95.2MiB/s (99.8MB/s), 19.4MiB/s-31.9MiB/s (20.4MB/s-33.5MB/s), io=95.5MiB (100MB), run=1002-1003msec 00:08:34.676 00:08:34.676 Disk stats (read/write): 00:08:34.676 nvme0n1: ios=4362/4608, merge=0/0, 
ticks=13602/14779, in_queue=28381, util=83.97% 00:08:34.676 nvme0n2: ios=6919/7168, merge=0/0, ticks=15024/14322, in_queue=29346, util=84.34% 00:08:34.676 nvme0n3: ios=4451/4608, merge=0/0, ticks=14329/14364, in_queue=28693, util=87.35% 00:08:34.676 nvme0n4: ios=3661/4096, merge=0/0, ticks=13003/14822, in_queue=27825, util=88.70% 00:08:34.676 15:13:36 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@55 -- # sync 00:08:34.676 15:13:36 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@59 -- # fio_pid=1709326 00:08:34.676 15:13:36 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@61 -- # sleep 3 00:08:34.676 15:13:36 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@58 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 4096 -d 1 -t read -r 10 00:08:34.676 [global] 00:08:34.676 thread=1 00:08:34.676 invalidate=1 00:08:34.676 rw=read 00:08:34.676 time_based=1 00:08:34.676 runtime=10 00:08:34.676 ioengine=libaio 00:08:34.676 direct=1 00:08:34.676 bs=4096 00:08:34.676 iodepth=1 00:08:34.676 norandommap=1 00:08:34.676 numjobs=1 00:08:34.676 00:08:34.676 [job0] 00:08:34.676 filename=/dev/nvme0n1 00:08:34.676 [job1] 00:08:34.676 filename=/dev/nvme0n2 00:08:34.676 [job2] 00:08:34.676 filename=/dev/nvme0n3 00:08:34.676 [job3] 00:08:34.676 filename=/dev/nvme0n4 00:08:34.676 Could not set queue depth (nvme0n1) 00:08:34.676 Could not set queue depth (nvme0n2) 00:08:34.676 Could not set queue depth (nvme0n3) 00:08:34.676 Could not set queue depth (nvme0n4) 00:08:34.934 job0: (g=0): rw=read, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:34.934 job1: (g=0): rw=read, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:34.934 job2: (g=0): rw=read, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:34.934 job3: (g=0): rw=read, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=1 00:08:34.934 fio-3.35 00:08:34.934 Starting 4 threads 00:08:37.468 15:13:39 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_raid_delete concat0 00:08:37.726 fio: io_u error on file /dev/nvme0n4: Operation not supported: read offset=77529088, buflen=4096 00:08:37.726 fio: pid=1709589, err=95/file:io_u.c:1889, func=io_u error, error=Operation not supported 00:08:37.726 15:13:39 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_raid_delete raid0 00:08:38.054 fio: io_u error on file /dev/nvme0n3: Operation not supported: read offset=84598784, buflen=4096 00:08:38.054 fio: pid=1709582, err=95/file:io_u.c:1889, func=io_u error, error=Operation not supported 00:08:38.054 15:13:39 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 00:08:38.054 15:13:39 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc0 00:08:38.054 fio: io_u error on file /dev/nvme0n1: Operation not supported: read offset=38350848, buflen=4096 00:08:38.054 fio: pid=1709544, err=95/file:io_u.c:1889, func=io_u error, error=Operation not supported 00:08:38.362 15:13:39 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 
00:08:38.362 15:13:39 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc1 00:08:38.362 fio: io_u error on file /dev/nvme0n2: Operation not supported: read offset=37240832, buflen=4096 00:08:38.362 fio: pid=1709562, err=95/file:io_u.c:1889, func=io_u error, error=Operation not supported 00:08:38.362 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 00:08:38.362 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc2 00:08:38.362 00:08:38.362 job0: (groupid=0, jobs=1): err=95 (file:io_u.c:1889, func=io_u error, error=Operation not supported): pid=1709544: Fri Sep 27 15:13:40 2024 00:08:38.362 read: IOPS=8155, BW=31.9MiB/s (33.4MB/s)(101MiB/3157msec) 00:08:38.362 slat (usec): min=8, max=12800, avg=10.58, stdev=126.15 00:08:38.362 clat (usec): min=52, max=384, avg=110.28, stdev=33.67 00:08:38.362 lat (usec): min=61, max=12902, avg=120.86, stdev=130.46 00:08:38.362 clat percentiles (usec): 00:08:38.362 | 1.00th=[ 69], 5.00th=[ 75], 10.00th=[ 77], 20.00th=[ 79], 00:08:38.362 | 30.00th=[ 81], 40.00th=[ 84], 50.00th=[ 89], 60.00th=[ 135], 00:08:38.362 | 70.00th=[ 145], 80.00th=[ 147], 90.00th=[ 151], 95.00th=[ 157], 00:08:38.362 | 99.00th=[ 167], 99.50th=[ 178], 99.90th=[ 212], 99.95th=[ 217], 00:08:38.362 | 99.99th=[ 229] 00:08:38.362 bw ( KiB/s): min=24576, max=42952, per=30.28%, avg=32396.67, stdev=7843.49, samples=6 00:08:38.362 iops : min= 6144, max=10738, avg=8099.17, stdev=1960.87, samples=6 00:08:38.362 lat (usec) : 100=53.81%, 250=46.19%, 500=0.01% 00:08:38.362 cpu : usr=3.20%, sys=8.59%, ctx=25754, majf=0, minf=1 00:08:38.362 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:38.362 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 complete : 0=0.1%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 issued rwts: total=25748,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:38.362 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:38.362 job1: (groupid=0, jobs=1): err=95 (file:io_u.c:1889, func=io_u error, error=Operation not supported): pid=1709562: Fri Sep 27 15:13:40 2024 00:08:38.362 read: IOPS=7504, BW=29.3MiB/s (30.7MB/s)(99.5MiB/3395msec) 00:08:38.362 slat (usec): min=7, max=12993, avg=11.26, stdev=150.31 00:08:38.362 clat (usec): min=37, max=323, avg=119.58, stdev=35.19 00:08:38.362 lat (usec): min=59, max=13073, avg=130.84, stdev=153.98 00:08:38.362 clat percentiles (usec): 00:08:38.362 | 1.00th=[ 55], 5.00th=[ 59], 10.00th=[ 64], 20.00th=[ 79], 00:08:38.362 | 30.00th=[ 87], 40.00th=[ 123], 50.00th=[ 135], 60.00th=[ 143], 00:08:38.362 | 70.00th=[ 147], 80.00th=[ 149], 90.00th=[ 155], 95.00th=[ 161], 00:08:38.362 | 99.00th=[ 169], 99.50th=[ 176], 99.90th=[ 198], 99.95th=[ 206], 00:08:38.362 | 99.99th=[ 217] 00:08:38.362 bw ( KiB/s): min=24760, max=35412, per=26.45%, avg=28294.00, stdev=4060.47, samples=6 00:08:38.362 iops : min= 6190, max= 8853, avg=7073.50, stdev=1015.12, samples=6 00:08:38.362 lat (usec) : 50=0.02%, 100=32.73%, 250=67.25%, 500=0.01% 00:08:38.362 cpu : usr=3.01%, sys=7.90%, ctx=25484, majf=0, minf=2 00:08:38.362 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:38.362 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 
00:08:38.362 complete : 0=0.1%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 issued rwts: total=25477,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:38.362 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:38.362 job2: (groupid=0, jobs=1): err=95 (file:io_u.c:1889, func=io_u error, error=Operation not supported): pid=1709582: Fri Sep 27 15:13:40 2024 00:08:38.362 read: IOPS=7006, BW=27.4MiB/s (28.7MB/s)(80.7MiB/2948msec) 00:08:38.362 slat (usec): min=8, max=11800, avg=10.26, stdev=98.61 00:08:38.362 clat (usec): min=73, max=326, avg=130.73, stdev=26.00 00:08:38.362 lat (usec): min=83, max=11927, avg=140.99, stdev=101.92 00:08:38.362 clat percentiles (usec): 00:08:38.362 | 1.00th=[ 81], 5.00th=[ 85], 10.00th=[ 88], 20.00th=[ 97], 00:08:38.362 | 30.00th=[ 123], 40.00th=[ 131], 50.00th=[ 141], 60.00th=[ 145], 00:08:38.362 | 70.00th=[ 147], 80.00th=[ 151], 90.00th=[ 157], 95.00th=[ 163], 00:08:38.362 | 99.00th=[ 176], 99.50th=[ 188], 99.90th=[ 212], 99.95th=[ 217], 00:08:38.362 | 99.99th=[ 227] 00:08:38.362 bw ( KiB/s): min=24624, max=33920, per=25.72%, avg=27520.00, stdev=3770.75, samples=5 00:08:38.362 iops : min= 6156, max= 8480, avg=6880.00, stdev=942.69, samples=5 00:08:38.362 lat (usec) : 100=20.92%, 250=79.08%, 500=0.01% 00:08:38.362 cpu : usr=3.05%, sys=7.19%, ctx=20658, majf=0, minf=2 00:08:38.362 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:38.362 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 complete : 0=0.1%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 issued rwts: total=20655,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:38.362 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:38.362 job3: (groupid=0, jobs=1): err=95 (file:io_u.c:1889, func=io_u error, error=Operation not supported): pid=1709589: Fri Sep 27 15:13:40 2024 00:08:38.362 read: IOPS=6918, BW=27.0MiB/s (28.3MB/s)(73.9MiB/2736msec) 00:08:38.362 slat (nsec): min=8260, max=83972, avg=9202.75, stdev=1298.45 00:08:38.362 clat (usec): min=71, max=233, avg=133.51, stdev=25.41 00:08:38.362 lat (usec): min=84, max=243, avg=142.71, stdev=25.47 00:08:38.362 clat percentiles (usec): 00:08:38.362 | 1.00th=[ 84], 5.00th=[ 89], 10.00th=[ 92], 20.00th=[ 99], 00:08:38.362 | 30.00th=[ 124], 40.00th=[ 141], 50.00th=[ 145], 60.00th=[ 147], 00:08:38.362 | 70.00th=[ 149], 80.00th=[ 153], 90.00th=[ 159], 95.00th=[ 163], 00:08:38.362 | 99.00th=[ 174], 99.50th=[ 182], 99.90th=[ 204], 99.95th=[ 215], 00:08:38.362 | 99.99th=[ 231] 00:08:38.362 bw ( KiB/s): min=24840, max=35184, per=25.86%, avg=27665.60, stdev=4283.41, samples=5 00:08:38.362 iops : min= 6210, max= 8796, avg=6916.40, stdev=1070.85, samples=5 00:08:38.362 lat (usec) : 100=20.80%, 250=79.20% 00:08:38.362 cpu : usr=3.07%, sys=7.09%, ctx=18930, majf=0, minf=1 00:08:38.362 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:38.362 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 complete : 0=0.1%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:38.362 issued rwts: total=18929,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:38.362 latency : target=0, window=0, percentile=100.00%, depth=1 00:08:38.362 00:08:38.362 Run status group 0 (all jobs): 00:08:38.362 READ: bw=104MiB/s (110MB/s), 27.0MiB/s-31.9MiB/s (28.3MB/s-33.4MB/s), io=355MiB (372MB), run=2736-3395msec 00:08:38.362 00:08:38.362 Disk stats (read/write): 00:08:38.362 nvme0n1: ios=25006/0, merge=0/0, ticks=2626/0, 
in_queue=2626, util=93.90% 00:08:38.362 nvme0n2: ios=24975/0, merge=0/0, ticks=2886/0, in_queue=2886, util=94.36% 00:08:38.362 nvme0n3: ios=19615/0, merge=0/0, ticks=2473/0, in_queue=2473, util=95.72% 00:08:38.362 nvme0n4: ios=17852/0, merge=0/0, ticks=2280/0, in_queue=2280, util=96.42% 00:08:38.621 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 00:08:38.621 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc3 00:08:38.880 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 00:08:38.880 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc4 00:08:39.138 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 00:08:39.138 15:13:40 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc5 00:08:39.397 15:13:41 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@65 -- # for malloc_bdev in $malloc_bdevs $raid_malloc_bdevs $concat_malloc_bdevs 00:08:39.397 15:13:41 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@66 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_delete Malloc6 00:08:39.397 15:13:41 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@69 -- # fio_status=0 00:08:39.397 15:13:41 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@70 -- # wait 1709326 00:08:39.397 15:13:41 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@70 -- # fio_status=4 00:08:39.397 15:13:41 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@72 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:08:40.335 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@73 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1219 -- # local i=0 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1231 -- # return 0 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@75 -- # '[' 4 -eq 0 ']' 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@80 -- # echo 'nvmf hotplug test: fio failed as expected' 00:08:40.335 nvmf hotplug test: fio failed as expected 00:08:40.335 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@83 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem 
nqn.2016-06.io.spdk:cnode1 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@85 -- # rm -f ./local-job0-0-verify.state 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@86 -- # rm -f ./local-job1-1-verify.state 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@87 -- # rm -f ./local-job2-2-verify.state 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@89 -- # trap - SIGINT SIGTERM EXIT 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- target/fio.sh@91 -- # nvmftestfini 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@331 -- # nvmfcleanup 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@99 -- # sync 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@102 -- # set +e 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@103 -- # for i in {1..20} 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:08:40.594 rmmod nvme_rdma 00:08:40.594 rmmod nvme_fabrics 00:08:40.594 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@106 -- # set -e 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@107 -- # return 0 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@332 -- # '[' -n 1706955 ']' 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@333 -- # killprocess 1706955 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@950 -- # '[' -z 1706955 ']' 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@954 -- # kill -0 1706955 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@955 -- # uname 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1706955 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1706955' 00:08:40.854 killing process with pid 1706955 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@969 -- # kill 1706955 00:08:40.854 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@974 -- # wait 1706955 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@338 -- # nvmf_fini 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- 
nvmf/setup.sh@264 -- # local dev 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@267 -- # remove_target_ns 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@22 -- # _remove_target_ns 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@268 -- # delete_main_bridge 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@130 -- # return 0 00:08:41.113 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@41 -- # _dev=0 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@41 -- # dev_map=() 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/setup.sh@284 -- # iptr 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@538 -- # iptables-save 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- nvmf/common.sh@538 -- # iptables-restore 00:08:41.114 00:08:41.114 real 0m27.719s 00:08:41.114 user 1m40.738s 00:08:41.114 sys 0m10.740s 
00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_fio_target -- common/autotest_common.sh@10 -- # set +x 00:08:41.114 ************************************ 00:08:41.114 END TEST nvmf_fio_target 00:08:41.114 ************************************ 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@28 -- # run_test nvmf_bdevio /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevio.sh --transport=rdma 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:08:41.114 ************************************ 00:08:41.114 START TEST nvmf_bdevio 00:08:41.114 ************************************ 00:08:41.114 15:13:42 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevio.sh --transport=rdma 00:08:41.375 * Looking for test storage... 00:08:41.375 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1681 -- # lcov --version 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@336 -- # IFS=.-: 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@336 -- # read -ra ver1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@337 -- # IFS=.-: 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@337 -- # read -ra ver2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@338 -- # local 'op=<' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@340 -- # ver1_l=2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@341 -- # ver2_l=1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@344 -- # case "$op" in 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@345 -- # : 1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@365 -- # decimal 1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@353 -- # local d=1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@355 -- # echo 1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@365 -- # ver1[v]=1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@366 -- # decimal 2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@353 -- # local d=2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@355 -- # echo 2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@366 -- # ver2[v]=2 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@368 -- # return 0 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:41.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:41.375 --rc genhtml_branch_coverage=1 00:08:41.375 --rc genhtml_function_coverage=1 00:08:41.375 --rc genhtml_legend=1 00:08:41.375 --rc geninfo_all_blocks=1 00:08:41.375 --rc geninfo_unexecuted_blocks=1 00:08:41.375 00:08:41.375 ' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:41.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:41.375 --rc genhtml_branch_coverage=1 00:08:41.375 --rc genhtml_function_coverage=1 00:08:41.375 --rc genhtml_legend=1 00:08:41.375 --rc geninfo_all_blocks=1 00:08:41.375 --rc geninfo_unexecuted_blocks=1 00:08:41.375 00:08:41.375 ' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:41.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:41.375 --rc genhtml_branch_coverage=1 00:08:41.375 --rc genhtml_function_coverage=1 00:08:41.375 --rc genhtml_legend=1 00:08:41.375 --rc geninfo_all_blocks=1 00:08:41.375 --rc geninfo_unexecuted_blocks=1 00:08:41.375 00:08:41.375 ' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:41.375 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:41.375 --rc genhtml_branch_coverage=1 00:08:41.375 --rc genhtml_function_coverage=1 00:08:41.375 --rc genhtml_legend=1 00:08:41.375 --rc geninfo_all_blocks=1 00:08:41.375 --rc geninfo_unexecuted_blocks=1 00:08:41.375 00:08:41.375 ' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@7 -- # uname -s 00:08:41.375 15:13:43 
nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@15 -- # shopt -s extglob 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- paths/export.sh@5 -- # export PATH 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:08:41.375 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@50 -- # : 0 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:08:41.376 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@35 -- # '[' -n '' ']' 
00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@54 -- # have_pci_nics=0 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@11 -- # MALLOC_BDEV_SIZE=64 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@14 -- # nvmftestinit 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@292 -- # prepare_net_devs 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@254 -- # local -g is_hw=no 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@256 -- # remove_target_ns 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@22 -- # _remove_target_ns 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@125 -- # xtrace_disable 00:08:41.376 15:13:43 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@131 -- # pci_devs=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@131 -- # local -a pci_devs 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@132 -- # pci_net_devs=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@133 -- # pci_drivers=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@133 -- # local -A pci_drivers 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@135 -- # net_devs=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@135 -- # local -ga net_devs 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@136 -- # e810=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@136 -- # local -ga e810 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@137 -- # x722=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@137 -- # local -ga x722 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@138 -- # mlx=() 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@138 -- # local -ga mlx 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@141 -- # 
e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:08:49.500 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:08:49.501 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:08:49.501 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:49.501 15:13:49 
nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:08:49.501 Found net devices under 0000:18:00.0: mlx_0_0 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:08:49.501 Found net devices under 0000:18:00.1: mlx_0_1 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@249 -- # get_rdma_if_list 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@75 -- # rdma_devs=() 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:08:49.501 15:13:49 
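Each detected PCI function is then mapped to its kernel net device by globbing /sys/bus/pci/devices/<addr>/net/, which is how the trace arrives at mlx_0_0 and mlx_0_1. A minimal stand-alone sketch of that lookup follows; the PCI address is the one from this run, and the helper name is made up for illustration.

    # Hypothetical helper: print the netdev name(s) bound to a PCI function
    netdevs_for_pci() {
        local pci=$1 d
        for d in /sys/bus/pci/devices/"$pci"/net/*; do
            [ -e "$d" ] || continue   # glob did not match: no net device behind this function
            echo "${d##*/}"
        done
    }
    netdevs_for_pci 0000:18:00.0      # -> mlx_0_0 in this run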
nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@89 -- # continue 2 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@89 -- # continue 2 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@258 -- # is_hw=yes 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@61 -- # uname 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@65 -- # modprobe ib_cm 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@66 -- # modprobe ib_core 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@67 -- # modprobe ib_umad 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@69 -- # modprobe iw_cm 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- 
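With hardware NICs confirmed (is_hw=yes), nvmf_rdma_init loads the IB/RDMA kernel modules one by one before any interface configuration happens. Restated as a plain loop below; the module list matches the trace, while the lsmod check at the end is an added sanity step, not something the script runs.

    for mod in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm; do
        sudo modprobe "$mod"
    done
    lsmod | grep -E 'rdma_ucm|ib_uverbs'   # optional: confirm the RDMA stack is present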
nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@27 -- # local -gA dev_map 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@28 -- # local -g _dev 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@44 -- # ips=() 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@58 -- # key_initiator=target1 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@11 -- # local val=167772161 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:08:49.501 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- 
nvmf/setup.sh@210 -- # echo 10.0.0.1 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:08:49.502 10.0.0.1 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@11 -- # local val=167772162 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:08:49.502 15:13:49 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:08:49.502 10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@38 -- # ping_ips 1 00:08:49.502 
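setup_interface_pair hands each device a consecutive address from the 0x0a000001 pool; val_to_ip is what turns 167772161 into 10.0.0.1 and 167772162 into 10.0.0.2 before the ip/ifalias commands seen above are eval'd. A self-contained sketch of that conversion and the per-device setup follows; the iproute2 commands mirror the trace, while the function body is an assumption about how the integer is unpacked.

    # Unpack a 32-bit value from the ip_pool into a dotted quad
    val_to_ip() {
        local val=$1
        printf '%u.%u.%u.%u\n' \
            $(( (val >> 24) & 0xff )) $(( (val >> 16) & 0xff )) \
            $(( (val >> 8)  & 0xff )) $((  val        & 0xff ))
    }

    ip=$(val_to_ip 167772161)                        # -> 10.0.0.1
    sudo ip addr add "$ip/24" dev mlx_0_0            # same command the trace evals
    echo "$ip" | sudo tee /sys/class/net/mlx_0_0/ifalias
    sudo ip link set mlx_0_0 up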
15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@107 -- # local dev=target0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:49.502 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:08:49.502 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.037 ms 00:08:49.502 00:08:49.502 --- 10.0.0.2 ping statistics --- 00:08:49.502 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:49.502 rtt min/avg/max/mdev = 0.037/0.037/0.037/0.000 ms 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@107 -- # local dev=target0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:49.502 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
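Reachability of each initiator/target pair is then checked with a single ICMP echo; the target address is read back from the ifalias file written during setup rather than re-derived. A condensed equivalent is shown below (the -W timeout is an addition for robustness, not part of the traced command).

    target_ip=$(cat /sys/class/net/mlx_0_1/ifalias)   # 10.0.0.2 in this run
    ping -c 1 -W 1 "$target_ip"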
00:08:49.502 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.040 ms 00:08:49.502 00:08:49.502 --- 10.0.0.2 ping statistics --- 00:08:49.502 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:49.502 rtt min/avg/max/mdev = 0.040/0.040/0.040/0.000 ms 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@98 -- # (( pair++ )) 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@266 -- # return 0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@107 -- # local dev=target0 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:08:49.502 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 
00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@107 -- # local dev=target1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@107 -- # local dev=target0 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@179 -- 
# get_ip_address target1 '' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@107 -- # local dev=target1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@16 -- # nvmfappstart -m 0x78 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@324 -- # nvmfpid=1713300 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x78 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@325 -- # waitforlisten 1713300 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@831 -- # '[' -z 1713300 ']' 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@838 -- # echo 'Waiting 
for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:49.503 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:49.503 15:13:50 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.503 [2024-09-27 15:13:50.252904] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:08:49.503 [2024-09-27 15:13:50.252965] [ DPDK EAL parameters: nvmf -c 0x78 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:08:49.503 [2024-09-27 15:13:50.339893] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:49.503 [2024-09-27 15:13:50.430120] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:08:49.503 [2024-09-27 15:13:50.430168] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:08:49.503 [2024-09-27 15:13:50.430179] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:08:49.503 [2024-09-27 15:13:50.430205] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:08:49.503 [2024-09-27 15:13:50.430213] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:08:49.503 [2024-09-27 15:13:50.430334] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:08:49.503 [2024-09-27 15:13:50.430418] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 5 00:08:49.503 [2024-09-27 15:13:50.430502] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:08:49.503 [2024-09-27 15:13:50.430503] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 6 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@864 -- # return 0 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@18 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.503 [2024-09-27 15:13:51.182167] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1acdda0/0x1ad2290) succeed. 00:08:49.503 [2024-09-27 15:13:51.192725] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1acf3e0/0x1b13930) succeed. 
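At this point the legacy environment is fully resolved (NVMF_FIRST_TARGET_IP=10.0.0.2, NVMF_SECOND_TARGET_IP=10.0.0.1, transport options '-t rdma --num-shared-buffers 1024') and the target application is brought up and given an RDMA transport. The same bring-up, condensed into explicit commands below; the rpc.py wrapper and the readiness poll stand in for the harness's rpc_cmd/waitforlisten helpers, so treat the exact invocation as illustrative.

    SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk
    "$SPDK/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x78 &     # same flags as the trace
    nvmfpid=$!
    # wait for the RPC socket to come up instead of using waitforlisten
    until "$SPDK/scripts/rpc.py" rpc_get_methods >/dev/null 2>&1; do sleep 0.2; done
    "$SPDK/scripts/rpc.py" nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192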
00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@19 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.503 Malloc0 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@20 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.503 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@21 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@22 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:49.763 [2024-09-27 15:13:51.367816] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/bdev/bdevio/bdevio --json /dev/fd/62 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@24 -- # gen_nvmf_target_json 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@368 -- # config=() 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@368 -- # local subsystem config 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:08:49.763 { 00:08:49.763 "params": { 00:08:49.763 "name": "Nvme$subsystem", 00:08:49.763 "trtype": "$TEST_TRANSPORT", 00:08:49.763 "traddr": "$NVMF_FIRST_TARGET_IP", 00:08:49.763 "adrfam": "ipv4", 00:08:49.763 "trsvcid": "$NVMF_PORT", 00:08:49.763 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:08:49.763 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:08:49.763 "hdgst": ${hdgst:-false}, 00:08:49.763 "ddgst": ${ddgst:-false} 00:08:49.763 }, 00:08:49.763 "method": "bdev_nvme_attach_controller" 00:08:49.763 } 00:08:49.763 EOF 00:08:49.763 )") 00:08:49.763 15:13:51 
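The bdevio target is then assembled from four RPCs: a 64 MiB / 512 B malloc bdev, a subsystem, a namespace, and an RDMA listener on 10.0.0.2:4420. Written out as direct scripts/rpc.py calls below (rpc_cmd wraps this script elsewhere in the workspace, so the restatement is illustrative rather than a transcript):

    RPC=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
    $RPC bdev_malloc_create 64 512 -b Malloc0
    $RPC nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
    $RPC nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
    $RPC nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420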
nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@390 -- # cat 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@392 -- # jq . 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@393 -- # IFS=, 00:08:49.763 15:13:51 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:08:49.763 "params": { 00:08:49.763 "name": "Nvme1", 00:08:49.763 "trtype": "rdma", 00:08:49.763 "traddr": "10.0.0.2", 00:08:49.763 "adrfam": "ipv4", 00:08:49.763 "trsvcid": "4420", 00:08:49.763 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:08:49.763 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:08:49.763 "hdgst": false, 00:08:49.763 "ddgst": false 00:08:49.763 }, 00:08:49.763 "method": "bdev_nvme_attach_controller" 00:08:49.763 }' 00:08:49.763 [2024-09-27 15:13:51.417926] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:08:49.763 [2024-09-27 15:13:51.417985] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1713504 ] 00:08:49.763 [2024-09-27 15:13:51.504485] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:49.763 [2024-09-27 15:13:51.591951] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:08:49.763 [2024-09-27 15:13:51.592052] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.763 [2024-09-27 15:13:51.592053] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:08:50.022 I/O targets: 00:08:50.022 Nvme1n1: 131072 blocks of 512 bytes (64 MiB) 00:08:50.023 00:08:50.023 00:08:50.023 CUnit - A unit testing framework for C - Version 2.1-3 00:08:50.023 http://cunit.sourceforge.net/ 00:08:50.023 00:08:50.023 00:08:50.023 Suite: bdevio tests on: Nvme1n1 00:08:50.023 Test: blockdev write read block ...passed 00:08:50.023 Test: blockdev write zeroes read block ...passed 00:08:50.023 Test: blockdev write zeroes read no split ...passed 00:08:50.023 Test: blockdev write zeroes read split ...passed 00:08:50.023 Test: blockdev write zeroes read split partial ...passed 00:08:50.023 Test: blockdev reset ...[2024-09-27 15:13:51.807257] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] resetting controller 00:08:50.023 [2024-09-27 15:13:51.830149] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:08:50.023 [2024-09-27 15:13:51.856582] bdev_nvme.c:2183:_bdev_nvme_reset_ctrlr_complete: *NOTICE*: Resetting controller successful. 
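gen_nvmf_target_json resolves the template above into the attach-controller object printed in the trace and hands it to bdevio over a file descriptor. Only that inner object is shown verbatim; the outer layout below follows SPDK's generic JSON-config shape and the file path is arbitrary, so read this as an approximation of what bdevio receives rather than the exact generated file.

    cat > /tmp/bdevio_nvme.json <<'EOF'
    { "subsystems": [ { "subsystem": "bdev", "config": [ {
          "method": "bdev_nvme_attach_controller",
          "params": { "name": "Nvme1", "trtype": "rdma", "traddr": "10.0.0.2",
                      "adrfam": "ipv4", "trsvcid": "4420",
                      "subnqn": "nqn.2016-06.io.spdk:cnode1",
                      "hostnqn": "nqn.2016-06.io.spdk:host1",
                      "hdgst": false, "ddgst": false } } ] } ] }
    EOF
    /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/bdev/bdevio/bdevio --json /tmp/bdevio_nvme.json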
00:08:50.023 passed 00:08:50.023 Test: blockdev write read 8 blocks ...passed 00:08:50.023 Test: blockdev write read size > 128k ...passed 00:08:50.023 Test: blockdev write read invalid size ...passed 00:08:50.023 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:08:50.023 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:08:50.023 Test: blockdev write read max offset ...passed 00:08:50.023 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:08:50.023 Test: blockdev writev readv 8 blocks ...passed 00:08:50.023 Test: blockdev writev readv 30 x 1block ...passed 00:08:50.023 Test: blockdev writev readv block ...passed 00:08:50.023 Test: blockdev writev readv size > 128k ...passed 00:08:50.023 Test: blockdev writev readv size > 128k in two iovs ...passed 00:08:50.023 Test: blockdev comparev and writev ...[2024-09-27 15:13:51.860042] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860071] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:0 cdw0:0 sqhd:0021 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860083] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860093] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - FAILED FUSED (00/09) qid:1 cid:1 cdw0:0 sqhd:0022 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860278] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860290] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:1 cdw0:0 sqhd:0023 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860300] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860310] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - FAILED FUSED (00/09) qid:1 cid:0 cdw0:0 sqhd:0024 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860486] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860498] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:0 cdw0:0 sqhd:0025 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860509] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860518] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - FAILED FUSED (00/09) qid:1 cid:1 cdw0:0 sqhd:0026 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860686] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860697] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:1 cdw0:0 sqhd:0027 p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.860707] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x200 00:08:50.023 [2024-09-27 15:13:51.860717] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - 
FAILED FUSED (00/09) qid:1 cid:0 cdw0:0 sqhd:0028 p:0 m:0 dnr:0 00:08:50.023 passed 00:08:50.023 Test: blockdev nvme passthru rw ...passed 00:08:50.023 Test: blockdev nvme passthru vendor specific ...[2024-09-27 15:13:51.861032] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:08:50.023 [2024-09-27 15:13:51.861044] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:0 cdw0:0 sqhd:002c p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.861089] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:08:50.023 [2024-09-27 15:13:51.861100] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:0 cdw0:0 sqhd:002d p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.861147] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:08:50.023 [2024-09-27 15:13:51.861158] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:0 cdw0:0 sqhd:002e p:0 m:0 dnr:0 00:08:50.023 [2024-09-27 15:13:51.861199] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:08:50.023 [2024-09-27 15:13:51.861209] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:0 cdw0:0 sqhd:002f p:0 m:0 dnr:0 00:08:50.023 passed 00:08:50.023 Test: blockdev nvme admin passthru ...passed 00:08:50.023 Test: blockdev copy ...passed 00:08:50.023 00:08:50.023 Run Summary: Type Total Ran Passed Failed Inactive 00:08:50.023 suites 1 1 n/a 0 0 00:08:50.023 tests 23 23 23 0 0 00:08:50.023 asserts 152 152 152 0 n/a 00:08:50.023 00:08:50.023 Elapsed time = 0.175 seconds 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@26 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@28 -- # trap - SIGINT SIGTERM EXIT 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- target/bdevio.sh@30 -- # nvmftestfini 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@331 -- # nvmfcleanup 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@99 -- # sync 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@102 -- # set +e 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@103 -- # for i in {1..20} 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:08:50.283 rmmod nvme_rdma 00:08:50.283 rmmod nvme_fabrics 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:08:50.283 15:13:52 
nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@106 -- # set -e 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@107 -- # return 0 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@332 -- # '[' -n 1713300 ']' 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@333 -- # killprocess 1713300 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@950 -- # '[' -z 1713300 ']' 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@954 -- # kill -0 1713300 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@955 -- # uname 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:50.283 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1713300 00:08:50.542 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@956 -- # process_name=reactor_3 00:08:50.542 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@960 -- # '[' reactor_3 = sudo ']' 00:08:50.542 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1713300' 00:08:50.542 killing process with pid 1713300 00:08:50.542 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@969 -- # kill 1713300 00:08:50.542 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@974 -- # wait 1713300 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@338 -- # nvmf_fini 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@264 -- # local dev 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@267 -- # remove_target_ns 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@22 -- # _remove_target_ns 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@268 -- # delete_main_bridge 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@130 -- # return 0 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:08:50.801 15:13:52 
nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@41 -- # _dev=0 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@41 -- # dev_map=() 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/setup.sh@284 -- # iptr 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@538 -- # iptables-save 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- nvmf/common.sh@538 -- # iptables-restore 00:08:50.801 00:08:50.801 real 0m9.606s 00:08:50.801 user 0m11.396s 00:08:50.801 sys 0m6.065s 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core.nvmf_bdevio -- common/autotest_common.sh@10 -- # set +x 00:08:50.801 ************************************ 00:08:50.801 END TEST nvmf_bdevio 00:08:50.801 ************************************ 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@30 -- # [[ rdma == \t\c\p ]] 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core -- nvmf/nvmf_target_core.sh@38 -- # trap - SIGINT SIGTERM EXIT 00:08:50.801 00:08:50.801 real 4m7.775s 00:08:50.801 user 10m41.524s 00:08:50.801 sys 1m28.665s 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:50.801 15:13:52 nvmf_rdma.nvmf_target_core -- common/autotest_common.sh@10 -- # set +x 00:08:50.801 ************************************ 00:08:50.801 END TEST nvmf_target_core 00:08:50.801 ************************************ 00:08:50.801 15:13:52 nvmf_rdma -- nvmf/nvmf.sh@11 -- # run_test nvmf_target_extra /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_target_extra.sh --transport=rdma 00:08:50.801 15:13:52 nvmf_rdma -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:50.801 15:13:52 nvmf_rdma -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:50.801 15:13:52 nvmf_rdma -- common/autotest_common.sh@10 -- # set +x 00:08:51.061 ************************************ 00:08:51.061 START TEST nvmf_target_extra 00:08:51.061 ************************************ 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_target_extra.sh --transport=rdma 00:08:51.061 * 
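Teardown mirrors setup in reverse: the exit trap runs nvmftestfini, which unloads nvme-rdma/nvme-fabrics, kills the target process, flushes the addresses that were added to both Mellanox ports, and restores iptables with the SPDK_NVMF rules filtered out. A condensed, illustrative restatement of those traced commands:

    sudo modprobe -v -r nvme-rdma
    sudo modprobe -v -r nvme-fabrics
    kill "$nvmfpid"                                   # nvmfpid was 1713300 in this run
    sudo ip addr flush dev mlx_0_1
    sudo ip addr flush dev mlx_0_0
    sudo iptables-save | grep -v SPDK_NVMF | sudo iptables-restore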
Looking for test storage... 00:08:51.061 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1681 -- # lcov --version 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@336 -- # IFS=.-: 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@336 -- # read -ra ver1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@337 -- # IFS=.-: 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@337 -- # read -ra ver2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@338 -- # local 'op=<' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@340 -- # ver1_l=2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@341 -- # ver2_l=1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@344 -- # case "$op" in 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@345 -- # : 1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@365 -- # decimal 1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@353 -- # local d=1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@355 -- # echo 1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@365 -- # ver1[v]=1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@366 -- # decimal 2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@353 -- # local d=2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@355 -- # echo 2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@366 -- # ver2[v]=2 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@368 -- # return 0 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:51.061 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.061 --rc genhtml_branch_coverage=1 00:08:51.061 --rc genhtml_function_coverage=1 00:08:51.061 --rc genhtml_legend=1 00:08:51.061 --rc geninfo_all_blocks=1 00:08:51.061 --rc geninfo_unexecuted_blocks=1 00:08:51.061 00:08:51.061 ' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:51.061 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.061 --rc genhtml_branch_coverage=1 00:08:51.061 --rc genhtml_function_coverage=1 00:08:51.061 --rc genhtml_legend=1 00:08:51.061 --rc geninfo_all_blocks=1 00:08:51.061 --rc geninfo_unexecuted_blocks=1 00:08:51.061 00:08:51.061 ' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:51.061 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.061 --rc genhtml_branch_coverage=1 00:08:51.061 --rc genhtml_function_coverage=1 00:08:51.061 --rc genhtml_legend=1 00:08:51.061 --rc geninfo_all_blocks=1 00:08:51.061 --rc geninfo_unexecuted_blocks=1 00:08:51.061 00:08:51.061 ' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:51.061 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.061 --rc genhtml_branch_coverage=1 00:08:51.061 --rc genhtml_function_coverage=1 00:08:51.061 --rc genhtml_legend=1 00:08:51.061 --rc geninfo_all_blocks=1 00:08:51.061 --rc geninfo_unexecuted_blocks=1 00:08:51.061 00:08:51.061 ' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@7 -- # uname -s 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@10 -- # 
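The nvmf_target_extra preamble decides which lcov flags to export by comparing the tool's version against 2 with cmp_versions, which splits each version on '.', '-' and ':' and compares the fields numerically. A compact stand-alone equivalent of that comparison is sketched below; the function name and structure are illustrative, not the scripts/common.sh source.

    # Return success if dotted version $1 is strictly less than $2
    version_lt() {
        local IFS=.-: i
        local -a a=($1) b=($2)
        for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1    # versions are equal
    }
    version_lt 1.15 2 && echo "lcov older than 2: use the legacy --rc options"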
NVMF_SECOND_PORT=4421 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@15 -- # shopt -s extglob 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.061 15:13:52 nvmf_rdma.nvmf_target_extra -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- paths/export.sh@5 -- # export PATH 00:08:51.062 
15:13:52 nvmf_rdma.nvmf_target_extra -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@50 -- # : 0 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:08:51.062 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/common.sh@54 -- # have_pci_nics=0 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@11 -- # trap 'exit 1' SIGINT SIGTERM EXIT 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@13 -- # TEST_ARGS=("$@") 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@15 -- # [[ 0 -eq 0 ]] 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@16 -- # run_test nvmf_example /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_example.sh --transport=rdma 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:51.062 15:13:52 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:08:51.320 ************************************ 00:08:51.320 START TEST nvmf_example 00:08:51.320 ************************************ 00:08:51.320 15:13:52 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvmf_example.sh --transport=rdma 00:08:51.320 * Looking for test storage... 
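The "START TEST nvmf_example" banner above is printed by the run_test helper in autotest_common.sh before it hands control to the per-test script. The snippet below is only a rough sketch of what such a wrapper does, reconstructed from the banner and exit-status behavior visible in this log; run_test_sketch is a hypothetical name, not the actual SPDK implementation.

run_test_sketch() {
  # Print the banner seen in the log, run the wrapped test command, and report its status.
  local name=$1; shift
  echo "************************************"
  echo "START TEST $name"
  echo "************************************"
  "$@"
  local rc=$?
  echo "END TEST $name (exit code $rc)"
  return $rc
}
# Invocation mirroring the trace above:
run_test_sketch nvmf_example ./test/nvmf/target/nvmf_example.sh --transport=rdma
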
00:08:51.320 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1681 -- # lcov --version 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@336 -- # IFS=.-: 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@336 -- # read -ra ver1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@337 -- # IFS=.-: 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@337 -- # read -ra ver2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@338 -- # local 'op=<' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@340 -- # ver1_l=2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@341 -- # ver2_l=1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@344 -- # case "$op" in 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@345 -- # : 1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@365 -- # decimal 1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@353 -- # local d=1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@355 -- # echo 1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@365 -- # ver1[v]=1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@366 -- # decimal 2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@353 -- # local d=2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@355 -- # echo 2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@366 -- # ver2[v]=2 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@368 -- # return 0 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:51.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.320 --rc genhtml_branch_coverage=1 00:08:51.320 --rc genhtml_function_coverage=1 00:08:51.320 --rc genhtml_legend=1 00:08:51.320 --rc geninfo_all_blocks=1 00:08:51.320 --rc geninfo_unexecuted_blocks=1 00:08:51.320 00:08:51.320 ' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:51.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.320 --rc genhtml_branch_coverage=1 00:08:51.320 --rc genhtml_function_coverage=1 00:08:51.320 --rc genhtml_legend=1 00:08:51.320 --rc geninfo_all_blocks=1 00:08:51.320 --rc geninfo_unexecuted_blocks=1 00:08:51.320 00:08:51.320 ' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:51.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.320 --rc genhtml_branch_coverage=1 00:08:51.320 --rc genhtml_function_coverage=1 00:08:51.320 --rc genhtml_legend=1 00:08:51.320 --rc geninfo_all_blocks=1 00:08:51.320 --rc geninfo_unexecuted_blocks=1 00:08:51.320 00:08:51.320 ' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:51.320 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:51.320 --rc genhtml_branch_coverage=1 00:08:51.320 --rc genhtml_function_coverage=1 00:08:51.320 --rc genhtml_legend=1 00:08:51.320 --rc geninfo_all_blocks=1 00:08:51.320 --rc geninfo_unexecuted_blocks=1 00:08:51.320 00:08:51.320 ' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@7 -- # uname -s 
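The lt 1.15 2 / cmp_versions trace above is a dotted-version comparison: it decides whether the detected lcov is older than 2.x before the legacy branch/function coverage flags are exported. A standalone sketch of the same idea is shown below; ver_lt is a hypothetical helper written for illustration, not the scripts/common.sh implementation.

# Return 0 (true) when dotted version $1 is strictly less than $2, comparing field by field.
ver_lt() {
  local -a a b
  IFS=. read -ra a <<< "$1"
  IFS=. read -ra b <<< "$2"
  local i max=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
  for (( i = 0; i < max; i++ )); do
    (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # earlier field decides: smaller wins
    (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
  done
  return 1                                      # equal versions are not "less than"
}
ver_lt 1.15 2 && echo "lcov 1.15 is older than 2.x: use legacy --rc lcov_* options"
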
00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@15 -- # shopt -s extglob 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- paths/export.sh@5 -- # export PATH 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:08:51.320 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@50 -- # : 0 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:08:51.579 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- 
nvmf/common.sh@35 -- # '[' -n '' ']' 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@54 -- # have_pci_nics=0 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@11 -- # NVMF_EXAMPLE=("$SPDK_EXAMPLE_DIR/nvmf") 00:08:51.579 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@13 -- # MALLOC_BDEV_SIZE=64 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@14 -- # MALLOC_BLOCK_SIZE=512 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@24 -- # build_nvmf_example_args 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@17 -- # '[' 0 -eq 1 ']' 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@20 -- # NVMF_EXAMPLE+=(-i "$NVMF_APP_SHM_ID" -g 10000) 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@21 -- # NVMF_EXAMPLE+=("${NO_HUGE[@]}") 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@40 -- # timing_enter nvmf_example_test 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@41 -- # nvmftestinit 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@292 -- # prepare_net_devs 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@254 -- # local -g is_hw=no 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@256 -- # remove_target_ns 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@22 -- # _remove_target_ns 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@125 -- # xtrace_disable 00:08:51.580 15:13:53 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@131 -- # pci_devs=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@131 -- # local -a pci_devs 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@132 -- # pci_net_devs=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:08:58.152 
15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@133 -- # pci_drivers=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@133 -- # local -A pci_drivers 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@135 -- # net_devs=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@135 -- # local -ga net_devs 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@136 -- # e810=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@136 -- # local -ga e810 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@137 -- # x722=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@137 -- # local -ga x722 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@138 -- # mlx=() 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@138 -- # local -ga mlx 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:08:58.152 Found 0000:18:00.0 
(0x15b3 - 0x1015) 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:58.152 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:08:58.153 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:08:58.153 Found net devices under 0000:18:00.0: mlx_0_0 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@240 -- # 
echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:08:58.153 Found net devices under 0000:18:00.1: mlx_0_1 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@249 -- # get_rdma_if_list 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@75 -- # rdma_devs=() 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@89 -- # continue 2 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@89 -- # continue 2 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@258 -- # is_hw=yes 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@244 -- # local 
total_initiator_target_pairs=1 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@61 -- # uname 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@65 -- # modprobe ib_cm 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@66 -- # modprobe ib_core 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@67 -- # modprobe ib_umad 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@69 -- # modprobe iw_cm 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@27 -- # local -gA dev_map 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@28 -- # local -g _dev 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@44 -- # ips=() 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@58 -- # key_initiator=target1 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:08:58.153 15:13:59 
nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@11 -- # local val=167772161 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:08:58.153 15:13:59 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:08:58.415 10.0.0.1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@11 -- # local val=167772162 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:08:58.415 10.0.0.2 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:08:58.415 15:14:00 
nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@38 -- # ping_ips 1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@107 -- # local dev=target0 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:58.415 15:14:00 
nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:58.415 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:58.415 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:08:58.415 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:08:58.415 00:08:58.415 --- 10.0.0.2 ping statistics --- 00:08:58.415 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:58.416 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@107 -- # local dev=target0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:08:58.416 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:08:58.416 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.027 ms 00:08:58.416 00:08:58.416 --- 10.0.0.2 ping statistics --- 00:08:58.416 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:08:58.416 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@98 -- # (( pair++ )) 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@266 -- # return 0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@107 -- # local dev=target0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:08:58.416 15:14:00 
nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@107 -- # local dev=target1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # get_net_dev target0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@107 -- # local dev=target0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:08:58.416 15:14:00 
nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # get_net_dev target1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@107 -- # local dev=target1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@42 -- # nvmfexamplestart '-m 0xF' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@27 -- # timing_enter start_nvmf_example 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@29 -- # '[' rdma == tcp ']' 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@34 -- # nvmfpid=1716677 00:08:58.416 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/nvmf -i 0 -g 10000 -m 0xF 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@35 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:08:58.417 15:14:00 
nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@36 -- # waitforlisten 1716677 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@831 -- # '[' -z 1716677 ']' 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:58.417 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:58.417 15:14:00 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@864 -- # return 0 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@37 -- # timing_exit start_nvmf_example 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@730 -- # xtrace_disable 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@45 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:59.354 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@47 -- # rpc_cmd bdev_malloc_create 64 512 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@47 -- # malloc_bdevs='Malloc0 ' 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@49 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@52 -- # for malloc_bdev in $malloc_bdevs 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@53 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@57 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@59 -- # perf=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf 00:08:59.613 15:14:01 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@61 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 64 -o 4096 -w randrw -M 30 -t 10 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:cnode1' 00:09:11.827 Initializing NVMe Controllers 00:09:11.827 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:09:11.827 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0 00:09:11.827 Initialization complete. Launching workers. 00:09:11.827 ======================================================== 00:09:11.827 Latency(us) 00:09:11.827 Device Information : IOPS MiB/s Average min max 00:09:11.827 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 23771.48 92.86 2693.93 639.47 14999.95 00:09:11.827 ======================================================== 00:09:11.827 Total : 23771.48 92.86 2693.93 639.47 14999.95 00:09:11.827 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@65 -- # trap - SIGINT SIGTERM EXIT 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@66 -- # nvmftestfini 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@331 -- # nvmfcleanup 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@99 -- # sync 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@102 -- # set +e 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@103 -- # for i in {1..20} 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:09:11.827 rmmod nvme_rdma 00:09:11.827 rmmod nvme_fabrics 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@106 -- # set -e 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@107 -- # return 0 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@332 -- # '[' -n 1716677 ']' 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@333 -- # killprocess 1716677 
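Condensed for reference, the target-setup flow the trace above exercises amounts to roughly the following. This is a sketch, not the harness script itself: it reuses the exact binaries, RPC names and arguments that appear in the trace, and assumes the helpers rpc_cmd and waitforlisten from the test common code behave as their names suggest (rpc_cmd forwards to the app's RPC socket at /var/tmp/spdk.sock); pid capture, traps and teardown are elided.

  # Start the example NVMe-oF target with the same flags as the run above and wait for its RPC socket.
  /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/nvmf -i 0 -g 10000 -m 0xF &
  nvmfpid=$!
  waitforlisten "$nvmfpid"

  # Create the RDMA transport, a 64 MB malloc bdev with 512-byte blocks, and a subsystem exposing it.
  rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  rpc_cmd bdev_malloc_create 64 512
  rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
  rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
  rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420

  # Drive it with the perf tool over RDMA, matching the invocation whose results are reported above.
  /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 64 -o 4096 -w randrw -M 30 -t 10 \
      -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:cnode1'

The IOPS/latency table printed by spdk_nvme_perf earlier in the log is the output of that last command; the killprocess/teardown entries that follow correspond to the elided cleanup step.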
00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@950 -- # '[' -z 1716677 ']' 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@954 -- # kill -0 1716677 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@955 -- # uname 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1716677 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@956 -- # process_name=nvmf 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@960 -- # '[' nvmf = sudo ']' 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1716677' 00:09:11.827 killing process with pid 1716677 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@969 -- # kill 1716677 00:09:11.827 15:14:12 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@974 -- # wait 1716677 00:09:11.827 nvmf threads initialize successfully 00:09:11.827 bdev subsystem init successfully 00:09:11.827 created a nvmf target service 00:09:11.827 create targets's poll groups done 00:09:11.827 all subsystems of target started 00:09:11.827 nvmf target is running 00:09:11.827 all subsystems of target stopped 00:09:11.827 destroy targets's poll groups done 00:09:11.827 destroyed the nvmf target service 00:09:11.827 bdev subsystem finish successfully 00:09:11.827 nvmf threads destroy successfully 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@338 -- # nvmf_fini 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@264 -- # local dev 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@267 -- # remove_target_ns 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@268 -- # delete_main_bridge 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@130 -- # return 0 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@222 -- # [[ 
-n '' ]] 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@41 -- # _dev=0 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@41 -- # dev_map=() 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/setup.sh@284 -- # iptr 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@538 -- # iptables-save 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- nvmf/common.sh@538 -- # iptables-restore 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- target/nvmf_example.sh@67 -- # timing_exit nvmf_example_test 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:09:11.827 00:09:11.827 real 0m20.193s 00:09:11.827 user 0m52.603s 00:09:11.827 sys 0m5.925s 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_example -- common/autotest_common.sh@10 -- # set +x 00:09:11.827 ************************************ 00:09:11.827 END TEST nvmf_example 00:09:11.827 ************************************ 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@17 -- # run_test nvmf_filesystem /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/filesystem.sh --transport=rdma 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:09:11.827 ************************************ 00:09:11.827 START TEST nvmf_filesystem 00:09:11.827 ************************************ 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/filesystem.sh --transport=rdma 00:09:11.827 * Looking for 
test storage... 00:09:11.827 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:11.827 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1681 -- # lcov --version 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@336 -- # IFS=.-: 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@336 -- # read -ra ver1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@337 -- # IFS=.-: 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@337 -- # read -ra ver2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@338 -- # local 'op=<' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@340 -- # ver1_l=2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@341 -- # ver2_l=1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@344 -- # case "$op" in 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@345 -- # : 1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@365 -- # decimal 1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@353 -- # local d=1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@355 -- # echo 1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@365 -- # ver1[v]=1 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@366 -- # decimal 2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@353 -- # local d=2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@355 -- # echo 2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@366 -- # ver2[v]=2 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@368 -- # return 0 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:09:11.828 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:11.828 --rc genhtml_branch_coverage=1 00:09:11.828 --rc genhtml_function_coverage=1 00:09:11.828 --rc genhtml_legend=1 00:09:11.828 --rc geninfo_all_blocks=1 00:09:11.828 --rc geninfo_unexecuted_blocks=1 00:09:11.828 00:09:11.828 ' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:09:11.828 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:11.828 --rc genhtml_branch_coverage=1 00:09:11.828 --rc genhtml_function_coverage=1 00:09:11.828 --rc genhtml_legend=1 00:09:11.828 --rc geninfo_all_blocks=1 00:09:11.828 --rc geninfo_unexecuted_blocks=1 00:09:11.828 00:09:11.828 ' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:09:11.828 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:11.828 --rc genhtml_branch_coverage=1 00:09:11.828 --rc genhtml_function_coverage=1 00:09:11.828 --rc genhtml_legend=1 00:09:11.828 --rc geninfo_all_blocks=1 00:09:11.828 --rc geninfo_unexecuted_blocks=1 00:09:11.828 00:09:11.828 ' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:09:11.828 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:11.828 --rc genhtml_branch_coverage=1 00:09:11.828 --rc genhtml_function_coverage=1 00:09:11.828 --rc genhtml_legend=1 00:09:11.828 --rc geninfo_all_blocks=1 00:09:11.828 --rc geninfo_unexecuted_blocks=1 00:09:11.828 00:09:11.828 ' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh 00:09:11.828 15:14:13 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@34 -- # set -e 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@36 -- # shopt -s extglob 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output ']' 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/build_config.sh ]] 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/build_config.sh 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:09:11.828 
15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@22 -- # CONFIG_CET=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@36 -- # CONFIG_FUZZER=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:09:11.828 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem 
-- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR= 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@71 -- # CONFIG_SHARED=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@75 -- # CONFIG_FC=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:09:11.829 15:14:13 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/build_config.sh@89 -- # CONFIG_URING=n 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/applications.sh 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/applications.sh 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/nvmf-phy-autotest/spdk 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/nvmf-phy-autotest/spdk/include/spdk/config.h ]] 00:09:11.829 15:14:13 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:09:11.829 #define SPDK_CONFIG_H 00:09:11.829 #define SPDK_CONFIG_AIO_FSDEV 1 00:09:11.829 #define SPDK_CONFIG_APPS 1 00:09:11.829 #define SPDK_CONFIG_ARCH native 00:09:11.829 #undef SPDK_CONFIG_ASAN 00:09:11.829 #undef SPDK_CONFIG_AVAHI 00:09:11.829 #undef SPDK_CONFIG_CET 00:09:11.829 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:09:11.829 #define SPDK_CONFIG_COVERAGE 1 00:09:11.829 #define SPDK_CONFIG_CROSS_PREFIX 00:09:11.829 #undef SPDK_CONFIG_CRYPTO 00:09:11.829 #undef SPDK_CONFIG_CRYPTO_MLX5 00:09:11.829 #undef SPDK_CONFIG_CUSTOMOCF 00:09:11.829 #undef SPDK_CONFIG_DAOS 00:09:11.829 #define SPDK_CONFIG_DAOS_DIR 00:09:11.829 #define SPDK_CONFIG_DEBUG 1 00:09:11.829 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:09:11.829 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build 00:09:11.829 #define SPDK_CONFIG_DPDK_INC_DIR 00:09:11.829 #define SPDK_CONFIG_DPDK_LIB_DIR 00:09:11.829 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:09:11.829 #undef SPDK_CONFIG_DPDK_UADK 00:09:11.829 #define SPDK_CONFIG_ENV /var/jenkins/workspace/nvmf-phy-autotest/spdk/lib/env_dpdk 00:09:11.829 #define SPDK_CONFIG_EXAMPLES 1 00:09:11.829 #undef SPDK_CONFIG_FC 00:09:11.829 #define SPDK_CONFIG_FC_PATH 00:09:11.829 #define SPDK_CONFIG_FIO_PLUGIN 1 00:09:11.829 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:09:11.829 #define SPDK_CONFIG_FSDEV 1 00:09:11.829 #undef SPDK_CONFIG_FUSE 00:09:11.829 #undef SPDK_CONFIG_FUZZER 00:09:11.829 #define SPDK_CONFIG_FUZZER_LIB 00:09:11.829 #undef SPDK_CONFIG_GOLANG 00:09:11.829 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:09:11.829 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:09:11.829 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:09:11.829 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:09:11.829 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:09:11.829 #undef SPDK_CONFIG_HAVE_LIBBSD 00:09:11.829 #undef SPDK_CONFIG_HAVE_LZ4 00:09:11.829 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:09:11.829 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:09:11.829 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:09:11.829 #define SPDK_CONFIG_IDXD 1 00:09:11.829 #define SPDK_CONFIG_IDXD_KERNEL 1 00:09:11.829 #undef SPDK_CONFIG_IPSEC_MB 00:09:11.829 #define SPDK_CONFIG_IPSEC_MB_DIR 00:09:11.829 #define SPDK_CONFIG_ISAL 1 00:09:11.829 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:09:11.829 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:09:11.829 #define SPDK_CONFIG_LIBDIR 00:09:11.829 #undef SPDK_CONFIG_LTO 00:09:11.829 #define SPDK_CONFIG_MAX_LCORES 128 00:09:11.829 #define SPDK_CONFIG_NVME_CUSE 1 00:09:11.829 #undef SPDK_CONFIG_OCF 00:09:11.829 #define SPDK_CONFIG_OCF_PATH 00:09:11.829 #define SPDK_CONFIG_OPENSSL_PATH 00:09:11.829 #undef SPDK_CONFIG_PGO_CAPTURE 00:09:11.829 #define SPDK_CONFIG_PGO_DIR 00:09:11.829 #undef SPDK_CONFIG_PGO_USE 00:09:11.829 #define SPDK_CONFIG_PREFIX /usr/local 00:09:11.829 #undef SPDK_CONFIG_RAID5F 00:09:11.829 #undef SPDK_CONFIG_RBD 00:09:11.829 #define SPDK_CONFIG_RDMA 1 00:09:11.829 #define SPDK_CONFIG_RDMA_PROV verbs 00:09:11.829 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:09:11.829 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:09:11.829 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:09:11.829 #define SPDK_CONFIG_SHARED 1 00:09:11.829 #undef SPDK_CONFIG_SMA 00:09:11.829 #define SPDK_CONFIG_TESTS 1 00:09:11.829 #undef SPDK_CONFIG_TSAN 00:09:11.829 #define SPDK_CONFIG_UBLK 1 00:09:11.829 #define SPDK_CONFIG_UBSAN 1 00:09:11.829 #undef SPDK_CONFIG_UNIT_TESTS 00:09:11.829 #undef SPDK_CONFIG_URING 
00:09:11.829 #define SPDK_CONFIG_URING_PATH 00:09:11.829 #undef SPDK_CONFIG_URING_ZNS 00:09:11.829 #undef SPDK_CONFIG_USDT 00:09:11.829 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:09:11.829 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:09:11.829 #undef SPDK_CONFIG_VFIO_USER 00:09:11.829 #define SPDK_CONFIG_VFIO_USER_DIR 00:09:11.829 #define SPDK_CONFIG_VHOST 1 00:09:11.829 #define SPDK_CONFIG_VIRTIO 1 00:09:11.829 #undef SPDK_CONFIG_VTUNE 00:09:11.829 #define SPDK_CONFIG_VTUNE_DIR 00:09:11.829 #define SPDK_CONFIG_WERROR 1 00:09:11.829 #define SPDK_CONFIG_WPDK_DIR 00:09:11.829 #undef SPDK_CONFIG_XNVME 00:09:11.829 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@15 -- # shopt -s extglob 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:11.829 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@5 -- # export PATH 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/common 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@6 -- # dirname /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/common 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@6 -- # readlink -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@7 -- # readlink -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/perf/pm/../../../ 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/nvmf-phy-autotest/spdk 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@64 -- # TEST_TAG=N/A 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/nvmf-phy-autotest/spdk/.run_test_name 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@68 -- # uname -s 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@68 -- # PM_OS=Linux 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:09:11.830 15:14:13 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@76 -- # SUDO[0]= 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@76 -- # SUDO[1]='sudo -E' 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@81 -- # [[ Linux == Linux ]] 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/power ]] 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@58 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@62 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@64 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@66 -- # : 1 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@68 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@70 -- # : 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@72 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@74 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@76 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@77 -- # 
export SPDK_TEST_ISCSI 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@78 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@80 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@82 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@84 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@86 -- # : 1 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@88 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@90 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@92 -- # : 1 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@94 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@96 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@98 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@100 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@102 -- # : rdma 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@104 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@106 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@107 -- # 
export SPDK_TEST_VHOST 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@108 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@110 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@112 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@114 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@116 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@118 -- # : 0 00:09:11.830 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@120 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@122 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@124 -- # : 1 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@126 -- # : 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@128 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@130 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@132 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@134 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@136 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@137 -- # export 
SPDK_TEST_VMD 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@138 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@140 -- # : 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@142 -- # : true 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@144 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@146 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@148 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@150 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@152 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@154 -- # : mlx5 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@156 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@158 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@160 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@162 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@164 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@166 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@167 -- # export 
SPDK_TEST_ACCEL_IAA 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@169 -- # : 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@171 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@173 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@175 -- # : 0 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@182 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@189 -- # PYTHONPATH=:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python:/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/nvmf-phy-autotest/spdk/python 00:09:11.831 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@198 -- # 
UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@204 -- # cat 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@257 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@263 -- # 
UNBIND_ENTIRE_IOMMU_GROUP=yes 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@267 -- # _LCOV= 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@273 -- # lcov_opt= 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@277 -- # export valgrind= 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@277 -- # valgrind= 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@283 -- # uname -s 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@287 -- # MAKE=make 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j72 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@307 -- # TEST_MODE= 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@308 -- # for i in "$@" 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@309 -- # case "$i" in 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@314 -- # TEST_TRANSPORT=rdma 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@329 -- # [[ -z 1718562 ]] 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@329 -- # kill -0 1718562 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- 
common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@342 -- # local mount target_dir 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.IeGfGG 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target /tmp/spdk.IeGfGG/tests/target /tmp/spdk.IeGfGG 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@338 -- # df -T 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- 
common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=51767021568 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=61734457344 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@374 -- # uses["$mount"]=9967435776 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=30852431872 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=30867226624 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@374 -- # uses["$mount"]=14794752 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:09:11.832 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=12324052992 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346892288 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@374 -- # uses["$mount"]=22839296 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=30866735104 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=30867230720 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@374 -- # uses["$mount"]=495616 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:09:11.833 15:14:13 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # avails["$mount"]=6173429760 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173442048 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:09:11.833 * Looking for test storage... 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@379 -- # local target_space new_size 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@383 -- # mount=/ 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@385 -- # target_space=51767021568 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@392 -- # new_size=12182028288 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:11.833 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@400 -- # return 0 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1668 -- # set -o errtrace 00:09:11.833 15:14:13 
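[editor's note] The storage search traced above reduces to integer arithmetic over the 'df -T' columns: pick a candidate mount, require its available space to cover the requested size, and refuse it if adding the requested size to what is already used would push the filesystem past 95% full. The sketch below is a reconstruction from the values visible in this log (requested_size=2214592512, avails[/]=51767021568, sizes[/]=61734457344, uses[/]=9967435776), not a verbatim copy of autotest_common.sh; variable names are illustrative.

    requested_size=2214592512   # set_test_storage 2147483648 plus 64 MiB of slack, per the trace
    avail=51767021568           # avails["/"] from 'df -T'
    size=61734457344            # sizes["/"]
    used=9967435776             # uses["/"]
    target_space=$avail
    if (( target_space == 0 || target_space < requested_size )); then
      echo "skip /: not enough free space" >&2
    elif (( (used + requested_size) * 100 / size > 95 )); then
      echo "skip /: would exceed 95% utilisation" >&2
    else
      # prints new_size=12182028288, matching the value in the trace above
      echo "using / for test storage (new_size=$(( used + requested_size )))"
    fi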
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1673 -- # true 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1675 -- # xtrace_fd 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@25 -- # [[ -n 15 ]] 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/15 ]] 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@27 -- # exec 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@29 -- # exec 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@31 -- # xtrace_restore 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@18 -- # set -x 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1681 -- # lcov --version 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@336 -- # IFS=.-: 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@336 -- # read -ra ver1 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@337 -- # IFS=.-: 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@337 -- # read -ra ver2 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@338 -- # local 'op=<' 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@340 -- # ver1_l=2 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@341 -- # ver2_l=1 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@344 -- # case "$op" in 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@345 -- # : 1 00:09:11.833 15:14:13 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:11.833 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@365 -- # decimal 1 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@353 -- # local d=1 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@355 -- # echo 1 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@365 -- # ver1[v]=1 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@366 -- # decimal 2 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@353 -- # local d=2 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@355 -- # echo 2 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@366 -- # ver2[v]=2 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@368 -- # return 0 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:12.093 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:09:12.093 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:12.093 --rc genhtml_branch_coverage=1 00:09:12.093 --rc genhtml_function_coverage=1 00:09:12.093 --rc genhtml_legend=1 00:09:12.093 --rc geninfo_all_blocks=1 00:09:12.093 --rc geninfo_unexecuted_blocks=1 00:09:12.093 00:09:12.093 ' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:09:12.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:12.094 --rc genhtml_branch_coverage=1 00:09:12.094 --rc genhtml_function_coverage=1 00:09:12.094 --rc genhtml_legend=1 00:09:12.094 --rc geninfo_all_blocks=1 00:09:12.094 --rc geninfo_unexecuted_blocks=1 00:09:12.094 00:09:12.094 ' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:09:12.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:12.094 --rc genhtml_branch_coverage=1 00:09:12.094 --rc genhtml_function_coverage=1 00:09:12.094 --rc genhtml_legend=1 00:09:12.094 --rc geninfo_all_blocks=1 00:09:12.094 --rc geninfo_unexecuted_blocks=1 00:09:12.094 00:09:12.094 ' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:09:12.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:12.094 --rc genhtml_branch_coverage=1 00:09:12.094 --rc genhtml_function_coverage=1 00:09:12.094 --rc genhtml_legend=1 00:09:12.094 --rc geninfo_all_blocks=1 00:09:12.094 --rc geninfo_unexecuted_blocks=1 00:09:12.094 00:09:12.094 ' 00:09:12.094 
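[editor's note] The 'lt 1.15 2' trace above is the shell version comparison that decides whether the branch/function coverage flags get appended to LCOV_OPTS: both version strings are split on '.', '-' and ':' and compared component by component. A minimal sketch of the same idea, with an illustrative function name (the real scripts/common.sh helpers are cmp_versions/decimal and additionally validate each component with a digits-only regex):

    version_lt() {                       # returns 0 (true) if $1 < $2
      local IFS='.-:' i x y
      local -a a b
      read -ra a <<< "$1"
      read -ra b <<< "$2"
      for (( i = 0; i < ${#a[@]} || i < ${#b[@]}; i++ )); do
        x=${a[i]:-0}; y=${b[i]:-0}       # missing components count as 0
        (( x < y )) && return 0
        (( x > y )) && return 1
      done
      return 1                           # equal versions are not "less than"
    }
    version_lt 1.15 2 && echo "lcov predates 2.x"   # mirrors 'lt 1.15 2' -> return 0 above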
15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@10 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@7 -- # uname -s 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@15 -- # shopt -s extglob 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@5 -- # export PATH 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@50 -- # : 0 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- 
nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:09:12.094 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@54 -- # have_pci_nics=0 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@12 -- # MALLOC_BDEV_SIZE=512 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@13 -- # MALLOC_BLOCK_SIZE=512 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@15 -- # nvmftestinit 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@292 -- # prepare_net_devs 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@254 -- # local -g is_hw=no 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@256 -- # remove_target_ns 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@125 -- # xtrace_disable 00:09:12.094 15:14:13 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@10 -- # set +x 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@131 -- # pci_devs=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@131 -- # local -a pci_devs 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@132 -- # pci_net_devs=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@133 -- # pci_drivers=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@133 -- # local -A pci_drivers 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@135 -- # net_devs=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@135 -- # local -ga net_devs 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@136 -- # e810=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@136 -- # local -ga e810 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- 
nvmf/common.sh@137 -- # x722=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@137 -- # local -ga x722 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@138 -- # mlx=() 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@138 -- # local -ga mlx 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:09:18.670 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 
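[editor's note] At this point the scan has narrowed the candidate NICs to the two Mellanox (vendor 0x15b3, device 0x1015) ports because SPDK_TEST_NVMF_NICS=mlx5, and, since the transport is rdma, the connect command is switched to 'nvme connect -i 15'. The mapping from PCI address to kernel interface name, which the next lines report as 'Found net devices under 0000:18:00.x: mlx_0_x', is a sysfs lookup. Hedged sketch: the real common.sh drives this from a pre-built pci_bus_cache map; the loop below simply reads sysfs for the two addresses seen in this log and assumes those devices are present.

    mellanox=0x15b3
    for pci in 0000:18:00.0 0000:18:00.1; do
      vendor=$(cat /sys/bus/pci/devices/$pci/vendor)
      device=$(cat /sys/bus/pci/devices/$pci/device)
      echo "Found $pci ($vendor - $device)"
      # resolve the kernel net device name(s) behind this PCI function
      pci_net_devs=( /sys/bus/pci/devices/$pci/net/* )
      echo "Found net devices under $pci: ${pci_net_devs[*]##*/}"
    done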
00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:09:18.670 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:09:18.670 Found net devices under 0000:18:00.0: mlx_0_0 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:09:18.670 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:09:18.671 Found net devices under 0000:18:00.1: mlx_0_1 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- 
nvmf/common.sh@249 -- # get_rdma_if_list 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@75 -- # rdma_devs=() 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@89 -- # continue 2 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@89 -- # continue 2 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@258 -- # is_hw=yes 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@61 -- # uname 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@65 
-- # modprobe ib_cm 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@66 -- # modprobe ib_core 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@67 -- # modprobe ib_umad 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@69 -- # modprobe iw_cm 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@27 -- # local -gA dev_map 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@28 -- # local -g _dev 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@44 -- # ips=() 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@58 -- # key_initiator=target1 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:09:18.671 15:14:20 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@11 -- # local val=167772161 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:09:18.671 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:09:18.931 10.0.0.1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@11 -- # local val=167772162 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:09:18.931 10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@215 -- # [[ 
-n '' ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@38 -- # ping_ips 1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@107 -- # local dev=target0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:09:18.931 
15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:09:18.931 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:09:18.931 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:09:18.931 00:09:18.931 --- 10.0.0.2 ping statistics --- 00:09:18.931 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:09:18.931 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@107 -- # local dev=target0 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:18.931 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:09:18.932 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:09:18.932 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.025 ms 00:09:18.932 00:09:18.932 --- 10.0.0.2 ping statistics --- 00:09:18.932 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:09:18.932 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@98 -- # (( pair++ )) 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@266 -- # return 0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@107 -- # local dev=target0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@179 -- 
# get_ip_address target1 '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # get_net_dev target1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@107 -- # local dev=target1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@107 -- # local dev=target0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:09:18.932 
15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # get_net_dev target1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@107 -- # local dev=target1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@105 -- # run_test nvmf_filesystem_no_in_capsule nvmf_filesystem_part 0 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:18.932 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@10 -- # set +x 00:09:19.191 ************************************ 00:09:19.191 START TEST nvmf_filesystem_no_in_capsule 00:09:19.191 ************************************ 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1125 -- # nvmf_filesystem_part 0 00:09:19.191 15:14:20 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@47 -- # in_capsule=0 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@49 -- # nvmfappstart -m 0xF 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- nvmf/common.sh@324 -- # nvmfpid=1721474 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- nvmf/common.sh@325 -- # waitforlisten 1721474 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@831 -- # '[' -z 1721474 ']' 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:19.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:19.191 15:14:20 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:19.191 [2024-09-27 15:14:20.837884] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:09:19.191 [2024-09-27 15:14:20.837940] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:19.191 [2024-09-27 15:14:20.925676] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:19.191 [2024-09-27 15:14:21.014328] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:09:19.191 [2024-09-27 15:14:21.014383] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:09:19.191 [2024-09-27 15:14:21.014394] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:09:19.191 [2024-09-27 15:14:21.014403] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:09:19.191 [2024-09-27 15:14:21.014410] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
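Two ways to pull the tracepoint data the notice above mentions, sketched as plain commands; the spdk_trace binary path mirrors the build/bin location nvmf_tgt is launched from in this log, which is an assumption, while the flags and the shm file name are taken verbatim from the notice:

    # Live snapshot from the running target (app name nvmf, shm instance 0,
    # matching the '-i 0' the target was started with)
    /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_trace -s nvmf -i 0 > /tmp/nvmf_trace.txt

    # Or keep the raw shared-memory trace file for offline analysis/debug
    cp /dev/shm/nvmf_trace.0 /tmp/
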
00:09:19.191 [2024-09-27 15:14:21.014496] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:19.191 [2024-09-27 15:14:21.014597] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:19.191 [2024-09-27 15:14:21.014718] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.191 [2024-09-27 15:14:21.014719] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@864 -- # return 0 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@50 -- # malloc_name=Malloc1 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@52 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -c 0 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.128 [2024-09-27 15:14:21.749929] rdma.c:2734:nvmf_rdma_create: *WARNING*: In capsule data size is set to 256, this is minimum size required to support msdbd=16 00:09:20.128 [2024-09-27 15:14:21.771181] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x21ee4a0/0x21f2990) succeed. 00:09:20.128 [2024-09-27 15:14:21.781721] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x21efae0/0x2234030) succeed. 
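The rpc_cmd calls around this point build the target configuration one RPC at a time. Condensed here into direct scripts/rpc.py calls against the default socket (the rpc.py invocation form is an assumption about what the rpc_cmd wrapper does; the RPC names and arguments are the ones visible in this trace):

    SPDK_ROOT=/var/jenkins/workspace/nvmf-phy-autotest/spdk   # path taken from this log
    rpc="$SPDK_ROOT/scripts/rpc.py"                           # default socket /var/tmp/spdk.sock

    # RDMA transport; in-capsule data size 0 for this subtest (the target bumps
    # it to the 256-byte minimum, per the warning above)
    $rpc nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -c 0
    # 512 MiB backing bdev: 1048576 blocks of 512 bytes
    $rpc bdev_malloc_create 512 512 -b Malloc1
    # Subsystem with the serial number the host later greps for
    $rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME
    $rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1
    # Listen on the target-side address set up earlier (10.0.0.2:4420 over RDMA)
    $rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
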
00:09:20.128 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:20.129 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@53 -- # rpc_cmd bdev_malloc_create 512 512 -b Malloc1 00:09:20.129 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:20.129 15:14:21 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.389 Malloc1 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@54 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@55 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@56 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.389 [2024-09-27 15:14:22.036846] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@58 -- # get_bdev_size Malloc1 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1378 -- # local bdev_name=Malloc1 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1379 -- # local bdev_info 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1380 -- # local bs 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- 
common/autotest_common.sh@1381 -- # local nb 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1382 -- # rpc_cmd bdev_get_bdevs -b Malloc1 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1382 -- # bdev_info='[ 00:09:20.389 { 00:09:20.389 "name": "Malloc1", 00:09:20.389 "aliases": [ 00:09:20.389 "77e21bae-cff7-46a2-8d5c-76f01bb13629" 00:09:20.389 ], 00:09:20.389 "product_name": "Malloc disk", 00:09:20.389 "block_size": 512, 00:09:20.389 "num_blocks": 1048576, 00:09:20.389 "uuid": "77e21bae-cff7-46a2-8d5c-76f01bb13629", 00:09:20.389 "assigned_rate_limits": { 00:09:20.389 "rw_ios_per_sec": 0, 00:09:20.389 "rw_mbytes_per_sec": 0, 00:09:20.389 "r_mbytes_per_sec": 0, 00:09:20.389 "w_mbytes_per_sec": 0 00:09:20.389 }, 00:09:20.389 "claimed": true, 00:09:20.389 "claim_type": "exclusive_write", 00:09:20.389 "zoned": false, 00:09:20.389 "supported_io_types": { 00:09:20.389 "read": true, 00:09:20.389 "write": true, 00:09:20.389 "unmap": true, 00:09:20.389 "flush": true, 00:09:20.389 "reset": true, 00:09:20.389 "nvme_admin": false, 00:09:20.389 "nvme_io": false, 00:09:20.389 "nvme_io_md": false, 00:09:20.389 "write_zeroes": true, 00:09:20.389 "zcopy": true, 00:09:20.389 "get_zone_info": false, 00:09:20.389 "zone_management": false, 00:09:20.389 "zone_append": false, 00:09:20.389 "compare": false, 00:09:20.389 "compare_and_write": false, 00:09:20.389 "abort": true, 00:09:20.389 "seek_hole": false, 00:09:20.389 "seek_data": false, 00:09:20.389 "copy": true, 00:09:20.389 "nvme_iov_md": false 00:09:20.389 }, 00:09:20.389 "memory_domains": [ 00:09:20.389 { 00:09:20.389 "dma_device_id": "system", 00:09:20.389 "dma_device_type": 1 00:09:20.389 }, 00:09:20.389 { 00:09:20.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:20.389 "dma_device_type": 2 00:09:20.389 } 00:09:20.389 ], 00:09:20.389 "driver_specific": {} 00:09:20.389 } 00:09:20.389 ]' 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1383 -- # jq '.[] .block_size' 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1383 -- # bs=512 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1384 -- # jq '.[] .num_blocks' 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1384 -- # nb=1048576 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1387 -- # bdev_size=512 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1388 -- # echo 512 00:09:20.389 15:14:22 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@58 -- # malloc_size=536870912 00:09:20.389 15:14:22 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@60 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:09:21.327 15:14:23 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@62 -- # waitforserial SPDKISFASTANDAWESOME 00:09:21.327 15:14:23 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1198 -- # local i=0 00:09:21.327 15:14:23 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:09:21.327 15:14:23 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:09:21.327 15:14:23 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1205 -- # sleep 2 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1208 -- # return 0 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@63 -- # lsblk -l -o NAME,SERIAL 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@63 -- # grep -oP '([\w]*)(?=\s+SPDKISFASTANDAWESOME)' 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@63 -- # nvme_name=nvme0n1 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@64 -- # sec_size_to_bytes nvme0n1 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- setup/common.sh@76 -- # local dev=nvme0n1 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- setup/common.sh@80 -- # echo 536870912 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@64 -- # nvme_size=536870912 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@66 -- # mkdir -p /mnt/device 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@67 -- # (( nvme_size == malloc_size )) 00:09:23.892 
15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@68 -- # parted -s /dev/nvme0n1 mklabel gpt mkpart SPDK_TEST 0% 100% 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@69 -- # partprobe 00:09:23.892 15:14:25 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@70 -- # sleep 1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@76 -- # '[' 0 -eq 0 ']' 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@77 -- # run_test filesystem_ext4 nvmf_filesystem_create ext4 nvme0n1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:24.832 ************************************ 00:09:24.832 START TEST filesystem_ext4 00:09:24.832 ************************************ 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@1125 -- # nvmf_filesystem_create ext4 nvme0n1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@18 -- # fstype=ext4 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@19 -- # nvme_name=nvme0n1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@21 -- # make_filesystem ext4 /dev/nvme0n1p1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@926 -- # local fstype=ext4 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@927 -- # local dev_name=/dev/nvme0n1p1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@928 -- # local i=0 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@929 -- # local force 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@931 -- # '[' ext4 = ext4 ']' 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@932 -- # force=-F 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@937 -- # mkfs.ext4 -F /dev/nvme0n1p1 00:09:24.832 mke2fs 1.47.0 (5-Feb-2023) 00:09:24.832 Discarding device blocks: 0/522240 done 00:09:24.832 Creating filesystem with 522240 1k blocks and 130560 inodes 00:09:24.832 Filesystem UUID: c6b8ac4b-2c5e-4ddf-8d6f-2426cf8441a5 00:09:24.832 Superblock backups stored on blocks: 00:09:24.832 8193, 24577, 40961, 57345, 73729, 
204801, 221185, 401409 00:09:24.832 00:09:24.832 Allocating group tables: 0/64 done 00:09:24.832 Writing inode tables: 0/64 done 00:09:24.832 Creating journal (8192 blocks): done 00:09:24.832 Writing superblocks and filesystem accounting information: 0/64 done 00:09:24.832 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@945 -- # return 0 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@23 -- # mount /dev/nvme0n1p1 /mnt/device 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@24 -- # touch /mnt/device/aaa 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@25 -- # sync 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@26 -- # rm /mnt/device/aaa 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@27 -- # sync 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@29 -- # i=0 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@30 -- # umount /mnt/device 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@37 -- # kill -0 1721474 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@40 -- # lsblk -l -o NAME 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@40 -- # grep -q -w nvme0n1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@43 -- # lsblk -l -o NAME 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- target/filesystem.sh@43 -- # grep -q -w nvme0n1p1 00:09:24.832 00:09:24.832 real 0m0.209s 00:09:24.832 user 0m0.024s 00:09:24.832 sys 0m0.077s 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_ext4 -- common/autotest_common.sh@10 -- # set +x 00:09:24.832 ************************************ 00:09:24.832 END TEST filesystem_ext4 00:09:24.832 ************************************ 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@78 -- # run_test filesystem_btrfs nvmf_filesystem_create btrfs nvme0n1 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:24.832 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 
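Before any of these filesystem subtests can run, the host side has to see the exported namespace as a local block device; the trace above does this with nvme-cli and then maps the SPDK serial number back to a device name. The same steps, pulled out of the xtrace into a standalone form (the hostnqn/hostid values are the ones this rig uses, and the retry loop is a simplification of the suite's waitforserial helper):

    # Connect to the subsystem exported on 10.0.0.2:4420 over RDMA
    nvme connect -i 15 \
        --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e \
        --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e \
        -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420

    # Wait until a block device carrying the SPDK serial shows up
    until [ "$(lsblk -l -o NAME,SERIAL | grep -c SPDKISFASTANDAWESOME)" -ge 1 ]; do
        sleep 2
    done

    # Resolve the device name (e.g. nvme0n1) from that serial
    nvme_name=$(lsblk -l -o NAME,SERIAL | grep -oP '([\w]*)(?=\s+SPDKISFASTANDAWESOME)')
    echo "using /dev/${nvme_name}"
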
00:09:25.091 ************************************ 00:09:25.091 START TEST filesystem_btrfs 00:09:25.091 ************************************ 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@1125 -- # nvmf_filesystem_create btrfs nvme0n1 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@18 -- # fstype=btrfs 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@19 -- # nvme_name=nvme0n1 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@21 -- # make_filesystem btrfs /dev/nvme0n1p1 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@926 -- # local fstype=btrfs 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@927 -- # local dev_name=/dev/nvme0n1p1 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@928 -- # local i=0 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@929 -- # local force 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@931 -- # '[' btrfs = ext4 ']' 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@934 -- # force=-f 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@937 -- # mkfs.btrfs -f /dev/nvme0n1p1 00:09:25.091 btrfs-progs v6.8.1 00:09:25.091 See https://btrfs.readthedocs.io for more information. 00:09:25.091 00:09:25.091 Performing full device TRIM /dev/nvme0n1p1 (510.00MiB) ... 
00:09:25.091 NOTE: several default settings have changed in version 5.15, please make sure 00:09:25.091 this does not affect your deployments: 00:09:25.091 - DUP for metadata (-m dup) 00:09:25.091 - enabled no-holes (-O no-holes) 00:09:25.091 - enabled free-space-tree (-R free-space-tree) 00:09:25.091 00:09:25.091 Label: (null) 00:09:25.091 UUID: 64bf8fdf-eb01-4c5d-a6d0-b70d1ada5b84 00:09:25.091 Node size: 16384 00:09:25.091 Sector size: 4096 (CPU page size: 4096) 00:09:25.091 Filesystem size: 510.00MiB 00:09:25.091 Block group profiles: 00:09:25.091 Data: single 8.00MiB 00:09:25.091 Metadata: DUP 32.00MiB 00:09:25.091 System: DUP 8.00MiB 00:09:25.091 SSD detected: yes 00:09:25.091 Zoned device: no 00:09:25.091 Features: extref, skinny-metadata, no-holes, free-space-tree 00:09:25.091 Checksum: crc32c 00:09:25.091 Number of devices: 1 00:09:25.091 Devices: 00:09:25.091 ID SIZE PATH 00:09:25.091 1 510.00MiB /dev/nvme0n1p1 00:09:25.091 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@945 -- # return 0 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@23 -- # mount /dev/nvme0n1p1 /mnt/device 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@24 -- # touch /mnt/device/aaa 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@25 -- # sync 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@26 -- # rm /mnt/device/aaa 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@27 -- # sync 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@29 -- # i=0 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@30 -- # umount /mnt/device 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@37 -- # kill -0 1721474 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@40 -- # lsblk -l -o NAME 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@40 -- # grep -q -w nvme0n1 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@43 -- # lsblk -l -o NAME 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- target/filesystem.sh@43 -- # grep -q -w nvme0n1p1 00:09:25.091 00:09:25.091 real 0m0.255s 00:09:25.091 user 0m0.037s 00:09:25.091 sys 0m0.117s 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:25.091 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_btrfs -- common/autotest_common.sh@10 -- # set +x 00:09:25.091 ************************************ 00:09:25.091 END TEST filesystem_btrfs 
00:09:25.091 ************************************ 00:09:25.351 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@79 -- # run_test filesystem_xfs nvmf_filesystem_create xfs nvme0n1 00:09:25.351 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:25.351 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:25.351 15:14:26 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:25.351 ************************************ 00:09:25.351 START TEST filesystem_xfs 00:09:25.351 ************************************ 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@1125 -- # nvmf_filesystem_create xfs nvme0n1 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@18 -- # fstype=xfs 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@19 -- # nvme_name=nvme0n1 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@21 -- # make_filesystem xfs /dev/nvme0n1p1 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@926 -- # local fstype=xfs 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@927 -- # local dev_name=/dev/nvme0n1p1 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@928 -- # local i=0 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@929 -- # local force 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@931 -- # '[' xfs = ext4 ']' 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@934 -- # force=-f 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@937 -- # mkfs.xfs -f /dev/nvme0n1p1 00:09:25.351 meta-data=/dev/nvme0n1p1 isize=512 agcount=4, agsize=32640 blks 00:09:25.351 = sectsz=512 attr=2, projid32bit=1 00:09:25.351 = crc=1 finobt=1, sparse=1, rmapbt=0 00:09:25.351 = reflink=1 bigtime=1 inobtcount=1 nrext64=0 00:09:25.351 data = bsize=4096 blocks=130560, imaxpct=25 00:09:25.351 = sunit=0 swidth=0 blks 00:09:25.351 naming =version 2 bsize=4096 ascii-ci=0, ftype=1 00:09:25.351 log =internal log bsize=4096 blocks=16384, version=2 00:09:25.351 = sectsz=512 sunit=0 blks, lazy-count=1 00:09:25.351 realtime =none extsz=4096 blocks=0, rtextents=0 00:09:25.351 Discarding blocks...Done. 
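The size guard earlier in the trace, the (( nvme_size == malloc_size )) check before partitioning, amounts to comparing the connected device's capacity against the 512 MiB Malloc1 bdev. A minimal sketch of that idea using the standard sysfs sector count (reported in 512-byte sectors); the suite's own sec_size_to_bytes helper in setup/common.sh may compute it differently:

    # /sys/block/<dev>/size is reported in 512-byte sectors
    dev=nvme0n1
    nvme_size=$(( $(cat /sys/block/$dev/size) * 512 ))
    malloc_size=536870912   # 512 MiB Malloc1 bdev created on the target
    (( nvme_size == malloc_size )) || { echo "size mismatch: $nvme_size"; exit 1; }
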
00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@945 -- # return 0 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@23 -- # mount /dev/nvme0n1p1 /mnt/device 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@24 -- # touch /mnt/device/aaa 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@25 -- # sync 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@26 -- # rm /mnt/device/aaa 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@27 -- # sync 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@29 -- # i=0 00:09:25.351 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@30 -- # umount /mnt/device 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@37 -- # kill -0 1721474 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@40 -- # lsblk -l -o NAME 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@40 -- # grep -q -w nvme0n1 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@43 -- # lsblk -l -o NAME 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- target/filesystem.sh@43 -- # grep -q -w nvme0n1p1 00:09:25.610 00:09:25.610 real 0m0.207s 00:09:25.610 user 0m0.026s 00:09:25.610 sys 0m0.080s 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule.filesystem_xfs -- common/autotest_common.sh@10 -- # set +x 00:09:25.610 ************************************ 00:09:25.610 END TEST filesystem_xfs 00:09:25.610 ************************************ 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@91 -- # flock /dev/nvme0n1 parted -s /dev/nvme0n1 rm 1 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@93 -- # sync 00:09:25.610 15:14:27 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@94 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:09:26.576 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@95 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1219 -- # local i=0 00:09:26.576 15:14:28 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1231 -- # return 0 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@97 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@99 -- # trap - SIGINT SIGTERM EXIT 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@101 -- # killprocess 1721474 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@950 -- # '[' -z 1721474 ']' 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@954 -- # kill -0 1721474 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@955 -- # uname 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1721474 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1721474' 00:09:26.576 killing process with pid 1721474 00:09:26.576 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@969 -- # kill 1721474 00:09:26.577 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@974 -- # wait 1721474 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- target/filesystem.sh@102 -- # nvmfpid= 00:09:27.144 00:09:27.144 real 0m8.009s 00:09:27.144 user 0m31.077s 00:09:27.144 sys 0m1.297s 00:09:27.144 15:14:28 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_no_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:27.144 ************************************ 00:09:27.144 END TEST nvmf_filesystem_no_in_capsule 00:09:27.144 ************************************ 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@106 -- # run_test nvmf_filesystem_in_capsule nvmf_filesystem_part 4096 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@10 -- # set +x 00:09:27.144 ************************************ 00:09:27.144 START TEST nvmf_filesystem_in_capsule 00:09:27.144 ************************************ 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1125 -- # nvmf_filesystem_part 4096 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@47 -- # in_capsule=4096 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@49 -- # nvmfappstart -m 0xF 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- nvmf/common.sh@324 -- # nvmfpid=1722778 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- nvmf/common.sh@325 -- # waitforlisten 1722778 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@831 -- # '[' -z 1722778 ']' 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:27.144 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:27.144 15:14:28 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:27.144 [2024-09-27 15:14:28.942648] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:09:27.144 [2024-09-27 15:14:28.942709] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:27.404 [2024-09-27 15:14:29.027624] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:27.404 [2024-09-27 15:14:29.118354] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:09:27.404 [2024-09-27 15:14:29.118398] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:09:27.404 [2024-09-27 15:14:29.118408] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:09:27.404 [2024-09-27 15:14:29.118416] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:09:27.404 [2024-09-27 15:14:29.118423] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:09:27.404 [2024-09-27 15:14:29.118544] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:27.404 [2024-09-27 15:14:29.118644] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:27.404 [2024-09-27 15:14:29.118731] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:27.404 [2024-09-27 15:14:29.118733] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.969 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:27.969 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@864 -- # return 0 00:09:27.969 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:09:27.969 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:27.969 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.228 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:09:28.228 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@50 -- # malloc_name=Malloc1 00:09:28.228 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@52 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -c 4096 00:09:28.228 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:28.228 15:14:29 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.228 [2024-09-27 15:14:29.883175] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1e3d4a0/0x1e41990) 
succeed. 00:09:28.228 [2024-09-27 15:14:29.894313] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1e3eae0/0x1e83030) succeed. 00:09:28.228 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:28.228 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@53 -- # rpc_cmd bdev_malloc_create 512 512 -b Malloc1 00:09:28.228 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:28.228 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.488 Malloc1 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@54 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@55 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@56 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.488 [2024-09-27 15:14:30.191769] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@58 -- # get_bdev_size Malloc1 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1378 -- # local bdev_name=Malloc1 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1379 -- # local bdev_info 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1380 -- # local bs 00:09:28.488 
15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1381 -- # local nb 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1382 -- # rpc_cmd bdev_get_bdevs -b Malloc1 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1382 -- # bdev_info='[ 00:09:28.488 { 00:09:28.488 "name": "Malloc1", 00:09:28.488 "aliases": [ 00:09:28.488 "a1dd8b56-a72d-4023-bddd-c35a782a7be4" 00:09:28.488 ], 00:09:28.488 "product_name": "Malloc disk", 00:09:28.488 "block_size": 512, 00:09:28.488 "num_blocks": 1048576, 00:09:28.488 "uuid": "a1dd8b56-a72d-4023-bddd-c35a782a7be4", 00:09:28.488 "assigned_rate_limits": { 00:09:28.488 "rw_ios_per_sec": 0, 00:09:28.488 "rw_mbytes_per_sec": 0, 00:09:28.488 "r_mbytes_per_sec": 0, 00:09:28.488 "w_mbytes_per_sec": 0 00:09:28.488 }, 00:09:28.488 "claimed": true, 00:09:28.488 "claim_type": "exclusive_write", 00:09:28.488 "zoned": false, 00:09:28.488 "supported_io_types": { 00:09:28.488 "read": true, 00:09:28.488 "write": true, 00:09:28.488 "unmap": true, 00:09:28.488 "flush": true, 00:09:28.488 "reset": true, 00:09:28.488 "nvme_admin": false, 00:09:28.488 "nvme_io": false, 00:09:28.488 "nvme_io_md": false, 00:09:28.488 "write_zeroes": true, 00:09:28.488 "zcopy": true, 00:09:28.488 "get_zone_info": false, 00:09:28.488 "zone_management": false, 00:09:28.488 "zone_append": false, 00:09:28.488 "compare": false, 00:09:28.488 "compare_and_write": false, 00:09:28.488 "abort": true, 00:09:28.488 "seek_hole": false, 00:09:28.488 "seek_data": false, 00:09:28.488 "copy": true, 00:09:28.488 "nvme_iov_md": false 00:09:28.488 }, 00:09:28.488 "memory_domains": [ 00:09:28.488 { 00:09:28.488 "dma_device_id": "system", 00:09:28.488 "dma_device_type": 1 00:09:28.488 }, 00:09:28.488 { 00:09:28.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:28.488 "dma_device_type": 2 00:09:28.488 } 00:09:28.488 ], 00:09:28.488 "driver_specific": {} 00:09:28.488 } 00:09:28.488 ]' 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1383 -- # jq '.[] .block_size' 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1383 -- # bs=512 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1384 -- # jq '.[] .num_blocks' 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1384 -- # nb=1048576 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1387 -- # bdev_size=512 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1388 -- # echo 512 00:09:28.488 15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@58 -- # malloc_size=536870912 00:09:28.488 
15:14:30 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@60 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:09:29.866 15:14:31 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@62 -- # waitforserial SPDKISFASTANDAWESOME 00:09:29.866 15:14:31 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1198 -- # local i=0 00:09:29.866 15:14:31 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:09:29.866 15:14:31 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:09:29.866 15:14:31 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1205 -- # sleep 2 00:09:31.888 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1208 -- # return 0 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@63 -- # lsblk -l -o NAME,SERIAL 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@63 -- # grep -oP '([\w]*)(?=\s+SPDKISFASTANDAWESOME)' 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@63 -- # nvme_name=nvme0n1 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@64 -- # sec_size_to_bytes nvme0n1 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- setup/common.sh@76 -- # local dev=nvme0n1 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- setup/common.sh@80 -- # echo 536870912 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@64 -- # nvme_size=536870912 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@66 -- # mkdir -p /mnt/device 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@67 -- # (( nvme_size == malloc_size )) 00:09:31.889 15:14:33 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@68 -- # parted -s /dev/nvme0n1 mklabel gpt mkpart SPDK_TEST 0% 100% 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@69 -- # partprobe 00:09:31.889 15:14:33 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@70 -- # sleep 1 00:09:32.824 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@76 -- # '[' 4096 -eq 0 ']' 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@81 -- # run_test filesystem_in_capsule_ext4 nvmf_filesystem_create ext4 nvme0n1 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:32.825 ************************************ 00:09:32.825 START TEST filesystem_in_capsule_ext4 00:09:32.825 ************************************ 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@1125 -- # nvmf_filesystem_create ext4 nvme0n1 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@18 -- # fstype=ext4 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@19 -- # nvme_name=nvme0n1 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@21 -- # make_filesystem ext4 /dev/nvme0n1p1 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@926 -- # local fstype=ext4 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@927 -- # local dev_name=/dev/nvme0n1p1 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@928 -- # local i=0 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@929 -- # local force 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@931 -- # '[' ext4 = ext4 ']' 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@932 -- # force=-F 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@937 -- # mkfs.ext4 -F /dev/nvme0n1p1 00:09:32.825 mke2fs 1.47.0 (5-Feb-2023) 00:09:32.825 Discarding device blocks: 0/522240 done 00:09:32.825 Creating filesystem with 522240 1k blocks and 130560 inodes 00:09:32.825 Filesystem UUID: 8250bdfd-ca4f-4663-b364-5e00982a60b3 00:09:32.825 
Superblock backups stored on blocks: 00:09:32.825 8193, 24577, 40961, 57345, 73729, 204801, 221185, 401409 00:09:32.825 00:09:32.825 Allocating group tables: 0/64 done 00:09:32.825 Writing inode tables: 0/64 done 00:09:32.825 Creating journal (8192 blocks): done 00:09:32.825 Writing superblocks and filesystem accounting information: 0/64 done 00:09:32.825 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@945 -- # return 0 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@23 -- # mount /dev/nvme0n1p1 /mnt/device 00:09:32.825 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@24 -- # touch /mnt/device/aaa 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@25 -- # sync 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@26 -- # rm /mnt/device/aaa 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@27 -- # sync 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@29 -- # i=0 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@30 -- # umount /mnt/device 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@37 -- # kill -0 1722778 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@40 -- # lsblk -l -o NAME 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@40 -- # grep -q -w nvme0n1 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@43 -- # lsblk -l -o NAME 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- target/filesystem.sh@43 -- # grep -q -w nvme0n1p1 00:09:33.085 00:09:33.085 real 0m0.209s 00:09:33.085 user 0m0.029s 00:09:33.085 sys 0m0.074s 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_ext4 -- common/autotest_common.sh@10 -- # set +x 00:09:33.085 ************************************ 00:09:33.085 END TEST filesystem_in_capsule_ext4 00:09:33.085 ************************************ 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@82 -- # run_test filesystem_in_capsule_btrfs nvmf_filesystem_create btrfs nvme0n1 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:33.085 15:14:34 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:33.085 ************************************ 00:09:33.085 START TEST filesystem_in_capsule_btrfs 00:09:33.085 ************************************ 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@1125 -- # nvmf_filesystem_create btrfs nvme0n1 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@18 -- # fstype=btrfs 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@19 -- # nvme_name=nvme0n1 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@21 -- # make_filesystem btrfs /dev/nvme0n1p1 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@926 -- # local fstype=btrfs 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@927 -- # local dev_name=/dev/nvme0n1p1 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@928 -- # local i=0 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@929 -- # local force 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@931 -- # '[' btrfs = ext4 ']' 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@934 -- # force=-f 00:09:33.085 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@937 -- # mkfs.btrfs -f /dev/nvme0n1p1 00:09:33.345 btrfs-progs v6.8.1 00:09:33.345 See https://btrfs.readthedocs.io for more information. 00:09:33.345 00:09:33.345 Performing full device TRIM /dev/nvme0n1p1 (510.00MiB) ... 
00:09:33.345 NOTE: several default settings have changed in version 5.15, please make sure 00:09:33.345 this does not affect your deployments: 00:09:33.345 - DUP for metadata (-m dup) 00:09:33.345 - enabled no-holes (-O no-holes) 00:09:33.345 - enabled free-space-tree (-R free-space-tree) 00:09:33.345 00:09:33.345 Label: (null) 00:09:33.345 UUID: 4d295102-6ec8-4528-ad37-53627818c7b8 00:09:33.345 Node size: 16384 00:09:33.345 Sector size: 4096 (CPU page size: 4096) 00:09:33.345 Filesystem size: 510.00MiB 00:09:33.345 Block group profiles: 00:09:33.345 Data: single 8.00MiB 00:09:33.345 Metadata: DUP 32.00MiB 00:09:33.345 System: DUP 8.00MiB 00:09:33.345 SSD detected: yes 00:09:33.345 Zoned device: no 00:09:33.345 Features: extref, skinny-metadata, no-holes, free-space-tree 00:09:33.345 Checksum: crc32c 00:09:33.345 Number of devices: 1 00:09:33.345 Devices: 00:09:33.345 ID SIZE PATH 00:09:33.345 1 510.00MiB /dev/nvme0n1p1 00:09:33.345 00:09:33.345 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@945 -- # return 0 00:09:33.345 15:14:34 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@23 -- # mount /dev/nvme0n1p1 /mnt/device 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@24 -- # touch /mnt/device/aaa 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@25 -- # sync 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@26 -- # rm /mnt/device/aaa 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@27 -- # sync 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@29 -- # i=0 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@30 -- # umount /mnt/device 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@37 -- # kill -0 1722778 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@40 -- # lsblk -l -o NAME 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@40 -- # grep -q -w nvme0n1 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@43 -- # lsblk -l -o NAME 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- target/filesystem.sh@43 -- # grep -q -w nvme0n1p1 00:09:33.345 00:09:33.345 real 0m0.266s 00:09:33.345 user 0m0.034s 00:09:33.345 sys 0m0.125s 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_btrfs -- 
common/autotest_common.sh@10 -- # set +x 00:09:33.345 ************************************ 00:09:33.345 END TEST filesystem_in_capsule_btrfs 00:09:33.345 ************************************ 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@83 -- # run_test filesystem_in_capsule_xfs nvmf_filesystem_create xfs nvme0n1 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:33.345 ************************************ 00:09:33.345 START TEST filesystem_in_capsule_xfs 00:09:33.345 ************************************ 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@1125 -- # nvmf_filesystem_create xfs nvme0n1 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@18 -- # fstype=xfs 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@19 -- # nvme_name=nvme0n1 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@21 -- # make_filesystem xfs /dev/nvme0n1p1 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@926 -- # local fstype=xfs 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@927 -- # local dev_name=/dev/nvme0n1p1 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@928 -- # local i=0 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@929 -- # local force 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@931 -- # '[' xfs = ext4 ']' 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@934 -- # force=-f 00:09:33.345 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@937 -- # mkfs.xfs -f /dev/nvme0n1p1 00:09:33.605 meta-data=/dev/nvme0n1p1 isize=512 agcount=4, agsize=32640 blks 00:09:33.605 = sectsz=512 attr=2, projid32bit=1 00:09:33.605 = crc=1 finobt=1, sparse=1, rmapbt=0 00:09:33.605 = reflink=1 bigtime=1 inobtcount=1 nrext64=0 00:09:33.605 data = bsize=4096 blocks=130560, imaxpct=25 00:09:33.605 = sunit=0 swidth=0 blks 00:09:33.605 naming =version 2 bsize=4096 ascii-ci=0, ftype=1 00:09:33.605 log =internal log bsize=4096 blocks=16384, version=2 00:09:33.605 = sectsz=512 sunit=0 blks, lazy-count=1 00:09:33.605 realtime =none extsz=4096 blocks=0, rtextents=0 00:09:33.605 Discarding blocks...Done. 
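[annotation] The xtrace above shows make_filesystem picking a force flag per filesystem type and running the matching mkfs before the mount/touch/sync/rm/umount exercise that follows. A condensed sketch of that helper, assuming only what the trace shows; the retry counter i is declared in the trace but its retry loop is omitted here:

  # Condensed form of the make_filesystem helper as traced (retry handling omitted)
  make_filesystem() {
      local fstype=$1 dev_name=$2 force
      # ext4 forces with -F, everything else traced here (btrfs, xfs) forces with -f
      if [ "$fstype" = ext4 ]; then force=-F; else force=-f; fi
      mkfs."$fstype" "$force" "$dev_name"
  }
  make_filesystem xfs /dev/nvme0n1p1   # produces the mkfs.xfs geometry printed above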
00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@945 -- # return 0 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@23 -- # mount /dev/nvme0n1p1 /mnt/device 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@24 -- # touch /mnt/device/aaa 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@25 -- # sync 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@26 -- # rm /mnt/device/aaa 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@27 -- # sync 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@29 -- # i=0 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@30 -- # umount /mnt/device 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@37 -- # kill -0 1722778 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@40 -- # lsblk -l -o NAME 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@40 -- # grep -q -w nvme0n1 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@43 -- # lsblk -l -o NAME 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- target/filesystem.sh@43 -- # grep -q -w nvme0n1p1 00:09:33.605 00:09:33.605 real 0m0.213s 00:09:33.605 user 0m0.026s 00:09:33.605 sys 0m0.074s 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule.filesystem_in_capsule_xfs -- common/autotest_common.sh@10 -- # set +x 00:09:33.605 ************************************ 00:09:33.605 END TEST filesystem_in_capsule_xfs 00:09:33.605 ************************************ 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@91 -- # flock /dev/nvme0n1 parted -s /dev/nvme0n1 rm 1 00:09:33.605 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@93 -- # sync 00:09:33.864 15:14:35 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@94 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:09:34.801 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@95 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:09:34.801 15:14:36 
nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1219 -- # local i=0 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1231 -- # return 0 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@97 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@99 -- # trap - SIGINT SIGTERM EXIT 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@101 -- # killprocess 1722778 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@950 -- # '[' -z 1722778 ']' 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@954 -- # kill -0 1722778 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@955 -- # uname 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1722778 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1722778' 00:09:34.801 killing process with pid 1722778 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@969 -- # kill 1722778 00:09:34.801 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@974 -- # wait 1722778 00:09:35.371 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- target/filesystem.sh@102 -- # nvmfpid= 00:09:35.371 00:09:35.371 real 0m8.084s 
00:09:35.371 user 0m31.302s 00:09:35.371 sys 0m1.329s 00:09:35.371 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:35.371 15:14:36 nvmf_rdma.nvmf_target_extra.nvmf_filesystem.nvmf_filesystem_in_capsule -- common/autotest_common.sh@10 -- # set +x 00:09:35.371 ************************************ 00:09:35.371 END TEST nvmf_filesystem_in_capsule 00:09:35.371 ************************************ 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- target/filesystem.sh@108 -- # nvmftestfini 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@331 -- # nvmfcleanup 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@99 -- # sync 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@102 -- # set +e 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@103 -- # for i in {1..20} 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:09:35.371 rmmod nvme_rdma 00:09:35.371 rmmod nvme_fabrics 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@106 -- # set -e 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@107 -- # return 0 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@332 -- # '[' -n '' ']' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@338 -- # nvmf_fini 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@264 -- # local dev 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@267 -- # remove_target_ns 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@268 -- # delete_main_bridge 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@130 -- # return 0 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- 
nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@41 -- # _dev=0 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@41 -- # dev_map=() 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/setup.sh@284 -- # iptr 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@538 -- # iptables-save 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- nvmf/common.sh@538 -- # iptables-restore 00:09:35.371 00:09:35.371 real 0m23.881s 00:09:35.371 user 1m4.696s 00:09:35.371 sys 0m8.306s 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_filesystem -- common/autotest_common.sh@10 -- # set +x 00:09:35.371 ************************************ 00:09:35.371 END TEST nvmf_filesystem 00:09:35.371 ************************************ 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@18 -- # run_test nvmf_target_discovery /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/discovery.sh --transport=rdma 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:09:35.371 ************************************ 00:09:35.371 START TEST nvmf_target_discovery 00:09:35.371 ************************************ 00:09:35.371 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/discovery.sh --transport=rdma 00:09:35.632 * Looking for test storage... 
00:09:35.632 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1681 -- # lcov --version 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@336 -- # IFS=.-: 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@336 -- # read -ra ver1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@337 -- # IFS=.-: 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@337 -- # read -ra ver2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@338 -- # local 'op=<' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@340 -- # ver1_l=2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@341 -- # ver2_l=1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@344 -- # case "$op" in 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@345 -- # : 1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@365 -- # decimal 1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@353 -- # local d=1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@355 -- # echo 1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@365 -- # ver1[v]=1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@366 -- # decimal 2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@353 -- # local d=2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@355 -- # echo 2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@366 -- # ver2[v]=2 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@368 -- # return 0 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:09:35.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:35.632 --rc genhtml_branch_coverage=1 00:09:35.632 --rc genhtml_function_coverage=1 00:09:35.632 --rc genhtml_legend=1 00:09:35.632 --rc geninfo_all_blocks=1 00:09:35.632 --rc geninfo_unexecuted_blocks=1 00:09:35.632 00:09:35.632 ' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:09:35.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:35.632 --rc genhtml_branch_coverage=1 00:09:35.632 --rc genhtml_function_coverage=1 00:09:35.632 --rc genhtml_legend=1 00:09:35.632 --rc geninfo_all_blocks=1 00:09:35.632 --rc geninfo_unexecuted_blocks=1 00:09:35.632 00:09:35.632 ' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:09:35.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:35.632 --rc genhtml_branch_coverage=1 00:09:35.632 --rc genhtml_function_coverage=1 00:09:35.632 --rc genhtml_legend=1 00:09:35.632 --rc geninfo_all_blocks=1 00:09:35.632 --rc geninfo_unexecuted_blocks=1 00:09:35.632 00:09:35.632 ' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:09:35.632 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:35.632 --rc genhtml_branch_coverage=1 00:09:35.632 --rc genhtml_function_coverage=1 00:09:35.632 --rc genhtml_legend=1 00:09:35.632 --rc geninfo_all_blocks=1 00:09:35.632 --rc geninfo_unexecuted_blocks=1 00:09:35.632 00:09:35.632 ' 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@9 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@7 -- # uname -s 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:35.632 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@15 -- # shopt -s extglob 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- paths/export.sh@5 -- # export PATH 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@50 -- # : 0 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:09:35.633 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@54 -- # have_pci_nics=0 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@11 -- # NULL_BDEV_SIZE=102400 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@12 -- # NULL_BLOCK_SIZE=512 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@13 -- # NVMF_PORT_REFERRAL=4430 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@15 -- # nvmftestinit 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@292 -- # prepare_net_devs 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@254 -- # local -g is_hw=no 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@256 -- # remove_target_ns 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@125 -- # xtrace_disable 00:09:35.633 15:14:37 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@131 -- # pci_devs=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@131 -- # local -a pci_devs 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@132 -- # pci_net_devs=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@133 -- # pci_drivers=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@133 -- # local -A pci_drivers 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@135 -- # net_devs=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@135 -- # local -ga net_devs 00:09:43.759 15:14:44 
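The "[: : integer expression expected" message above is test(1) rejecting an empty operand: the xtrace shows nvmf/common.sh line 31 running '[' '' -eq 1 ']'. A minimal sketch of the usual guard, with a placeholder variable name standing in for whatever the script actually checks:

    # Hypothetical sketch; SOME_NUMERIC_FLAG is a placeholder, not the variable used by nvmf/common.sh.
    # Expanding with a default keeps the numeric test from ever seeing an empty string.
    SOME_NUMERIC_FLAG=""
    if [ "${SOME_NUMERIC_FLAG:-0}" -eq 1 ]; then
        echo "flag is set"
    else
        echo "flag is unset or zero"
    fi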
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@136 -- # e810=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@136 -- # local -ga e810 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@137 -- # x722=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@137 -- # local -ga x722 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@138 -- # mlx=() 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@138 -- # local -ga mlx 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:09:43.759 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:09:43.760 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:09:43.760 15:14:44 
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:09:43.760 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:09:43.760 Found net devices under 0000:18:00.0: mlx_0_0 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 
00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:09:43.760 Found net devices under 0000:18:00.1: mlx_0_1 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@249 -- # get_rdma_if_list 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@75 -- # rdma_devs=() 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@89 -- # continue 2 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@89 -- # continue 2 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@258 -- # is_hw=yes 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@261 
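The scan above maps each Mellanox PCI function to its kernel net device by globbing sysfs. A standalone sketch of the same lookup, hard-coding the two 0000:18:00.x addresses reported in the log:

    # Sketch only: on another host the address list would come from the harness's
    # pci_bus_cache rather than being written out by hand.
    for pci in 0000:18:00.0 0000:18:00.1; do
        for netdir in /sys/bus/pci/devices/"$pci"/net/*; do
            [ -e "$netdir" ] || continue        # glob did not match: no net device bound to this function
            echo "Found net devices under $pci: $(basename "$netdir")"
        done
    done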
-- # [[ rdma == tcp ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@61 -- # uname 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@65 -- # modprobe ib_cm 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@66 -- # modprobe ib_core 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@67 -- # modprobe ib_umad 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@69 -- # modprobe iw_cm 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@27 -- # local -gA dev_map 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@28 -- # local -g _dev 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@44 -- # ips=() 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:09:43.760 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- 
nvmf/setup.sh@58 -- # key_initiator=target1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@11 -- # local val=167772161 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:09:43.761 10.0.0.1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@11 -- # local val=167772162 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 
00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:09:43.761 10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@38 -- # ping_ips 1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:43.761 15:14:44 
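set_ip and set_up above configure the test ports with plain iproute2 calls and record each address in ifalias so later helpers can read it back. A condensed replay of the commands shown in the trace:

    # Same device names and addresses as the trace; requires root on a host where
    # mlx_0_0/mlx_0_1 exist.
    ip addr add 10.0.0.1/24 dev mlx_0_0
    echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias
    ip addr add 10.0.0.2/24 dev mlx_0_1
    echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
    ip link set mlx_0_0 up
    ip link set mlx_0_1 up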
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@107 -- # local dev=target0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:09:43.761 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:09:43.761 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.035 ms 00:09:43.761 00:09:43.761 --- 10.0.0.2 ping statistics --- 00:09:43.761 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:09:43.761 rtt min/avg/max/mdev = 0.035/0.035/0.035/0.000 ms 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@107 -- # local dev=target0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:09:43.761 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:09:43.761 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.027 ms 00:09:43.761 00:09:43.761 --- 10.0.0.2 ping statistics --- 00:09:43.761 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:09:43.761 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@98 -- # (( pair++ )) 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@266 -- # return 0 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:09:43.761 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@107 -- # local dev=target0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:09:43.762 15:14:44 
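get_ip_address above resolves each address from the ifalias written during setup rather than from ip(8) output; a short sketch of that lookup for both ports:

    # Sketch: read back the addresses stored in ifalias by set_ip.
    for dev in mlx_0_0 mlx_0_1; do
        printf '%s -> %s\n' "$dev" "$(cat /sys/class/net/"$dev"/ifalias)"
    done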
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # get_net_dev target1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@107 -- # local dev=target1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@107 -- # local dev=target0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # 
ip=10.0.0.2 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # get_net_dev target1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@107 -- # local dev=target1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@16 -- # nvmfappstart -m 0xF 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:43.762 
15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@324 -- # nvmfpid=1726988 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@325 -- # waitforlisten 1726988 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@831 -- # '[' -z 1726988 ']' 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:43.762 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:43.762 15:14:44 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:43.762 [2024-09-27 15:14:44.543153] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:09:43.762 [2024-09-27 15:14:44.543219] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:43.762 [2024-09-27 15:14:44.630964] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:43.762 [2024-09-27 15:14:44.718586] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:09:43.762 [2024-09-27 15:14:44.718623] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:09:43.762 [2024-09-27 15:14:44.718633] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:09:43.762 [2024-09-27 15:14:44.718642] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:09:43.762 [2024-09-27 15:14:44.718648] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
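nvmfappstart above launches the target with shared-memory id 0, every tracepoint group enabled, and a 4-core mask, then waits for its RPC socket. A minimal sketch of doing the same by hand, assuming the default /var/tmp/spdk.sock socket and polling with the rpc_get_methods call:

    # Sketch: start nvmf_tgt in the background and poll until RPC answers.
    SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk
    "$SPDK"/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
    tgt_pid=$!
    until "$SPDK"/scripts/rpc.py rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done
    echo "nvmf_tgt ($tgt_pid) is up on /var/tmp/spdk.sock"

Once it is up, the startup notice's own hint applies: 'spdk_trace -s nvmf -i 0' captures a snapshot of the enabled tracepoints.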
00:09:43.762 [2024-09-27 15:14:44.718703] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:43.762 [2024-09-27 15:14:44.718806] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:43.762 [2024-09-27 15:14:44.718906] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.762 [2024-09-27 15:14:44.718907] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@864 -- # return 0 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@18 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:43.763 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:43.763 [2024-09-27 15:14:45.489112] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x14c64a0/0x14ca990) succeed. 00:09:43.763 [2024-09-27 15:14:45.499519] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x14c7ae0/0x150c030) succeed. 
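rpc_cmd above is the harness wrapper around scripts/rpc.py, and the two create_ib_device notices are the RDMA transport picking up mlx5_0/mlx5_1. A sketch of creating the same transport directly, with the options shown in the trace:

    # Sketch: identical transport options to the trace above.
    SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk
    "$SPDK"/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
    "$SPDK"/scripts/rpc.py nvmf_get_transports    # confirm the rdma transport is registered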
00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@21 -- # seq 1 4 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@21 -- # for i in $(seq 1 4) 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@22 -- # rpc_cmd bdev_null_create Null1 102400 512 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 Null1 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@24 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Null1 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@25 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 [2024-09-27 15:14:45.671356] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@21 -- # for i in $(seq 1 4) 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@22 -- # rpc_cmd bdev_null_create Null2 102400 512 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 Null2 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode2 -a -s SPDK00000000000002 00:09:44.022 15:14:45 
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@24 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode2 Null2 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@25 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode2 -t rdma -a 10.0.0.2 -s 4420 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@21 -- # for i in $(seq 1 4) 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@22 -- # rpc_cmd bdev_null_create Null3 102400 512 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.022 Null3 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.022 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode3 -a -s SPDK00000000000003 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@24 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode3 Null3 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@25 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode3 -t rdma -a 10.0.0.2 -s 4420 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- 
common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@21 -- # for i in $(seq 1 4) 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@22 -- # rpc_cmd bdev_null_create Null4 102400 512 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 Null4 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode4 -a -s SPDK00000000000004 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@24 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode4 Null4 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@25 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode4 -t rdma -a 10.0.0.2 -s 4420 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@27 -- # rpc_cmd nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@30 -- # rpc_cmd nvmf_discovery_add_referral -t rdma -a 10.0.0.2 -s 4430 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.023 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.023 15:14:45 
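The loop above issues the same four RPCs for Null1 through Null4, then exposes the discovery service on 4420 and a referral on 4430. A compact sketch of that configuration sequence driven through rpc.py directly:

    # Values (NQNs, serials, sizes, addresses) are the ones used in the trace above.
    SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk
    rpc() { "$SPDK"/scripts/rpc.py "$@"; }
    for i in 1 2 3 4; do
        rpc bdev_null_create "Null$i" 102400 512
        rpc nvmf_create_subsystem "nqn.2016-06.io.spdk:cnode$i" -a -s "SPDK0000000000000$i"
        rpc nvmf_subsystem_add_ns "nqn.2016-06.io.spdk:cnode$i" "Null$i"
        rpc nvmf_subsystem_add_listener "nqn.2016-06.io.spdk:cnode$i" -t rdma -a 10.0.0.2 -s 4420
    done
    rpc nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420
    rpc nvmf_discovery_add_referral -t rdma -a 10.0.0.2 -s 4430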
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@32 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 4420 00:09:44.281 00:09:44.281 Discovery Log Number of Records 6, Generation counter 6 00:09:44.281 =====Discovery Log Entry 0====== 00:09:44.281 trtype: rdma 00:09:44.281 adrfam: ipv4 00:09:44.281 subtype: current discovery subsystem 00:09:44.281 treq: not required 00:09:44.281 portid: 0 00:09:44.281 trsvcid: 4420 00:09:44.281 subnqn: nqn.2014-08.org.nvmexpress.discovery 00:09:44.281 traddr: 10.0.0.2 00:09:44.281 eflags: explicit discovery connections, duplicate discovery information 00:09:44.281 rdma_prtype: not specified 00:09:44.281 rdma_qptype: connected 00:09:44.281 rdma_cms: rdma-cm 00:09:44.281 rdma_pkey: 0x0000 00:09:44.281 =====Discovery Log Entry 1====== 00:09:44.281 trtype: rdma 00:09:44.281 adrfam: ipv4 00:09:44.281 subtype: nvme subsystem 00:09:44.281 treq: not required 00:09:44.281 portid: 0 00:09:44.281 trsvcid: 4420 00:09:44.281 subnqn: nqn.2016-06.io.spdk:cnode1 00:09:44.281 traddr: 10.0.0.2 00:09:44.281 eflags: none 00:09:44.281 rdma_prtype: not specified 00:09:44.281 rdma_qptype: connected 00:09:44.281 rdma_cms: rdma-cm 00:09:44.281 rdma_pkey: 0x0000 00:09:44.281 =====Discovery Log Entry 2====== 00:09:44.281 trtype: rdma 00:09:44.281 adrfam: ipv4 00:09:44.281 subtype: nvme subsystem 00:09:44.281 treq: not required 00:09:44.281 portid: 0 00:09:44.281 trsvcid: 4420 00:09:44.281 subnqn: nqn.2016-06.io.spdk:cnode2 00:09:44.281 traddr: 10.0.0.2 00:09:44.281 eflags: none 00:09:44.281 rdma_prtype: not specified 00:09:44.281 rdma_qptype: connected 00:09:44.281 rdma_cms: rdma-cm 00:09:44.281 rdma_pkey: 0x0000 00:09:44.281 =====Discovery Log Entry 3====== 00:09:44.281 trtype: rdma 00:09:44.281 adrfam: ipv4 00:09:44.281 subtype: nvme subsystem 00:09:44.281 treq: not required 00:09:44.281 portid: 0 00:09:44.281 trsvcid: 4420 00:09:44.281 subnqn: nqn.2016-06.io.spdk:cnode3 00:09:44.281 traddr: 10.0.0.2 00:09:44.281 eflags: none 00:09:44.281 rdma_prtype: not specified 00:09:44.281 rdma_qptype: connected 00:09:44.281 rdma_cms: rdma-cm 00:09:44.281 rdma_pkey: 0x0000 00:09:44.281 =====Discovery Log Entry 4====== 00:09:44.281 trtype: rdma 00:09:44.281 adrfam: ipv4 00:09:44.281 subtype: nvme subsystem 00:09:44.281 treq: not required 00:09:44.281 portid: 0 00:09:44.281 trsvcid: 4420 00:09:44.281 subnqn: nqn.2016-06.io.spdk:cnode4 00:09:44.281 traddr: 10.0.0.2 00:09:44.281 eflags: none 00:09:44.281 rdma_prtype: not specified 00:09:44.281 rdma_qptype: connected 00:09:44.281 rdma_cms: rdma-cm 00:09:44.281 rdma_pkey: 0x0000 00:09:44.281 =====Discovery Log Entry 5====== 00:09:44.281 trtype: rdma 00:09:44.281 adrfam: ipv4 00:09:44.281 subtype: discovery subsystem referral 00:09:44.281 treq: not required 00:09:44.281 portid: 0 00:09:44.281 trsvcid: 4430 00:09:44.281 subnqn: nqn.2014-08.org.nvmexpress.discovery 00:09:44.281 traddr: 10.0.0.2 00:09:44.281 eflags: none 00:09:44.281 rdma_prtype: unrecognized 00:09:44.281 rdma_qptype: unrecognized 00:09:44.281 rdma_cms: unrecognized 00:09:44.281 rdma_pkey: 0x0000 00:09:44.281 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@34 -- # echo 'Perform nvmf subsystem discovery via RPC' 00:09:44.281 Perform nvmf subsystem discovery via RPC 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@35 -- # rpc_cmd nvmf_get_subsystems 00:09:44.282 15:14:45 
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 [ 00:09:44.282 { 00:09:44.282 "nqn": "nqn.2014-08.org.nvmexpress.discovery", 00:09:44.282 "subtype": "Discovery", 00:09:44.282 "listen_addresses": [ 00:09:44.282 { 00:09:44.282 "trtype": "RDMA", 00:09:44.282 "adrfam": "IPv4", 00:09:44.282 "traddr": "10.0.0.2", 00:09:44.282 "trsvcid": "4420" 00:09:44.282 } 00:09:44.282 ], 00:09:44.282 "allow_any_host": true, 00:09:44.282 "hosts": [] 00:09:44.282 }, 00:09:44.282 { 00:09:44.282 "nqn": "nqn.2016-06.io.spdk:cnode1", 00:09:44.282 "subtype": "NVMe", 00:09:44.282 "listen_addresses": [ 00:09:44.282 { 00:09:44.282 "trtype": "RDMA", 00:09:44.282 "adrfam": "IPv4", 00:09:44.282 "traddr": "10.0.0.2", 00:09:44.282 "trsvcid": "4420" 00:09:44.282 } 00:09:44.282 ], 00:09:44.282 "allow_any_host": true, 00:09:44.282 "hosts": [], 00:09:44.282 "serial_number": "SPDK00000000000001", 00:09:44.282 "model_number": "SPDK bdev Controller", 00:09:44.282 "max_namespaces": 32, 00:09:44.282 "min_cntlid": 1, 00:09:44.282 "max_cntlid": 65519, 00:09:44.282 "namespaces": [ 00:09:44.282 { 00:09:44.282 "nsid": 1, 00:09:44.282 "bdev_name": "Null1", 00:09:44.282 "name": "Null1", 00:09:44.282 "nguid": "DA009798EC094BC2AF7DF9F7064F1D94", 00:09:44.282 "uuid": "da009798-ec09-4bc2-af7d-f9f7064f1d94" 00:09:44.282 } 00:09:44.282 ] 00:09:44.282 }, 00:09:44.282 { 00:09:44.282 "nqn": "nqn.2016-06.io.spdk:cnode2", 00:09:44.282 "subtype": "NVMe", 00:09:44.282 "listen_addresses": [ 00:09:44.282 { 00:09:44.282 "trtype": "RDMA", 00:09:44.282 "adrfam": "IPv4", 00:09:44.282 "traddr": "10.0.0.2", 00:09:44.282 "trsvcid": "4420" 00:09:44.282 } 00:09:44.282 ], 00:09:44.282 "allow_any_host": true, 00:09:44.282 "hosts": [], 00:09:44.282 "serial_number": "SPDK00000000000002", 00:09:44.282 "model_number": "SPDK bdev Controller", 00:09:44.282 "max_namespaces": 32, 00:09:44.282 "min_cntlid": 1, 00:09:44.282 "max_cntlid": 65519, 00:09:44.282 "namespaces": [ 00:09:44.282 { 00:09:44.282 "nsid": 1, 00:09:44.282 "bdev_name": "Null2", 00:09:44.282 "name": "Null2", 00:09:44.282 "nguid": "0AB7A0E5B7034E88A8FFA8F685924D78", 00:09:44.282 "uuid": "0ab7a0e5-b703-4e88-a8ff-a8f685924d78" 00:09:44.282 } 00:09:44.282 ] 00:09:44.282 }, 00:09:44.282 { 00:09:44.282 "nqn": "nqn.2016-06.io.spdk:cnode3", 00:09:44.282 "subtype": "NVMe", 00:09:44.282 "listen_addresses": [ 00:09:44.282 { 00:09:44.282 "trtype": "RDMA", 00:09:44.282 "adrfam": "IPv4", 00:09:44.282 "traddr": "10.0.0.2", 00:09:44.282 "trsvcid": "4420" 00:09:44.282 } 00:09:44.282 ], 00:09:44.282 "allow_any_host": true, 00:09:44.282 "hosts": [], 00:09:44.282 "serial_number": "SPDK00000000000003", 00:09:44.282 "model_number": "SPDK bdev Controller", 00:09:44.282 "max_namespaces": 32, 00:09:44.282 "min_cntlid": 1, 00:09:44.282 "max_cntlid": 65519, 00:09:44.282 "namespaces": [ 00:09:44.282 { 00:09:44.282 "nsid": 1, 00:09:44.282 "bdev_name": "Null3", 00:09:44.282 "name": "Null3", 00:09:44.282 "nguid": "479A89316FE0487D8090CD0FAA3D0607", 00:09:44.282 "uuid": "479a8931-6fe0-487d-8090-cd0faa3d0607" 00:09:44.282 } 00:09:44.282 ] 00:09:44.282 }, 00:09:44.282 { 00:09:44.282 "nqn": "nqn.2016-06.io.spdk:cnode4", 00:09:44.282 "subtype": "NVMe", 00:09:44.282 "listen_addresses": [ 00:09:44.282 { 00:09:44.282 "trtype": "RDMA", 00:09:44.282 "adrfam": "IPv4", 00:09:44.282 "traddr": "10.0.0.2", 00:09:44.282 "trsvcid": "4420" 00:09:44.282 } 00:09:44.282 ], 
00:09:44.282 "allow_any_host": true, 00:09:44.282 "hosts": [], 00:09:44.282 "serial_number": "SPDK00000000000004", 00:09:44.282 "model_number": "SPDK bdev Controller", 00:09:44.282 "max_namespaces": 32, 00:09:44.282 "min_cntlid": 1, 00:09:44.282 "max_cntlid": 65519, 00:09:44.282 "namespaces": [ 00:09:44.282 { 00:09:44.282 "nsid": 1, 00:09:44.282 "bdev_name": "Null4", 00:09:44.282 "name": "Null4", 00:09:44.282 "nguid": "C62C9E55BB3F4164B3D97A683AF25F83", 00:09:44.282 "uuid": "c62c9e55-bb3f-4164-b3d9-7a683af25f83" 00:09:44.282 } 00:09:44.282 ] 00:09:44.282 } 00:09:44.282 ] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@37 -- # seq 1 4 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@37 -- # for i in $(seq 1 4) 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@38 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@39 -- # rpc_cmd bdev_null_delete Null1 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@37 -- # for i in $(seq 1 4) 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@38 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode2 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@39 -- # rpc_cmd bdev_null_delete Null2 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@37 -- # for i in $(seq 1 4) 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@38 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode3 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 
nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@39 -- # rpc_cmd bdev_null_delete Null3 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@37 -- # for i in $(seq 1 4) 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@38 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode4 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:45 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@39 -- # rpc_cmd bdev_null_delete Null4 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@42 -- # rpc_cmd nvmf_discovery_remove_referral -t rdma -a 10.0.0.2 -s 4430 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@44 -- # rpc_cmd bdev_get_bdevs 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@44 -- # jq -r '.[].name' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@44 -- # check_bdevs= 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@45 -- # '[' -n '' ']' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@50 -- # trap - SIGINT SIGTERM EXIT 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- target/discovery.sh@52 -- # nvmftestfini 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery 
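The teardown mirrors that setup. Condensed into the same hypothetical manual rpc.py form, the sequence logged above is:

for i in 1 2 3 4; do
  $rpc nvmf_delete_subsystem "nqn.2016-06.io.spdk:cnode$i"   # removes the subsystem, its namespace, and its listener
  $rpc bdev_null_delete "Null$i"                             # then drop the backing null bdev
done
$rpc nvmf_discovery_remove_referral -t rdma -a 10.0.0.2 -s 4430
$rpc bdev_get_bdevs | jq -r '.[].name'                       # expected to print nothing once cleanup succeeded

The empty check_bdevs value in the log confirms that no bdevs were left behind before nvmftestfini unloads the transport modules.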
-- nvmf/common.sh@331 -- # nvmfcleanup 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@99 -- # sync 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@102 -- # set +e 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@103 -- # for i in {1..20} 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:09:44.282 rmmod nvme_rdma 00:09:44.282 rmmod nvme_fabrics 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@106 -- # set -e 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@107 -- # return 0 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@332 -- # '[' -n 1726988 ']' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@333 -- # killprocess 1726988 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@950 -- # '[' -z 1726988 ']' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@954 -- # kill -0 1726988 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@955 -- # uname 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:44.282 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1726988 00:09:44.541 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:44.541 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:44.541 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1726988' 00:09:44.541 killing process with pid 1726988 00:09:44.541 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@969 -- # kill 1726988 00:09:44.541 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@974 -- # wait 1726988 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@338 -- # nvmf_fini 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@264 -- # local dev 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@267 -- # remove_target_ns 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- 
common/autotest_common.sh@22 -- # _remove_target_ns 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@268 -- # delete_main_bridge 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@130 -- # return 0 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@41 -- # _dev=0 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@41 -- # dev_map=() 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/setup.sh@284 -- # iptr 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@538 -- # iptables-save 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- nvmf/common.sh@538 -- # iptables-restore 00:09:44.801 00:09:44.801 real 0m9.299s 00:09:44.801 user 0m9.058s 00:09:44.801 sys 0m5.959s 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_target_discovery -- common/autotest_common.sh@10 -- # set +x 00:09:44.801 ************************************ 00:09:44.801 END 
TEST nvmf_target_discovery 00:09:44.801 ************************************ 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@19 -- # run_test nvmf_referrals /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/referrals.sh --transport=rdma 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:09:44.801 ************************************ 00:09:44.801 START TEST nvmf_referrals 00:09:44.801 ************************************ 00:09:44.801 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/referrals.sh --transport=rdma 00:09:45.062 * Looking for test storage... 00:09:45.062 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1681 -- # lcov --version 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@336 -- # IFS=.-: 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@336 -- # read -ra ver1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@337 -- # IFS=.-: 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@337 -- # read -ra ver2 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@338 -- # local 'op=<' 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@340 -- # ver1_l=2 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@341 -- # ver2_l=1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@344 -- # case "$op" in 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@345 -- # : 1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@365 -- # decimal 1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@353 -- # local d=1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@355 -- # echo 1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@365 -- # ver1[v]=1 00:09:45.062 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@366 -- # decimal 2 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@353 -- # local d=2 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@355 -- # echo 2 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@366 -- # ver2[v]=2 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@368 -- # return 0 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:09:45.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:45.063 --rc genhtml_branch_coverage=1 00:09:45.063 --rc genhtml_function_coverage=1 00:09:45.063 --rc genhtml_legend=1 00:09:45.063 --rc geninfo_all_blocks=1 00:09:45.063 --rc geninfo_unexecuted_blocks=1 00:09:45.063 00:09:45.063 ' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:09:45.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:45.063 --rc genhtml_branch_coverage=1 00:09:45.063 --rc genhtml_function_coverage=1 00:09:45.063 --rc genhtml_legend=1 00:09:45.063 --rc geninfo_all_blocks=1 00:09:45.063 --rc geninfo_unexecuted_blocks=1 00:09:45.063 00:09:45.063 ' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:09:45.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:45.063 --rc genhtml_branch_coverage=1 00:09:45.063 --rc genhtml_function_coverage=1 00:09:45.063 --rc genhtml_legend=1 00:09:45.063 --rc geninfo_all_blocks=1 00:09:45.063 --rc geninfo_unexecuted_blocks=1 00:09:45.063 00:09:45.063 ' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:09:45.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:45.063 --rc genhtml_branch_coverage=1 00:09:45.063 --rc genhtml_function_coverage=1 00:09:45.063 --rc genhtml_legend=1 00:09:45.063 --rc geninfo_all_blocks=1 00:09:45.063 --rc geninfo_unexecuted_blocks=1 00:09:45.063 00:09:45.063 ' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
nvmf/common.sh@7 -- # uname -s 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@15 -- # shopt -s extglob 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- paths/export.sh@5 -- # export PATH 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@50 -- # : 0 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:09:45.063 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:09:45.063 15:14:46 
nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@54 -- # have_pci_nics=0 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@11 -- # NVMF_REFERRAL_IP_1=127.0.0.2 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@12 -- # NVMF_REFERRAL_IP_2=127.0.0.3 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@13 -- # NVMF_REFERRAL_IP_3=127.0.0.4 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@14 -- # NVMF_PORT_REFERRAL=4430 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@15 -- # DISCOVERY_NQN=nqn.2014-08.org.nvmexpress.discovery 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@16 -- # NQN=nqn.2016-06.io.spdk:cnode1 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@37 -- # nvmftestinit 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@292 -- # prepare_net_devs 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@254 -- # local -g is_hw=no 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@256 -- # remove_target_ns 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@125 -- # xtrace_disable 00:09:45.063 15:14:46 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@131 -- # pci_devs=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@131 -- # local -a pci_devs 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@132 -- # pci_net_devs=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@133 -- # pci_drivers=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@133 -- # local -A pci_drivers 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@135 -- # net_devs=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
nvmf/common.sh@135 -- # local -ga net_devs 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@136 -- # e810=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@136 -- # local -ga e810 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@137 -- # x722=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@137 -- # local -ga x722 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@138 -- # mlx=() 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@138 -- # local -ga mlx 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:09:51.634 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:09:51.634 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals 
-- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:09:51.895 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:09:51.895 Found net devices under 0000:18:00.0: mlx_0_0 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:09:51.895 Found net devices under 0000:18:00.1: mlx_0_1 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:09:51.895 15:14:53 
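The two "Found net devices under ..." lines come from common.sh resolving each Mellanox PCI function to its kernel netdev through sysfs. A minimal sketch of that lookup, with the loop values taken from the log and the loop variable purely illustrative:

for pci in 0000:18:00.0 0000:18:00.1; do
  pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*)      # e.g. /sys/bus/pci/devices/0000:18:00.0/net/mlx_0_0
  pci_net_devs=("${pci_net_devs[@]##*/}")               # keep only the interface name
  echo "Found net devices under $pci: ${pci_net_devs[*]}"
done

Both mlx5 ports resolve, so net_devs ends up holding mlx_0_0 and mlx_0_1 for the RDMA setup that follows.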
nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@249 -- # get_rdma_if_list 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@75 -- # rdma_devs=() 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@89 -- # continue 2 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@89 -- # continue 2 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@258 -- # is_hw=yes 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
nvmf/common.sh@61 -- # uname 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@65 -- # modprobe ib_cm 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@66 -- # modprobe ib_core 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@67 -- # modprobe ib_umad 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@69 -- # modprobe iw_cm 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@27 -- # local -gA dev_map 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@28 -- # local -g _dev 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@44 -- # ips=() 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:09:51.895 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@58 -- # key_initiator=target1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@11 -- # local val=167772161 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:09:51.896 10.0.0.1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@11 -- # local val=167772162 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:09:51.896 10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:09:51.896 15:14:53 
nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@38 -- # ping_ips 1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@107 -- # local dev=target0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@175 -- 
# echo 10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:09:51.896 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:09:51.896 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.037 ms 00:09:51.896 00:09:51.896 --- 10.0.0.2 ping statistics --- 00:09:51.896 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:09:51.896 rtt min/avg/max/mdev = 0.037/0.037/0.037/0.000 ms 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@107 -- # local dev=target0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:09:51.896 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:09:51.896 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.024 ms 00:09:51.896 00:09:51.896 --- 10.0.0.2 ping statistics --- 00:09:51.896 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:09:51.896 rtt min/avg/max/mdev = 0.024/0.024/0.024/0.000 ms 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@98 -- # (( pair++ )) 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@266 -- # return 0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:09:51.896 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@107 -- # local dev=target0 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 
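For reference, the repeated get_ip_address/get_net_dev calls traced above all reduce to reading the provisioned address back out of the interface's ifalias. A minimal standalone sketch of that lookup, assuming the ifalias convention shown in this trace (the helper name below is hypothetical, not a function from setup.sh):

    get_ifalias_ip() {
        local dev=$1 ip
        # setup.sh records each interface's provisioned address in its ifalias
        ip=$(cat "/sys/class/net/${dev}/ifalias" 2>/dev/null)
        [[ -n $ip ]] && echo "$ip"
    }
    # On this run: get_ifalias_ip mlx_0_1 -> 10.0.0.2, get_ifalias_ip mlx_0_0 -> 10.0.0.1,
    # the values exported further down as NVMF_FIRST_TARGET_IP and NVMF_SECOND_TARGET_IP.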
00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # get_net_dev target1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@107 -- # local dev=target1 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:09:51.897 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # get_net_dev target0 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@107 -- # local dev=target0 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # get_net_dev target1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@107 -- # local dev=target1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@38 -- # nvmfappstart -m 0xF 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@724 -- # xtrace_disable 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@324 -- # nvmfpid=1730237 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@325 -- # waitforlisten 1730237 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@831 -- # '[' -z 1730237 ']' 
00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:52.156 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:52.157 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:52.157 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:52.157 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:52.157 15:14:53 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:52.157 [2024-09-27 15:14:53.877779] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:09:52.157 [2024-09-27 15:14:53.877841] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:52.157 [2024-09-27 15:14:53.966090] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:52.416 [2024-09-27 15:14:54.055399] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:09:52.416 [2024-09-27 15:14:54.055446] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:09:52.416 [2024-09-27 15:14:54.055456] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:09:52.416 [2024-09-27 15:14:54.055465] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:09:52.416 [2024-09-27 15:14:54.055472] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
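The nvmfappstart/waitforlisten sequence above amounts to launching nvmf_tgt with the mask from autorun-spdk.conf and blocking until its RPC socket answers. A rough equivalent outside the test harness, assuming a source-tree layout like the one in this workspace (the readiness loop is a simplification, not waitforlisten's actual implementation):

    ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
    nvmfpid=$!
    # crude stand-in for waitforlisten: poll until the RPC socket accepts a call
    until ./scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null; do
        sleep 0.5
    done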
00:09:52.416 [2024-09-27 15:14:54.055538] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:09:52.416 [2024-09-27 15:14:54.055641] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:09:52.416 [2024-09-27 15:14:54.055723] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:09:52.416 [2024-09-27 15:14:54.055724] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@864 -- # return 0 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@40 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:52.984 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:52.984 [2024-09-27 15:14:54.798973] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x5e34a0/0x5e7990) succeed. 00:09:52.984 [2024-09-27 15:14:54.809438] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x5e4ae0/0x629030) succeed. 
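The referral checks that follow drive the discovery RPCs and then verify the result both over RPC and via nvme discover. Stripped of the rpc_cmd/get_referral_ips wrappers, the same flow could be reproduced directly; the scripts/rpc.py path is an assumption, while the RPC names, transport options, and jq filter mirror the invocations captured in this log:

    ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
    ./scripts/rpc.py nvmf_subsystem_add_listener -t rdma -a 10.0.0.2 -s 8009 discovery
    for a in 127.0.0.2 127.0.0.3 127.0.0.4; do
        ./scripts/rpc.py nvmf_discovery_add_referral -t rdma -a "$a" -s 4430
    done
    ./scripts/rpc.py nvmf_discovery_get_referrals | jq length          # expect 3
    # cross-check through the discovery service itself
    nvme discover -t rdma -a 10.0.0.2 -s 8009 -o json \
        | jq -r '.records[] | select(.subtype != "current discovery subsystem").traddr'
    for a in 127.0.0.2 127.0.0.3 127.0.0.4; do
        ./scripts/rpc.py nvmf_discovery_remove_referral -t rdma -a "$a" -s 4430
    done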
00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@41 -- # rpc_cmd nvmf_subsystem_add_listener -t rdma -a 10.0.0.2 -s 8009 discovery 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.243 [2024-09-27 15:14:54.938445] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 8009 *** 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@44 -- # rpc_cmd nvmf_discovery_add_referral -t rdma -a 127.0.0.2 -s 4430 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@45 -- # rpc_cmd nvmf_discovery_add_referral -t rdma -a 127.0.0.3 -s 4430 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@46 -- # rpc_cmd nvmf_discovery_add_referral -t rdma -a 127.0.0.4 -s 4430 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@48 -- # rpc_cmd nvmf_discovery_get_referrals 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@48 -- # jq length 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.243 15:14:54 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@48 -- # (( 3 == 3 )) 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@49 -- # get_referral_ips rpc 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ rpc == \r\p\c ]] 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # rpc_cmd nvmf_discovery_get_referrals 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # jq -r '.[].address.traddr' 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # sort 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # echo 127.0.0.2 127.0.0.3 127.0.0.4 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@49 -- # [[ 127.0.0.2 127.0.0.3 127.0.0.4 == \1\2\7\.\0\.\0\.\2\ \1\2\7\.\0\.\0\.\3\ \1\2\7\.\0\.\0\.\4 ]] 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@50 -- # get_referral_ips nvme 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ nvme == \r\p\c ]] 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@22 -- # [[ nvme == \n\v\m\e ]] 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # jq -r '.records[] | select(.subtype != "current discovery subsystem").traddr' 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:53.243 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # sort 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # echo 127.0.0.2 127.0.0.3 127.0.0.4 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@50 -- # [[ 127.0.0.2 127.0.0.3 127.0.0.4 == \1\2\7\.\0\.\0\.\2\ \1\2\7\.\0\.\0\.\3\ \1\2\7\.\0\.\0\.\4 ]] 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@52 -- # rpc_cmd nvmf_discovery_remove_referral -t rdma -a 127.0.0.2 -s 4430 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@53 -- # rpc_cmd nvmf_discovery_remove_referral -t rdma -a 127.0.0.3 -s 4430 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@54 -- # rpc_cmd nvmf_discovery_remove_referral -t rdma -a 127.0.0.4 -s 4430 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@56 -- # rpc_cmd 
nvmf_discovery_get_referrals 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@56 -- # jq length 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@56 -- # (( 0 == 0 )) 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@57 -- # get_referral_ips nvme 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ nvme == \r\p\c ]] 00:09:53.502 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@22 -- # [[ nvme == \n\v\m\e ]] 00:09:53.503 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:53.503 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # jq -r '.records[] | select(.subtype != "current discovery subsystem").traddr' 00:09:53.503 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # sort 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # echo 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@57 -- # [[ '' == '' ]] 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@60 -- # rpc_cmd nvmf_discovery_add_referral -t rdma -a 127.0.0.2 -s 4430 -n discovery 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@62 -- # rpc_cmd nvmf_discovery_add_referral -t rdma -a 127.0.0.2 -s 4430 -n nqn.2016-06.io.spdk:cnode1 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.760 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@65 -- # get_referral_ips rpc 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ rpc == \r\p\c ]] 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # rpc_cmd nvmf_discovery_get_referrals 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # jq -r '.[].address.traddr' 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
target/referrals.sh@21 -- # sort 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # echo 127.0.0.2 127.0.0.2 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@65 -- # [[ 127.0.0.2 127.0.0.2 == \1\2\7\.\0\.\0\.\2\ \1\2\7\.\0\.\0\.\2 ]] 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@66 -- # get_referral_ips nvme 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ nvme == \r\p\c ]] 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@22 -- # [[ nvme == \n\v\m\e ]] 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # jq -r '.records[] | select(.subtype != "current discovery subsystem").traddr' 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # sort 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # echo 127.0.0.2 127.0.0.2 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@66 -- # [[ 127.0.0.2 127.0.0.2 == \1\2\7\.\0\.\0\.\2\ \1\2\7\.\0\.\0\.\2 ]] 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@67 -- # get_discovery_entries 'nvme subsystem' 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@67 -- # jq -r .subnqn 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@31 -- # local 'subtype=nvme subsystem' 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@33 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:53.761 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@34 -- # jq '.records[] | select(.subtype == "nvme subsystem")' 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@67 -- # [[ nqn.2016-06.io.spdk:cnode1 == \n\q\n\.\2\0\1\6\-\0\6\.\i\o\.\s\p\d\k\:\c\n\o\d\e\1 ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@68 -- # get_discovery_entries 'discovery subsystem referral' 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@68 -- # jq -r .subnqn 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@31 -- # local 'subtype=discovery subsystem referral' 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@33 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@34 -- # jq '.records[] | select(.subtype == "discovery subsystem referral")' 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@68 -- # [[ 
nqn.2014-08.org.nvmexpress.discovery == \n\q\n\.\2\0\1\4\-\0\8\.\o\r\g\.\n\v\m\e\x\p\r\e\s\s\.\d\i\s\c\o\v\e\r\y ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@71 -- # rpc_cmd nvmf_discovery_remove_referral -t rdma -a 127.0.0.2 -s 4430 -n nqn.2016-06.io.spdk:cnode1 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@73 -- # get_referral_ips rpc 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ rpc == \r\p\c ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # rpc_cmd nvmf_discovery_get_referrals 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # jq -r '.[].address.traddr' 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # sort 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@21 -- # echo 127.0.0.2 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@73 -- # [[ 127.0.0.2 == \1\2\7\.\0\.\0\.\2 ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@74 -- # get_referral_ips nvme 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ nvme == \r\p\c ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@22 -- # [[ nvme == \n\v\m\e ]] 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # jq -r '.records[] | select(.subtype != "current discovery subsystem").traddr' 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:54.020 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # sort 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # echo 127.0.0.2 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@74 -- # [[ 127.0.0.2 == \1\2\7\.\0\.\0\.\2 ]] 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@75 -- # get_discovery_entries 'nvme subsystem' 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@75 -- # jq -r .subnqn 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@31 -- # local 'subtype=nvme subsystem' 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@33 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 
--hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:54.279 15:14:55 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@34 -- # jq '.records[] | select(.subtype == "nvme subsystem")' 00:09:54.279 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@75 -- # [[ '' == '' ]] 00:09:54.279 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@76 -- # jq -r .subnqn 00:09:54.279 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@76 -- # get_discovery_entries 'discovery subsystem referral' 00:09:54.279 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@31 -- # local 'subtype=discovery subsystem referral' 00:09:54.279 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@34 -- # jq '.records[] | select(.subtype == "discovery subsystem referral")' 00:09:54.279 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@33 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@76 -- # [[ nqn.2014-08.org.nvmexpress.discovery == \n\q\n\.\2\0\1\4\-\0\8\.\o\r\g\.\n\v\m\e\x\p\r\e\s\s\.\d\i\s\c\o\v\e\r\y ]] 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@79 -- # rpc_cmd nvmf_discovery_remove_referral -t rdma -a 127.0.0.2 -s 4430 -n nqn.2014-08.org.nvmexpress.discovery 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@82 -- # rpc_cmd nvmf_discovery_get_referrals 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@82 -- # jq length 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 -- # set +x 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@82 -- # (( 0 == 0 )) 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@83 -- # get_referral_ips nvme 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@19 -- # [[ nvme == \r\p\c ]] 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@22 -- # [[ nvme == \n\v\m\e ]] 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 8009 -o json 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # jq -r '.records[] | select(.subtype != "current discovery subsystem").traddr' 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # sort 00:09:54.538 15:14:56 
nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@26 -- # echo 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@83 -- # [[ '' == '' ]] 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@85 -- # trap - SIGINT SIGTERM EXIT 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- target/referrals.sh@86 -- # nvmftestfini 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@331 -- # nvmfcleanup 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@99 -- # sync 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@102 -- # set +e 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@103 -- # for i in {1..20} 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:09:54.538 rmmod nvme_rdma 00:09:54.538 rmmod nvme_fabrics 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@106 -- # set -e 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@107 -- # return 0 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@332 -- # '[' -n 1730237 ']' 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@333 -- # killprocess 1730237 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@950 -- # '[' -z 1730237 ']' 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@954 -- # kill -0 1730237 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@955 -- # uname 00:09:54.538 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:54.539 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1730237 00:09:54.798 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:54.798 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:54.798 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1730237' 00:09:54.798 killing process with pid 1730237 00:09:54.798 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@969 -- # kill 1730237 00:09:54.798 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@974 -- # wait 1730237 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@338 -- # nvmf_fini 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@264 -- # local dev 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@267 -- # remove_target_ns 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- 
nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@268 -- # delete_main_bridge 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@130 -- # return 0 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@41 -- # _dev=0 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@41 -- # dev_map=() 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/setup.sh@284 -- # iptr 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@538 -- # iptables-save 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- nvmf/common.sh@538 -- # iptables-restore 00:09:55.057 00:09:55.057 real 0m10.159s 00:09:55.057 user 0m13.352s 00:09:55.057 sys 0m6.232s 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_referrals -- common/autotest_common.sh@10 
-- # set +x 00:09:55.057 ************************************ 00:09:55.057 END TEST nvmf_referrals 00:09:55.057 ************************************ 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@20 -- # run_test nvmf_connect_disconnect /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/connect_disconnect.sh --transport=rdma 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:09:55.057 ************************************ 00:09:55.057 START TEST nvmf_connect_disconnect 00:09:55.057 ************************************ 00:09:55.057 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/connect_disconnect.sh --transport=rdma 00:09:55.317 * Looking for test storage... 00:09:55.317 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:09:55.317 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:09:55.317 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1681 -- # lcov --version 00:09:55.317 15:14:56 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@336 -- # IFS=.-: 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@336 -- # read -ra ver1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@337 -- # IFS=.-: 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@337 -- # read -ra ver2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@338 -- # local 'op=<' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@340 -- # ver1_l=2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@341 -- # ver2_l=1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@344 -- # case "$op" in 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@345 -- # : 1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@365 -- # decimal 1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@353 -- # local d=1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@355 -- # echo 1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@365 -- # ver1[v]=1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@366 -- # decimal 2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@353 -- # local d=2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@355 -- # echo 2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@366 -- # ver2[v]=2 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@368 -- # return 0 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:09:55.317 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:55.317 --rc genhtml_branch_coverage=1 00:09:55.317 --rc genhtml_function_coverage=1 00:09:55.317 --rc genhtml_legend=1 00:09:55.317 --rc geninfo_all_blocks=1 00:09:55.317 --rc geninfo_unexecuted_blocks=1 00:09:55.317 00:09:55.317 ' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:09:55.317 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:55.317 --rc genhtml_branch_coverage=1 00:09:55.317 --rc genhtml_function_coverage=1 00:09:55.317 --rc genhtml_legend=1 00:09:55.317 --rc geninfo_all_blocks=1 00:09:55.317 --rc geninfo_unexecuted_blocks=1 00:09:55.317 00:09:55.317 ' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:09:55.317 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:55.317 --rc genhtml_branch_coverage=1 00:09:55.317 --rc genhtml_function_coverage=1 00:09:55.317 --rc genhtml_legend=1 00:09:55.317 --rc geninfo_all_blocks=1 00:09:55.317 --rc geninfo_unexecuted_blocks=1 00:09:55.317 00:09:55.317 ' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:09:55.317 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:55.317 --rc genhtml_branch_coverage=1 00:09:55.317 --rc genhtml_function_coverage=1 00:09:55.317 --rc genhtml_legend=1 00:09:55.317 --rc geninfo_all_blocks=1 00:09:55.317 --rc geninfo_unexecuted_blocks=1 00:09:55.317 00:09:55.317 ' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- 
target/connect_disconnect.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@7 -- # uname -s 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@15 -- # shopt -s extglob 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- paths/export.sh@3 
-- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- paths/export.sh@5 -- # export PATH 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:09:55.317 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@50 -- # : 0 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:09:55.318 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@54 -- # have_pci_nics=0 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@11 -- # MALLOC_BDEV_SIZE=64 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@15 -- # nvmftestinit 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@292 -- # prepare_net_devs 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@254 -- # local -g is_hw=no 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@256 -- # remove_target_ns 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@22 -- # _remove_target_ns 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@125 -- # xtrace_disable 00:09:55.318 15:14:57 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:01.889 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:10:01.889 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@131 -- # pci_devs=() 00:10:01.889 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@131 -- # local -a pci_devs 00:10:01.889 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@132 -- # pci_net_devs=() 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@133 -- # pci_drivers=() 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@133 -- # local -A pci_drivers 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@135 -- # net_devs=() 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@135 -- # local -ga net_devs 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@136 
-- # e810=() 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@136 -- # local -ga e810 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@137 -- # x722=() 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@137 -- # local -ga x722 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@138 -- # mlx=() 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@138 -- # local -ga mlx 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:10:02.150 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- 
nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:10:02.150 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:10:02.150 Found net devices under 0000:18:00.0: mlx_0_0 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 
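The trace above walks the mlx5 PCI functions cached earlier and resolves each one to its kernel net device by globbing sysfs, which is how the "Found net devices under 0000:18:00.x" lines are produced. A minimal stand-alone sketch of that lookup, assuming the two mlx5 ports (0x15b3:0x1015) at the PCI addresses this log reports:

  # Resolve each mlx5 PCI function to the netdev bound to it via sysfs.
  # The PCI addresses are the ones reported in this log; adjust for other hosts.
  for pci in 0000:18:00.0 0000:18:00.1; do
      for net in "/sys/bus/pci/devices/$pci/net/"*; do
          [ -e "$net" ] || continue        # no netdev bound to this function
          echo "Found net devices under $pci: ${net##*/}"
      done
  done
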
00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:10:02.150 Found net devices under 0000:18:00.1: mlx_0_1 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@249 -- # get_rdma_if_list 00:10:02.150 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@75 -- # rdma_devs=() 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@89 -- # continue 2 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@89 -- # continue 2 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@258 -- # is_hw=yes 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:10:02.151 15:15:03 
nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@61 -- # uname 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@65 -- # modprobe ib_cm 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@66 -- # modprobe ib_core 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@67 -- # modprobe ib_umad 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@69 -- # modprobe iw_cm 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@27 -- # local -gA dev_map 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@28 -- # local -g _dev 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@44 -- # ips=() 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:10:02.151 15:15:03 
nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@58 -- # key_initiator=target1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@11 -- # local val=167772161 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:10:02.151 10.0.0.1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@11 -- # local val=167772162 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect 
-- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:10:02.151 10.0.0.2 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:10:02.151 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@38 -- # ping_ips 1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:02.152 15:15:03 
nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@107 -- # local dev=target0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:10:02.152 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:10:02.152 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.046 ms 00:10:02.152 00:10:02.152 --- 10.0.0.2 ping statistics --- 00:10:02.152 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:10:02.152 rtt min/avg/max/mdev = 0.046/0.046/0.046/0.000 ms 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@107 -- # local dev=target0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:10:02.152 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:10:02.152 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:10:02.152 00:10:02.152 --- 10.0.0.2 ping statistics --- 00:10:02.152 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:10:02.152 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@98 -- # (( pair++ )) 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@266 -- # return 0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@107 -- # local dev=target0 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@192 -- # 
get_rdma_target_ip_address 1 00:10:02.152 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # get_net_dev target1 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@107 -- # local dev=target1 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:10:02.153 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:10:02.412 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:10:02.412 15:15:03 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@107 -- # local dev=target0 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # cat 
/sys/class/net/mlx_0_1/ifalias 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # get_net_dev target1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@107 -- # local dev=target1 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:10:02.412 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@16 -- # nvmfappstart -m 0xF 00:10:02.413 15:15:04 
nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@724 -- # xtrace_disable 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@324 -- # nvmfpid=1733802 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@325 -- # waitforlisten 1733802 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@831 -- # '[' -z 1733802 ']' 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:02.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:02.413 15:15:04 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:02.413 [2024-09-27 15:15:04.130804] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:10:02.413 [2024-09-27 15:15:04.130866] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:02.413 [2024-09-27 15:15:04.216435] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:02.671 [2024-09-27 15:15:04.313873] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:10:02.671 [2024-09-27 15:15:04.313915] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:10:02.671 [2024-09-27 15:15:04.313925] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:10:02.671 [2024-09-27 15:15:04.313934] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:10:02.671 [2024-09-27 15:15:04.313942] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
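At this point the script has launched the target application shown in the trace (build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF) and is waiting for its RPC socket; the four "Reactor started on core N" notices that follow come from the 0xF core mask. An approximate equivalent of that start-and-wait step (the real waitforlisten helper is more involved), assuming the default /var/tmp/spdk.sock RPC socket used throughout this log:

  # Launch the target in the background, then poll until its RPC socket answers.
  ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
  nvmfpid=$!
  until ./scripts/rpc.py -s /var/tmp/spdk.sock spdk_get_version >/dev/null 2>&1; do
      kill -0 "$nvmfpid" 2>/dev/null || { echo "nvmf_tgt exited during startup" >&2; exit 1; }
      sleep 0.5
  done
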
00:10:02.671 [2024-09-27 15:15:04.314000] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:10:02.671 [2024-09-27 15:15:04.314103] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:10:02.671 [2024-09-27 15:15:04.314131] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:02.671 [2024-09-27 15:15:04.314132] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:10:03.239 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:03.239 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@864 -- # return 0 00:10:03.239 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:10:03.240 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@730 -- # xtrace_disable 00:10:03.240 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:03.240 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:10:03.240 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@18 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -c 0 00:10:03.240 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.240 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:03.240 [2024-09-27 15:15:05.058387] rdma.c:2734:nvmf_rdma_create: *WARNING*: In capsule data size is set to 256, this is minimum size required to support msdbd=16 00:10:03.240 [2024-09-27 15:15:05.079513] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0xf7c4a0/0xf80990) succeed. 00:10:03.498 [2024-09-27 15:15:05.089937] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0xf7dae0/0xfc2030) succeed. 
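With the RDMA transport in place (the two "Create IB device ... succeed" notices above), the trace that follows provisions the test subsystem: a 64 MiB malloc bdev with a 512-byte block size, subsystem nqn.2016-06.io.spdk:cnode1, that bdev as its namespace, and an RDMA listener on 10.0.0.2:4420, after which the test runs its five connect/disconnect iterations. The whole target-side setup, from transport creation through the listener, maps onto these rpc.py equivalents of the rpc_cmd calls in the trace; the connect line mirrors the NVME_CONNECT='nvme connect -i 15' override recorded earlier in this log:

  rpc=./scripts/rpc.py
  $rpc nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -c 0
  $rpc bdev_malloc_create 64 512        # returns the bdev name, Malloc0 in this run
  $rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME
  $rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
  $rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
  # One of the five connect/disconnect iterations the log reports:
  nvme connect -i 15 -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420
  nvme disconnect -n nqn.2016-06.io.spdk:cnode1

The test script additionally passes the --hostnqn/--hostid pair stored in NVME_HOST earlier in this log; that detail is omitted here for brevity.
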
00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@20 -- # rpc_cmd bdev_malloc_create 64 512 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@20 -- # bdev=Malloc0 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@22 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@23 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@24 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:03.498 [2024-09-27 15:15:05.232399] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@26 -- # '[' 0 -eq 1 ']' 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@31 -- # num_iterations=5 00:10:03.498 15:15:05 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@34 -- # set +x 00:10:07.691 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:11.888 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:16.082 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:19.377 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:23.705 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@43 -- # trap - SIGINT SIGTERM EXIT 00:10:23.705 15:15:25 
nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- target/connect_disconnect.sh@45 -- # nvmftestfini 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@331 -- # nvmfcleanup 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@99 -- # sync 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@102 -- # set +e 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@103 -- # for i in {1..20} 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:10:23.705 rmmod nvme_rdma 00:10:23.705 rmmod nvme_fabrics 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@106 -- # set -e 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@107 -- # return 0 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@332 -- # '[' -n 1733802 ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@333 -- # killprocess 1733802 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@950 -- # '[' -z 1733802 ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@954 -- # kill -0 1733802 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@955 -- # uname 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1733802 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1733802' 00:10:23.705 killing process with pid 1733802 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@969 -- # kill 1733802 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@974 -- # wait 1733802 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@338 -- # nvmf_fini 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@264 -- # local dev 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@267 -- # remove_target_ns 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:10:23.705 
15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@22 -- # _remove_target_ns 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@268 -- # delete_main_bridge 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@130 -- # return 0 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@41 -- # _dev=0 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@41 -- # dev_map=() 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/setup.sh@284 -- # iptr 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@538 -- # iptables-save 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:10:23.705 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- nvmf/common.sh@538 -- # iptables-restore 00:10:23.964 00:10:23.964 real 0m28.734s 00:10:23.964 user 1m26.420s 00:10:23.964 sys 0m6.515s 00:10:23.964 15:15:25 
nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_disconnect -- common/autotest_common.sh@10 -- # set +x 00:10:23.964 ************************************ 00:10:23.964 END TEST nvmf_connect_disconnect 00:10:23.964 ************************************ 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@21 -- # run_test nvmf_multitarget /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget.sh --transport=rdma 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:10:23.964 ************************************ 00:10:23.964 START TEST nvmf_multitarget 00:10:23.964 ************************************ 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget.sh --transport=rdma 00:10:23.964 * Looking for test storage... 00:10:23.964 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1681 -- # lcov --version 00:10:23.964 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@333 -- # local ver1 ver1_l 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@334 -- # local ver2 ver2_l 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@336 -- # IFS=.-: 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@336 -- # read -ra ver1 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@337 -- # IFS=.-: 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@337 -- # read -ra ver2 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@338 -- # local 'op=<' 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@340 -- # ver1_l=2 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@341 -- # ver2_l=1 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@344 -- # case "$op" in 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@345 -- # : 1 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@364 -- # (( v = 0 )) 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:10:24.227 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@365 -- # decimal 1 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@353 -- # local d=1 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@355 -- # echo 1 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@365 -- # ver1[v]=1 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@366 -- # decimal 2 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@353 -- # local d=2 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@355 -- # echo 2 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@366 -- # ver2[v]=2 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@368 -- # return 0 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:10:24.228 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:24.228 --rc genhtml_branch_coverage=1 00:10:24.228 --rc genhtml_function_coverage=1 00:10:24.228 --rc genhtml_legend=1 00:10:24.228 --rc geninfo_all_blocks=1 00:10:24.228 --rc geninfo_unexecuted_blocks=1 00:10:24.228 00:10:24.228 ' 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:10:24.228 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:24.228 --rc genhtml_branch_coverage=1 00:10:24.228 --rc genhtml_function_coverage=1 00:10:24.228 --rc genhtml_legend=1 00:10:24.228 --rc geninfo_all_blocks=1 00:10:24.228 --rc geninfo_unexecuted_blocks=1 00:10:24.228 00:10:24.228 ' 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:10:24.228 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:24.228 --rc genhtml_branch_coverage=1 00:10:24.228 --rc genhtml_function_coverage=1 00:10:24.228 --rc genhtml_legend=1 00:10:24.228 --rc geninfo_all_blocks=1 00:10:24.228 --rc geninfo_unexecuted_blocks=1 00:10:24.228 00:10:24.228 ' 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:10:24.228 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:24.228 --rc genhtml_branch_coverage=1 00:10:24.228 --rc genhtml_function_coverage=1 00:10:24.228 --rc genhtml_legend=1 00:10:24.228 --rc geninfo_all_blocks=1 00:10:24.228 --rc geninfo_unexecuted_blocks=1 00:10:24.228 00:10:24.228 ' 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:10:24.228 15:15:25 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@7 -- # uname -s 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@15 -- # shopt -s extglob 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- paths/export.sh@5 -- # export PATH 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:10:24.228 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@50 -- # : 0 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:10:24.229 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:10:24.229 15:15:25 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@54 -- # have_pci_nics=0 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@13 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@15 -- # nvmftestinit 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@292 -- # prepare_net_devs 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@254 -- # local -g is_hw=no 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@256 -- # remove_target_ns 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@22 -- # _remove_target_ns 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@125 -- # xtrace_disable 00:10:24.229 15:15:25 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@10 -- # set +x 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@131 -- # pci_devs=() 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@131 -- # local -a pci_devs 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@132 -- # pci_net_devs=() 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@133 -- # pci_drivers=() 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@133 -- # local -A pci_drivers 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@135 -- # net_devs=() 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@135 -- # local -ga net_devs 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@136 -- # e810=() 00:10:30.800 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@136 -- # local -ga e810 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@137 -- # x722=() 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@137 -- # local -ga x722 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget 
-- nvmf/common.sh@138 -- # mlx=() 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@138 -- # local -ga mlx 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:10:30.801 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:10:30.801 15:15:32 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:10:30.801 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:10:30.801 Found net devices under 0000:18:00.0: mlx_0_0 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:10:30.801 Found net devices under 0000:18:00.1: mlx_0_1 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@249 -- # get_rdma_if_list 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- 
nvmf/common.sh@75 -- # rdma_devs=() 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@89 -- # continue 2 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@89 -- # continue 2 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@258 -- # is_hw=yes 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@61 -- # uname 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@65 -- # modprobe ib_cm 00:10:30.801 15:15:32 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@66 -- # modprobe ib_core 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@67 -- # modprobe ib_umad 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@69 -- # modprobe iw_cm 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:10:30.801 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@27 -- # local -gA dev_map 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@28 -- # local -g _dev 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@44 -- # ips=() 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@58 -- # key_initiator=target1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:10:31.063 15:15:32 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@11 -- # local val=167772161 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:10:31.063 10.0.0.1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@11 -- # local val=167772162 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:10:31.063 10.0.0.2 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget 
-- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@38 -- # ping_ips 1 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:31.063 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@107 -- # local dev=target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- 
nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:10:31.064 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:10:31.064 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:10:31.064 00:10:31.064 --- 10.0.0.2 ping statistics --- 00:10:31.064 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:10:31.064 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@107 -- # local dev=target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:10:31.064 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:10:31.064 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:10:31.064 00:10:31.064 --- 10.0.0.2 ping statistics --- 00:10:31.064 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:10:31.064 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@98 -- # (( pair++ )) 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@266 -- # return 0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@107 -- # local dev=target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:10:31.064 15:15:32 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # get_net_dev target1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@107 -- # local dev=target1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@107 -- # local dev=target0 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:31.064 15:15:32 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # get_net_dev target1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@107 -- # local dev=target1 00:10:31.064 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@16 -- # nvmfappstart -m 0xF 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@724 -- # xtrace_disable 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@10 -- # set +x 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@324 -- # nvmfpid=1739962 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 
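[editor's note] At this point nvmfappstart has launched the SPDK target with the flags seen above (-i 0 -e 0xFFFF -m 0xF) and recorded nvmfpid=1739962. A minimal sketch of that step, assuming the stock scripts/rpc.py client and the default /var/tmp/spdk.sock socket; the readiness-polling loop is an illustrative assumption, not the exact helper code:

# Sketch: start nvmf_tgt as in the log and wait until its JSON-RPC socket answers.
# SPDK_BIN and the flags are taken from the log; the polling loop is an assumption.
SPDK_BIN=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt
RPC_PY=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
RPC_SOCK=/var/tmp/spdk.sock
"$SPDK_BIN" -i 0 -e 0xFFFF -m 0xF &
nvmfpid=$!
# Poll a harmless RPC until the target is listening on the UNIX domain socket.
until "$RPC_PY" -s "$RPC_SOCK" rpc_get_methods >/dev/null 2>&1; do
    kill -0 "$nvmfpid" || exit 1   # bail out if the target died during startup
    sleep 0.2
done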
00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@325 -- # waitforlisten 1739962 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@831 -- # '[' -z 1739962 ']' 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:31.065 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:31.065 15:15:32 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@10 -- # set +x 00:10:31.324 [2024-09-27 15:15:32.932386] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:10:31.324 [2024-09-27 15:15:32.932453] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:31.324 [2024-09-27 15:15:33.018903] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:31.324 [2024-09-27 15:15:33.115800] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:10:31.324 [2024-09-27 15:15:33.115841] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:10:31.324 [2024-09-27 15:15:33.115851] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:10:31.324 [2024-09-27 15:15:33.115860] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:10:31.324 [2024-09-27 15:15:33.115868] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
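[editor's note] The entries that follow exercise the multitarget RPCs: counting targets with nvmf_get_targets | jq length, creating nvmf_tgt_1 and nvmf_tgt_2, re-counting, then deleting both. A condensed, hedged sketch of that flow using the same RPC helper and arguments shown in the log (the explicit -eq checks stand in for the test's '!=' comparisons, and '-s 32' is passed exactly as logged):

RPC=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py
# Only the default target should exist before the test starts.
[ "$($RPC nvmf_get_targets | jq length)" -eq 1 ] || exit 1
# Create two additional targets with the '-s 32' argument seen in the log, then re-count.
$RPC nvmf_create_target -n nvmf_tgt_1 -s 32
$RPC nvmf_create_target -n nvmf_tgt_2 -s 32
[ "$($RPC nvmf_get_targets | jq length)" -eq 3 ] || exit 1
# Delete them again and confirm only the default target remains.
$RPC nvmf_delete_target -n nvmf_tgt_1
$RPC nvmf_delete_target -n nvmf_tgt_2
[ "$($RPC nvmf_get_targets | jq length)" -eq 1 ] || exit 1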
00:10:31.324 [2024-09-27 15:15:33.115937] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:10:31.324 [2024-09-27 15:15:33.116040] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:10:31.324 [2024-09-27 15:15:33.116122] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:31.324 [2024-09-27 15:15:33.116123] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@864 -- # return 0 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@730 -- # xtrace_disable 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@10 -- # set +x 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@18 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; nvmftestfini $1; exit 1' SIGINT SIGTERM EXIT 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@21 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_get_targets 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@21 -- # jq length 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@21 -- # '[' 1 '!=' 1 ']' 00:10:32.263 15:15:33 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@25 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_create_target -n nvmf_tgt_1 -s 32 00:10:32.263 "nvmf_tgt_1" 00:10:32.263 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@26 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_create_target -n nvmf_tgt_2 -s 32 00:10:32.523 "nvmf_tgt_2" 00:10:32.523 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_get_targets 00:10:32.523 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@28 -- # jq length 00:10:32.523 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@28 -- # '[' 3 '!=' 3 ']' 00:10:32.523 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@32 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_delete_target -n nvmf_tgt_1 00:10:32.783 true 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_delete_target -n nvmf_tgt_2 00:10:32.783 true 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@35 -- # jq length 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@35 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_get_targets 00:10:32.783 15:15:34 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@35 -- # '[' 1 '!=' 1 ']' 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- target/multitarget.sh@41 -- # nvmftestfini 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@331 -- # nvmfcleanup 00:10:32.783 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@99 -- # sync 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@102 -- # set +e 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@103 -- # for i in {1..20} 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:10:33.043 rmmod nvme_rdma 00:10:33.043 rmmod nvme_fabrics 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@106 -- # set -e 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@107 -- # return 0 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@332 -- # '[' -n 1739962 ']' 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@333 -- # killprocess 1739962 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@950 -- # '[' -z 1739962 ']' 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@954 -- # kill -0 1739962 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@955 -- # uname 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1739962 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1739962' 00:10:33.043 killing process with pid 1739962 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@969 -- # kill 1739962 00:10:33.043 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@974 -- # wait 1739962 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@338 -- # nvmf_fini 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@264 -- # local dev 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@267 -- # remove_target_ns 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@323 -- # 
xtrace_disable_per_cmd _remove_target_ns 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@22 -- # _remove_target_ns 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@268 -- # delete_main_bridge 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@130 -- # return 0 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@41 -- # _dev=0 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@41 -- # dev_map=() 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/setup.sh@284 -- # iptr 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@538 -- # iptables-save 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- nvmf/common.sh@538 -- # iptables-restore 00:10:33.303 00:10:33.303 real 0m9.316s 00:10:33.303 user 0m10.101s 00:10:33.303 sys 0m5.922s 00:10:33.303 15:15:34 nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:33.303 15:15:34 
nvmf_rdma.nvmf_target_extra.nvmf_multitarget -- common/autotest_common.sh@10 -- # set +x 00:10:33.303 ************************************ 00:10:33.303 END TEST nvmf_multitarget 00:10:33.303 ************************************ 00:10:33.303 15:15:35 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@22 -- # run_test nvmf_rpc /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpc.sh --transport=rdma 00:10:33.303 15:15:35 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:33.303 15:15:35 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:33.303 15:15:35 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:10:33.303 ************************************ 00:10:33.303 START TEST nvmf_rpc 00:10:33.303 ************************************ 00:10:33.303 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpc.sh --transport=rdma 00:10:33.303 * Looking for test storage... 00:10:33.564 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@344 -- # case "$op" in 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@345 -- # : 1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@365 -- # decimal 1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@353 -- # local d=1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@355 -- # echo 1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@366 -- # decimal 2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@353 -- # local d=2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@355 -- # echo 2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@368 -- # return 0 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:10:33.564 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:33.564 --rc genhtml_branch_coverage=1 00:10:33.564 --rc genhtml_function_coverage=1 00:10:33.564 --rc genhtml_legend=1 00:10:33.564 --rc geninfo_all_blocks=1 00:10:33.564 --rc geninfo_unexecuted_blocks=1 00:10:33.564 00:10:33.564 ' 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:10:33.564 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:33.564 --rc genhtml_branch_coverage=1 00:10:33.564 --rc genhtml_function_coverage=1 00:10:33.564 --rc genhtml_legend=1 00:10:33.564 --rc geninfo_all_blocks=1 00:10:33.564 --rc geninfo_unexecuted_blocks=1 00:10:33.564 00:10:33.564 ' 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:10:33.564 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:33.564 --rc genhtml_branch_coverage=1 00:10:33.564 --rc genhtml_function_coverage=1 00:10:33.564 --rc genhtml_legend=1 00:10:33.564 --rc geninfo_all_blocks=1 00:10:33.564 --rc geninfo_unexecuted_blocks=1 00:10:33.564 00:10:33.564 ' 00:10:33.564 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:10:33.564 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:10:33.564 --rc genhtml_branch_coverage=1 00:10:33.564 --rc genhtml_function_coverage=1 00:10:33.565 --rc genhtml_legend=1 00:10:33.565 --rc geninfo_all_blocks=1 00:10:33.565 --rc geninfo_unexecuted_blocks=1 00:10:33.565 00:10:33.565 ' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@7 -- # uname -s 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@7 -- # [[ Linux == 
FreeBSD ]] 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@15 -- # shopt -s extglob 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:33.565 15:15:35 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- paths/export.sh@5 -- # export PATH 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@50 -- # : 0 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:10:33.565 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@54 -- # have_pci_nics=0 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@11 -- # loops=5 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@23 -- # nvmftestinit 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:10:33.565 15:15:35 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@292 -- # prepare_net_devs 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@254 -- # local -g is_hw=no 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@256 -- # remove_target_ns 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@22 -- # _remove_target_ns 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@125 -- # xtrace_disable 00:10:33.565 15:15:35 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@131 -- # pci_devs=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@131 -- # local -a pci_devs 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@132 -- # pci_net_devs=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@133 -- # pci_drivers=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@133 -- # local -A pci_drivers 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@135 -- # net_devs=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@135 -- # local -ga net_devs 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@136 -- # e810=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@136 -- # local -ga e810 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@137 -- # x722=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@137 -- # local -ga x722 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@138 -- # mlx=() 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@138 -- # local -ga mlx 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:10:41.701 15:15:42 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:10:41.701 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:10:41.701 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:10:41.702 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:10:41.702 15:15:42 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:10:41.702 Found net devices under 0000:18:00.0: mlx_0_0 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:10:41.702 Found net devices under 0000:18:00.1: mlx_0_1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@249 -- # get_rdma_if_list 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@75 -- # rdma_devs=() 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@89 -- # continue 2 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@87 -- 
# [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@89 -- # continue 2 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@258 -- # is_hw=yes 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@61 -- # uname 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@65 -- # modprobe ib_cm 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@66 -- # modprobe ib_core 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@67 -- # modprobe ib_umad 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@69 -- # modprobe iw_cm 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@27 -- # local -gA dev_map 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@28 -- # local -g _dev 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@44 -- # ips=() 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:10:41.702 15:15:42 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@58 -- # key_initiator=target1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@11 -- # local val=167772161 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:10:41.702 10.0.0.1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@11 -- # local val=167772162 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:10:41.702 15:15:42 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:10:41.702 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:10:41.703 10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@38 -- # ping_ips 1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@107 -- # local 
dev=target0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:10:41.703 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:10:41.703 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.030 ms 00:10:41.703 00:10:41.703 --- 10.0.0.2 ping statistics --- 00:10:41.703 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:10:41.703 rtt min/avg/max/mdev = 0.030/0.030/0.030/0.000 ms 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@107 -- # local dev=target0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:10:41.703 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:10:41.703 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.025 ms 00:10:41.703 00:10:41.703 --- 10.0.0.2 ping statistics --- 00:10:41.703 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:10:41.703 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@98 -- # (( pair++ )) 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@266 -- # return 0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@107 -- # local dev=target0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:41.703 
15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # get_net_dev target1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@107 -- # local dev=target1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:10:41.703 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@107 -- # local dev=target0 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:10:41.704 
15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # get_net_dev target1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@107 -- # local dev=target1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@24 -- # nvmfappstart -m 0xF 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@724 -- # xtrace_disable 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@324 -- # nvmfpid=1743268 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:10:41.704 15:15:42 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@325 -- # waitforlisten 1743268 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@831 -- # '[' -z 1743268 ']' 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:41.704 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:41.704 15:15:42 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.704 [2024-09-27 15:15:42.405029] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:10:41.704 [2024-09-27 15:15:42.405098] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:41.704 [2024-09-27 15:15:42.493822] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:10:41.704 [2024-09-27 15:15:42.583721] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:10:41.704 [2024-09-27 15:15:42.583763] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:10:41.704 [2024-09-27 15:15:42.583772] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:10:41.704 [2024-09-27 15:15:42.583781] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:10:41.704 [2024-09-27 15:15:42.583787] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
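(For reference: the bring-up that the trace above performs through nvmf/setup.sh and nvmf/common.sh (loading the IB/RDMA modules, addressing mlx_0_0/mlx_0_1, starting nvmf_tgt, creating the RDMA transport) can be reproduced by hand roughly as follows. This is a minimal sketch under the assumptions visible in the log: device names mlx_0_0/mlx_0_1, addresses 10.0.0.1/10.0.0.2, and an SPDK checkout that provides build/bin/nvmf_tgt and scripts/rpc.py. It is not part of the test suite.)

    #!/usr/bin/env bash
    # Minimal manual reproduction of the steps traced above; assumes mlx_0_0/mlx_0_1 exist.
    set -e

    # IB/RDMA kernel modules, as loaded by load_ib_rdma_modules
    for m in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm; do
        modprobe "$m"
    done
    modprobe nvme-rdma

    # Address and bring up the two ConnectX ports, as setup_interface_pair does
    ip addr add 10.0.0.1/24 dev mlx_0_0
    ip addr add 10.0.0.2/24 dev mlx_0_1
    ip link set mlx_0_0 up
    ip link set mlx_0_1 up
    ping -c 1 10.0.0.2        # sanity check, mirrors ping_ips 1

    # Start the target with the same flags as nvmfappstart (-i 0 -e 0xFFFF -m 0xF)
    ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
    sleep 3                   # the suite waits on /var/tmp/spdk.sock via waitforlisten instead

    # Create the RDMA transport exactly as target/rpc.sh does below
    ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192

    # One poll group per core, as the jcount check on nvmf_get_stats verifies
    ./scripts/rpc.py nvmf_get_stats | jq '.poll_groups[].name'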
00:10:41.704 [2024-09-27 15:15:42.583901] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:10:41.704 [2024-09-27 15:15:42.584005] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:10:41.704 [2024-09-27 15:15:42.584098] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:10:41.704 [2024-09-27 15:15:42.584098] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@864 -- # return 0 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@730 -- # xtrace_disable 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@26 -- # rpc_cmd nvmf_get_stats 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@26 -- # stats='{ 00:10:41.704 "tick_rate": 2300000000, 00:10:41.704 "poll_groups": [ 00:10:41.704 { 00:10:41.704 "name": "nvmf_tgt_poll_group_000", 00:10:41.704 "admin_qpairs": 0, 00:10:41.704 "io_qpairs": 0, 00:10:41.704 "current_admin_qpairs": 0, 00:10:41.704 "current_io_qpairs": 0, 00:10:41.704 "pending_bdev_io": 0, 00:10:41.704 "completed_nvme_io": 0, 00:10:41.704 "transports": [] 00:10:41.704 }, 00:10:41.704 { 00:10:41.704 "name": "nvmf_tgt_poll_group_001", 00:10:41.704 "admin_qpairs": 0, 00:10:41.704 "io_qpairs": 0, 00:10:41.704 "current_admin_qpairs": 0, 00:10:41.704 "current_io_qpairs": 0, 00:10:41.704 "pending_bdev_io": 0, 00:10:41.704 "completed_nvme_io": 0, 00:10:41.704 "transports": [] 00:10:41.704 }, 00:10:41.704 { 00:10:41.704 "name": "nvmf_tgt_poll_group_002", 00:10:41.704 "admin_qpairs": 0, 00:10:41.704 "io_qpairs": 0, 00:10:41.704 "current_admin_qpairs": 0, 00:10:41.704 "current_io_qpairs": 0, 00:10:41.704 "pending_bdev_io": 0, 00:10:41.704 "completed_nvme_io": 0, 00:10:41.704 "transports": [] 00:10:41.704 }, 00:10:41.704 { 00:10:41.704 "name": "nvmf_tgt_poll_group_003", 00:10:41.704 "admin_qpairs": 0, 00:10:41.704 "io_qpairs": 0, 00:10:41.704 "current_admin_qpairs": 0, 00:10:41.704 "current_io_qpairs": 0, 00:10:41.704 "pending_bdev_io": 0, 00:10:41.704 "completed_nvme_io": 0, 00:10:41.704 "transports": [] 00:10:41.704 } 00:10:41.704 ] 00:10:41.704 }' 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@28 -- # jcount '.poll_groups[].name' 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@14 -- # local 'filter=.poll_groups[].name' 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@15 -- # jq '.poll_groups[].name' 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@15 -- # wc -l 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@28 -- # (( 4 
== 4 )) 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@29 -- # jq '.poll_groups[0].transports[0]' 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@29 -- # [[ null == null ]] 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@31 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.704 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.704 [2024-09-27 15:15:43.452207] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x2097500/0x209b9f0) succeed. 00:10:41.704 [2024-09-27 15:15:43.462662] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x2098b40/0x20dd090) succeed. 00:10:41.964 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.964 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@33 -- # rpc_cmd nvmf_get_stats 00:10:41.964 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.964 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:41.964 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.964 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@33 -- # stats='{ 00:10:41.964 "tick_rate": 2300000000, 00:10:41.964 "poll_groups": [ 00:10:41.964 { 00:10:41.964 "name": "nvmf_tgt_poll_group_000", 00:10:41.964 "admin_qpairs": 0, 00:10:41.964 "io_qpairs": 0, 00:10:41.964 "current_admin_qpairs": 0, 00:10:41.964 "current_io_qpairs": 0, 00:10:41.964 "pending_bdev_io": 0, 00:10:41.964 "completed_nvme_io": 0, 00:10:41.964 "transports": [ 00:10:41.964 { 00:10:41.964 "trtype": "RDMA", 00:10:41.964 "pending_data_buffer": 0, 00:10:41.964 "devices": [ 00:10:41.964 { 00:10:41.964 "name": "mlx5_0", 00:10:41.964 "polls": 17060, 00:10:41.964 "idle_polls": 17060, 00:10:41.964 "completions": 0, 00:10:41.964 "requests": 0, 00:10:41.964 "request_latency": 0, 00:10:41.964 "pending_free_request": 0, 00:10:41.964 "pending_rdma_read": 0, 00:10:41.964 "pending_rdma_write": 0, 00:10:41.964 "pending_rdma_send": 0, 00:10:41.964 "total_send_wrs": 0, 00:10:41.964 "send_doorbell_updates": 0, 00:10:41.964 "total_recv_wrs": 4096, 00:10:41.964 "recv_doorbell_updates": 1 00:10:41.964 }, 00:10:41.964 { 00:10:41.964 "name": "mlx5_1", 00:10:41.964 "polls": 17060, 00:10:41.964 "idle_polls": 17060, 00:10:41.964 "completions": 0, 00:10:41.964 "requests": 0, 00:10:41.964 "request_latency": 0, 00:10:41.964 "pending_free_request": 0, 00:10:41.964 "pending_rdma_read": 0, 00:10:41.964 "pending_rdma_write": 0, 00:10:41.964 "pending_rdma_send": 0, 00:10:41.964 "total_send_wrs": 0, 00:10:41.964 "send_doorbell_updates": 0, 00:10:41.964 "total_recv_wrs": 4096, 00:10:41.964 "recv_doorbell_updates": 1 00:10:41.964 } 00:10:41.964 ] 00:10:41.964 } 00:10:41.964 ] 00:10:41.964 }, 00:10:41.964 { 00:10:41.964 "name": "nvmf_tgt_poll_group_001", 00:10:41.964 "admin_qpairs": 0, 00:10:41.964 "io_qpairs": 0, 00:10:41.964 "current_admin_qpairs": 0, 00:10:41.964 "current_io_qpairs": 0, 00:10:41.964 "pending_bdev_io": 0, 00:10:41.964 "completed_nvme_io": 0, 00:10:41.964 "transports": [ 00:10:41.964 { 00:10:41.964 "trtype": "RDMA", 00:10:41.964 "pending_data_buffer": 0, 00:10:41.964 "devices": [ 00:10:41.964 { 00:10:41.964 "name": "mlx5_0", 
00:10:41.964 "polls": 11055, 00:10:41.964 "idle_polls": 11055, 00:10:41.964 "completions": 0, 00:10:41.964 "requests": 0, 00:10:41.964 "request_latency": 0, 00:10:41.964 "pending_free_request": 0, 00:10:41.964 "pending_rdma_read": 0, 00:10:41.964 "pending_rdma_write": 0, 00:10:41.964 "pending_rdma_send": 0, 00:10:41.964 "total_send_wrs": 0, 00:10:41.964 "send_doorbell_updates": 0, 00:10:41.964 "total_recv_wrs": 4096, 00:10:41.964 "recv_doorbell_updates": 1 00:10:41.964 }, 00:10:41.964 { 00:10:41.964 "name": "mlx5_1", 00:10:41.964 "polls": 11055, 00:10:41.964 "idle_polls": 11055, 00:10:41.964 "completions": 0, 00:10:41.964 "requests": 0, 00:10:41.964 "request_latency": 0, 00:10:41.964 "pending_free_request": 0, 00:10:41.964 "pending_rdma_read": 0, 00:10:41.964 "pending_rdma_write": 0, 00:10:41.964 "pending_rdma_send": 0, 00:10:41.965 "total_send_wrs": 0, 00:10:41.965 "send_doorbell_updates": 0, 00:10:41.965 "total_recv_wrs": 4096, 00:10:41.965 "recv_doorbell_updates": 1 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 }, 00:10:41.965 { 00:10:41.965 "name": "nvmf_tgt_poll_group_002", 00:10:41.965 "admin_qpairs": 0, 00:10:41.965 "io_qpairs": 0, 00:10:41.965 "current_admin_qpairs": 0, 00:10:41.965 "current_io_qpairs": 0, 00:10:41.965 "pending_bdev_io": 0, 00:10:41.965 "completed_nvme_io": 0, 00:10:41.965 "transports": [ 00:10:41.965 { 00:10:41.965 "trtype": "RDMA", 00:10:41.965 "pending_data_buffer": 0, 00:10:41.965 "devices": [ 00:10:41.965 { 00:10:41.965 "name": "mlx5_0", 00:10:41.965 "polls": 5926, 00:10:41.965 "idle_polls": 5926, 00:10:41.965 "completions": 0, 00:10:41.965 "requests": 0, 00:10:41.965 "request_latency": 0, 00:10:41.965 "pending_free_request": 0, 00:10:41.965 "pending_rdma_read": 0, 00:10:41.965 "pending_rdma_write": 0, 00:10:41.965 "pending_rdma_send": 0, 00:10:41.965 "total_send_wrs": 0, 00:10:41.965 "send_doorbell_updates": 0, 00:10:41.965 "total_recv_wrs": 4096, 00:10:41.965 "recv_doorbell_updates": 1 00:10:41.965 }, 00:10:41.965 { 00:10:41.965 "name": "mlx5_1", 00:10:41.965 "polls": 5926, 00:10:41.965 "idle_polls": 5926, 00:10:41.965 "completions": 0, 00:10:41.965 "requests": 0, 00:10:41.965 "request_latency": 0, 00:10:41.965 "pending_free_request": 0, 00:10:41.965 "pending_rdma_read": 0, 00:10:41.965 "pending_rdma_write": 0, 00:10:41.965 "pending_rdma_send": 0, 00:10:41.965 "total_send_wrs": 0, 00:10:41.965 "send_doorbell_updates": 0, 00:10:41.965 "total_recv_wrs": 4096, 00:10:41.965 "recv_doorbell_updates": 1 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 }, 00:10:41.965 { 00:10:41.965 "name": "nvmf_tgt_poll_group_003", 00:10:41.965 "admin_qpairs": 0, 00:10:41.965 "io_qpairs": 0, 00:10:41.965 "current_admin_qpairs": 0, 00:10:41.965 "current_io_qpairs": 0, 00:10:41.965 "pending_bdev_io": 0, 00:10:41.965 "completed_nvme_io": 0, 00:10:41.965 "transports": [ 00:10:41.965 { 00:10:41.965 "trtype": "RDMA", 00:10:41.965 "pending_data_buffer": 0, 00:10:41.965 "devices": [ 00:10:41.965 { 00:10:41.965 "name": "mlx5_0", 00:10:41.965 "polls": 920, 00:10:41.965 "idle_polls": 920, 00:10:41.965 "completions": 0, 00:10:41.965 "requests": 0, 00:10:41.965 "request_latency": 0, 00:10:41.965 "pending_free_request": 0, 00:10:41.965 "pending_rdma_read": 0, 00:10:41.965 "pending_rdma_write": 0, 00:10:41.965 "pending_rdma_send": 0, 00:10:41.965 "total_send_wrs": 0, 00:10:41.965 "send_doorbell_updates": 0, 00:10:41.965 "total_recv_wrs": 4096, 00:10:41.965 "recv_doorbell_updates": 1 00:10:41.965 }, 00:10:41.965 { 00:10:41.965 "name": "mlx5_1", 
00:10:41.965 "polls": 920, 00:10:41.965 "idle_polls": 920, 00:10:41.965 "completions": 0, 00:10:41.965 "requests": 0, 00:10:41.965 "request_latency": 0, 00:10:41.965 "pending_free_request": 0, 00:10:41.965 "pending_rdma_read": 0, 00:10:41.965 "pending_rdma_write": 0, 00:10:41.965 "pending_rdma_send": 0, 00:10:41.965 "total_send_wrs": 0, 00:10:41.965 "send_doorbell_updates": 0, 00:10:41.965 "total_recv_wrs": 4096, 00:10:41.965 "recv_doorbell_updates": 1 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 } 00:10:41.965 ] 00:10:41.965 }' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@35 -- # jsum '.poll_groups[].admin_qpairs' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@19 -- # local 'filter=.poll_groups[].admin_qpairs' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # jq '.poll_groups[].admin_qpairs' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # awk '{s+=$1}END{print s}' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@35 -- # (( 0 == 0 )) 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@36 -- # jsum '.poll_groups[].io_qpairs' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@19 -- # local 'filter=.poll_groups[].io_qpairs' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # jq '.poll_groups[].io_qpairs' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # awk '{s+=$1}END{print s}' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@36 -- # (( 0 == 0 )) 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@38 -- # '[' rdma == rdma ']' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@40 -- # jcount '.poll_groups[0].transports[].trtype' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@14 -- # local 'filter=.poll_groups[0].transports[].trtype' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@15 -- # jq '.poll_groups[0].transports[].trtype' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@15 -- # wc -l 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@40 -- # (( 1 == 1 )) 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@41 -- # jq -r '.poll_groups[0].transports[0].trtype' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@41 -- # transport_type=RDMA 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@42 -- # [[ rdma == \r\d\m\a ]] 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@43 -- # jcount '.poll_groups[0].transports[0].devices[].name' 00:10:41.965 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@14 -- # local 'filter=.poll_groups[0].transports[0].devices[].name' 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@15 -- # wc -l 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@15 -- # jq '.poll_groups[0].transports[0].devices[].name' 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@43 -- # (( 2 > 0 )) 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@46 -- # MALLOC_BDEV_SIZE=64 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@47 -- # MALLOC_BLOCK_SIZE=512 00:10:42.225 15:15:43 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@49 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc1 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:42.225 Malloc1 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@52 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@53 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@54 -- # rpc_cmd nvmf_subsystem_allow_any_host -d nqn.2016-06.io.spdk:cnode1 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@55 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:42.225 [2024-09-27 15:15:43.904314] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@58 -- # NOT nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -a 10.0.0.2 -s 4420 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@650 -- # local es=0 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -a 10.0.0.2 -s 4420 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@638 -- # local arg=nvme 00:10:42.225 15:15:43 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # type -t nvme 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@644 -- # type -P nvme 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@644 -- # arg=/usr/sbin/nvme 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@644 -- # [[ -x /usr/sbin/nvme ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@653 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -a 10.0.0.2 -s 4420 00:10:42.225 [2024-09-27 15:15:43.954193] ctrlr.c: 823:nvmf_qpair_access_allowed: *ERROR*: Subsystem 'nqn.2016-06.io.spdk:cnode1' does not allow host 'nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e' 00:10:42.225 Failed to write to /dev/nvme-fabrics: Input/output error 00:10:42.225 could not add new controller: failed to write to nvme-fabrics device 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@653 -- # es=1 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@61 -- # rpc_cmd nvmf_subsystem_add_host nqn.2016-06.io.spdk:cnode1 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.225 15:15:43 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:42.225 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.225 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@62 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:43.163 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@63 -- # waitforserial SPDKISFASTANDAWESOME 00:10:43.163 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:10:43.163 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:10:43.163 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:10:43.163 15:15:44 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:10:45.702 15:15:46 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:10:45.702 15:15:46 
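The "does not allow host" failure above is the expected branch: the subsystem was created without allow-any-host, so the connect is rejected until the initiator's NQN is registered. The sequence that follows, compressed to its two essential commands with the values used in this run (rpc.py standing in for the rpc_cmd wrapper is an assumption):

    rpc.py nvmf_subsystem_add_host nqn.2016-06.io.spdk:cnode1 \
        nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e
    # with the host NQN whitelisted, the same connect now succeeds
    nvme connect -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 -i 15 \
        --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e \
        --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e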
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:10:45.702 15:15:46 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:10:45.702 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:10:45.702 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:10:45.702 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # return 0 00:10:45.702 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@64 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:10:46.272 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:46.272 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@65 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:10:46.272 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:10:46.272 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:10:46.272 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:46.272 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:10:46.272 15:15:47 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@68 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2016-06.io.spdk:cnode1 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@69 -- # NOT nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@650 -- # local es=0 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@638 -- # local arg=nvme 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # type -t nvme 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@644 -- # type -P nvme 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@644 -- # arg=/usr/sbin/nvme 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@644 -- # [[ -x /usr/sbin/nvme ]] 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@653 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:46.272 [2024-09-27 15:15:48.061537] ctrlr.c: 823:nvmf_qpair_access_allowed: *ERROR*: Subsystem 'nqn.2016-06.io.spdk:cnode1' does not allow host 'nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e' 00:10:46.272 Failed to write to /dev/nvme-fabrics: Input/output error 00:10:46.272 could not add new controller: failed to write to nvme-fabrics device 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@653 -- # es=1 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@72 -- # rpc_cmd nvmf_subsystem_allow_any_host -e nqn.2016-06.io.spdk:cnode1 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:46.272 15:15:48 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@73 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:47.652 15:15:49 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@74 -- # waitforserial SPDKISFASTANDAWESOME 00:10:47.652 15:15:49 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:10:47.652 15:15:49 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:10:47.652 15:15:49 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:10:47.652 15:15:49 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # return 0 00:10:49.561 15:15:51 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
target/rpc.sh@75 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:10:50.499 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@76 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@78 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@81 -- # seq 1 5 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@81 -- # for i in $(seq 1 $loops) 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@82 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@83 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:50.499 [2024-09-27 15:15:52.125002] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@84 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 5 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@85 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:50.499 15:15:52 
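The for-loop entered above (target/rpc.sh@81, seq 1 5) repeats the full subsystem life cycle five times. One iteration, sketched with the values from this run (rpc.py is assumed to stand in for rpc_cmd; the trace also passes --hostnqn/--hostid to nvme connect, omitted here for brevity):

    rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME
    rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
    rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 5
    rpc.py nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1
    nvme connect -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 -i 15
    nvme disconnect -n nqn.2016-06.io.spdk:cnode1
    rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5
    rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1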
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:50.499 15:15:52 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@86 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:51.435 15:15:53 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@88 -- # waitforserial SPDKISFASTANDAWESOME 00:10:51.435 15:15:53 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:10:51.435 15:15:53 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:10:51.435 15:15:53 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:10:51.435 15:15:53 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # return 0 00:10:53.342 15:15:55 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@90 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:10:54.719 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@91 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@93 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@94 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # 
xtrace_disable 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.719 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@81 -- # for i in $(seq 1 $loops) 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@82 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@83 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:54.720 [2024-09-27 15:15:56.194982] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@84 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 5 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@85 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:54.720 15:15:56 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@86 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:55.658 15:15:57 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@88 -- # waitforserial SPDKISFASTANDAWESOME 00:10:55.658 15:15:57 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:10:55.658 15:15:57 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:10:55.658 15:15:57 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:10:55.658 15:15:57 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:10:57.565 15:15:59 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:10:57.565 15:15:59 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:10:57.565 15:15:59 
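waitforserial and waitforserial_disconnect, which bracket every connect/disconnect above, decide success by grepping lsblk for the subsystem serial number. A sketch of the attach-side check (the trace shows a 2-second settle and up to 15 retries):

    serial=SPDKISFASTANDAWESOME
    nvme_device_counter=1 nvme_devices=0 i=0
    while (( i++ <= 15 )); do
        sleep 2
        # one matching line per namespace that has appeared as a block device
        nvme_devices=$(lsblk -l -o NAME,SERIAL | grep -c "$serial")
        (( nvme_devices == nvme_device_counter )) && break
    done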
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:10:57.565 15:15:59 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:10:57.565 15:15:59 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:10:57.565 15:15:59 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # return 0 00:10:57.566 15:15:59 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@90 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:10:58.504 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@91 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@93 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@94 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@81 -- # for i in $(seq 1 $loops) 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@82 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@83 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:58.504 [2024-09-27 15:16:00.269855] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening 
on 10.0.0.2 port 4420 *** 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@84 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 5 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@85 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:58.504 15:16:00 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@86 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:10:59.442 15:16:01 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@88 -- # waitforserial SPDKISFASTANDAWESOME 00:10:59.442 15:16:01 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:10:59.442 15:16:01 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:10:59.442 15:16:01 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:10:59.442 15:16:01 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # return 0 00:11:01.975 15:16:03 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@90 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:11:02.542 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@91 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc 
-- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@93 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@94 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@81 -- # for i in $(seq 1 $loops) 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@82 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@83 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:02.542 [2024-09-27 15:16:04.307550] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@84 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 5 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@85 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:02.542 15:16:04 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@86 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n 
nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:11:03.479 15:16:05 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@88 -- # waitforserial SPDKISFASTANDAWESOME 00:11:03.479 15:16:05 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:11:03.479 15:16:05 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:11:03.479 15:16:05 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:11:03.479 15:16:05 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # return 0 00:11:06.135 15:16:07 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@90 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:11:06.704 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@91 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@93 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@94 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@81 -- # for i in $(seq 1 $loops) 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@82 -- # rpc_cmd 
nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@83 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:06.704 [2024-09-27 15:16:08.350525] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:06.704 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@84 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 5 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@85 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:06.705 15:16:08 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@86 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:11:07.643 15:16:09 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@88 -- # waitforserial SPDKISFASTANDAWESOME 00:11:07.643 15:16:09 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1198 -- # local i=0 00:11:07.643 15:16:09 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:11:07.643 15:16:09 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:11:07.643 15:16:09 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1205 -- # sleep 2 00:11:09.549 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:11:09.550 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:11:09.550 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:11:09.550 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:11:09.550 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:11:09.550 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
common/autotest_common.sh@1208 -- # return 0 00:11:09.550 15:16:11 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@90 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:11:10.488 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:11:10.748 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@91 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:11:10.748 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1219 -- # local i=0 00:11:10.748 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:11:10.748 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:11:10.748 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1231 -- # return 0 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@93 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 5 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@94 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@99 -- # seq 1 5 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@99 -- # for i in $(seq 1 $loops) 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@100 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@101 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 [2024-09-27 15:16:12.418767] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@102 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:11:10.749 15:16:12 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@103 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@105 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@107 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@99 -- # for i in $(seq 1 $loops) 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@100 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@101 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 [2024-09-27 15:16:12.467527] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@102 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@103 -- # rpc_cmd nvmf_subsystem_allow_any_host 
nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@105 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@107 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@99 -- # for i in $(seq 1 $loops) 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@100 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@101 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 [2024-09-27 15:16:12.515673] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@102 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@103 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
target/rpc.sh@105 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@107 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@99 -- # for i in $(seq 1 $loops) 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@100 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@101 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.749 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.749 [2024-09-27 15:16:12.563829] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@102 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@103 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@105 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:10.750 
15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@107 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:10.750 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@99 -- # for i in $(seq 1 $loops) 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@100 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -s SPDKISFASTANDAWESOME 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@101 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 [2024-09-27 15:16:12.612004] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@102 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@103 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode1 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@105 -- # rpc_cmd nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@107 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.009 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@110 -- # rpc_cmd nvmf_get_stats 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@110 -- # stats='{ 00:11:11.010 "tick_rate": 2300000000, 00:11:11.010 "poll_groups": [ 00:11:11.010 { 00:11:11.010 "name": "nvmf_tgt_poll_group_000", 00:11:11.010 "admin_qpairs": 2, 00:11:11.010 "io_qpairs": 27, 00:11:11.010 "current_admin_qpairs": 0, 00:11:11.010 "current_io_qpairs": 0, 00:11:11.010 "pending_bdev_io": 0, 00:11:11.010 "completed_nvme_io": 78, 00:11:11.010 "transports": [ 00:11:11.010 { 00:11:11.010 "trtype": "RDMA", 00:11:11.010 "pending_data_buffer": 0, 00:11:11.010 "devices": [ 00:11:11.010 { 00:11:11.010 "name": "mlx5_0", 00:11:11.010 "polls": 3594990, 00:11:11.010 "idle_polls": 3594990, 00:11:11.010 "completions": 0, 00:11:11.010 "requests": 0, 00:11:11.010 "request_latency": 0, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 0, 00:11:11.010 "send_doorbell_updates": 0, 00:11:11.010 "total_recv_wrs": 4096, 00:11:11.010 "recv_doorbell_updates": 1 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "mlx5_1", 00:11:11.010 "polls": 3594990, 00:11:11.010 "idle_polls": 3594742, 00:11:11.010 "completions": 269, 00:11:11.010 "requests": 134, 00:11:11.010 "request_latency": 21071382, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 212, 00:11:11.010 "send_doorbell_updates": 123, 00:11:11.010 "total_recv_wrs": 4230, 00:11:11.010 "recv_doorbell_updates": 123 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "nvmf_tgt_poll_group_001", 00:11:11.010 "admin_qpairs": 2, 00:11:11.010 "io_qpairs": 26, 00:11:11.010 "current_admin_qpairs": 0, 00:11:11.010 "current_io_qpairs": 0, 00:11:11.010 "pending_bdev_io": 0, 00:11:11.010 "completed_nvme_io": 135, 00:11:11.010 "transports": [ 00:11:11.010 { 00:11:11.010 "trtype": "RDMA", 00:11:11.010 "pending_data_buffer": 0, 00:11:11.010 "devices": [ 00:11:11.010 { 00:11:11.010 "name": "mlx5_0", 00:11:11.010 "polls": 3675410, 00:11:11.010 "idle_polls": 3675410, 00:11:11.010 "completions": 0, 00:11:11.010 "requests": 0, 00:11:11.010 "request_latency": 0, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 0, 00:11:11.010 "send_doorbell_updates": 0, 00:11:11.010 "total_recv_wrs": 4096, 00:11:11.010 "recv_doorbell_updates": 1 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "mlx5_1", 00:11:11.010 "polls": 3675410, 00:11:11.010 "idle_polls": 3675077, 00:11:11.010 "completions": 382, 00:11:11.010 "requests": 191, 00:11:11.010 "request_latency": 39125514, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 326, 00:11:11.010 
"send_doorbell_updates": 163, 00:11:11.010 "total_recv_wrs": 4287, 00:11:11.010 "recv_doorbell_updates": 164 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "nvmf_tgt_poll_group_002", 00:11:11.010 "admin_qpairs": 1, 00:11:11.010 "io_qpairs": 26, 00:11:11.010 "current_admin_qpairs": 0, 00:11:11.010 "current_io_qpairs": 0, 00:11:11.010 "pending_bdev_io": 0, 00:11:11.010 "completed_nvme_io": 115, 00:11:11.010 "transports": [ 00:11:11.010 { 00:11:11.010 "trtype": "RDMA", 00:11:11.010 "pending_data_buffer": 0, 00:11:11.010 "devices": [ 00:11:11.010 { 00:11:11.010 "name": "mlx5_0", 00:11:11.010 "polls": 3620859, 00:11:11.010 "idle_polls": 3620859, 00:11:11.010 "completions": 0, 00:11:11.010 "requests": 0, 00:11:11.010 "request_latency": 0, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 0, 00:11:11.010 "send_doorbell_updates": 0, 00:11:11.010 "total_recv_wrs": 4096, 00:11:11.010 "recv_doorbell_updates": 1 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "mlx5_1", 00:11:11.010 "polls": 3620859, 00:11:11.010 "idle_polls": 3620603, 00:11:11.010 "completions": 289, 00:11:11.010 "requests": 144, 00:11:11.010 "request_latency": 28688262, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 247, 00:11:11.010 "send_doorbell_updates": 129, 00:11:11.010 "total_recv_wrs": 4240, 00:11:11.010 "recv_doorbell_updates": 129 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "nvmf_tgt_poll_group_003", 00:11:11.010 "admin_qpairs": 2, 00:11:11.010 "io_qpairs": 26, 00:11:11.010 "current_admin_qpairs": 0, 00:11:11.010 "current_io_qpairs": 0, 00:11:11.010 "pending_bdev_io": 0, 00:11:11.010 "completed_nvme_io": 127, 00:11:11.010 "transports": [ 00:11:11.010 { 00:11:11.010 "trtype": "RDMA", 00:11:11.010 "pending_data_buffer": 0, 00:11:11.010 "devices": [ 00:11:11.010 { 00:11:11.010 "name": "mlx5_0", 00:11:11.010 "polls": 2812436, 00:11:11.010 "idle_polls": 2812436, 00:11:11.010 "completions": 0, 00:11:11.010 "requests": 0, 00:11:11.010 "request_latency": 0, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 0, 00:11:11.010 "send_doorbell_updates": 0, 00:11:11.010 "total_recv_wrs": 4096, 00:11:11.010 "recv_doorbell_updates": 1 00:11:11.010 }, 00:11:11.010 { 00:11:11.010 "name": "mlx5_1", 00:11:11.010 "polls": 2812436, 00:11:11.010 "idle_polls": 2812114, 00:11:11.010 "completions": 364, 00:11:11.010 "requests": 182, 00:11:11.010 "request_latency": 36280988, 00:11:11.010 "pending_free_request": 0, 00:11:11.010 "pending_rdma_read": 0, 00:11:11.010 "pending_rdma_write": 0, 00:11:11.010 "pending_rdma_send": 0, 00:11:11.010 "total_send_wrs": 309, 00:11:11.010 "send_doorbell_updates": 159, 00:11:11.010 "total_recv_wrs": 4278, 00:11:11.010 "recv_doorbell_updates": 160 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 } 00:11:11.010 ] 00:11:11.010 }' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@112 -- # jsum '.poll_groups[].admin_qpairs' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@19 -- # local 
'filter=.poll_groups[].admin_qpairs' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # jq '.poll_groups[].admin_qpairs' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # awk '{s+=$1}END{print s}' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@112 -- # (( 7 > 0 )) 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@113 -- # jsum '.poll_groups[].io_qpairs' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@19 -- # local 'filter=.poll_groups[].io_qpairs' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # jq '.poll_groups[].io_qpairs' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # awk '{s+=$1}END{print s}' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@113 -- # (( 105 > 0 )) 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@115 -- # '[' rdma == rdma ']' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@117 -- # jsum '.poll_groups[].transports[].devices[].completions' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@19 -- # local 'filter=.poll_groups[].transports[].devices[].completions' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # jq '.poll_groups[].transports[].devices[].completions' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # awk '{s+=$1}END{print s}' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@117 -- # (( 1304 > 0 )) 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@118 -- # jsum '.poll_groups[].transports[].devices[].request_latency' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@19 -- # local 'filter=.poll_groups[].transports[].devices[].request_latency' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # jq '.poll_groups[].transports[].devices[].request_latency' 00:11:11.010 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@20 -- # awk '{s+=$1}END{print s}' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@118 -- # (( 125166146 > 0 )) 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@121 -- # trap - SIGINT SIGTERM EXIT 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- target/rpc.sh@123 -- # nvmftestfini 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@331 -- # nvmfcleanup 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@99 -- # sync 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@102 -- # set +e 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@103 -- # for i in {1..20} 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:11:11.270 rmmod nvme_rdma 00:11:11.270 rmmod nvme_fabrics 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@106 -- # set -e 00:11:11.270 15:16:12 
nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@107 -- # return 0 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@332 -- # '[' -n 1743268 ']' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@333 -- # killprocess 1743268 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@950 -- # '[' -z 1743268 ']' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@954 -- # kill -0 1743268 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@955 -- # uname 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1743268 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1743268' 00:11:11.270 killing process with pid 1743268 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@969 -- # kill 1743268 00:11:11.270 15:16:12 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@974 -- # wait 1743268 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@338 -- # nvmf_fini 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@264 -- # local dev 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@267 -- # remove_target_ns 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@268 -- # delete_main_bridge 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@130 -- # return 0 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- 
nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@41 -- # _dev=0 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@41 -- # dev_map=() 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/setup.sh@284 -- # iptr 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@538 -- # iptables-save 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- nvmf/common.sh@538 -- # iptables-restore 00:11:11.530 00:11:11.530 real 0m38.279s 00:11:11.530 user 2m4.709s 00:11:11.530 sys 0m7.197s 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_rpc -- common/autotest_common.sh@10 -- # set +x 00:11:11.530 ************************************ 00:11:11.530 END TEST nvmf_rpc 00:11:11.530 ************************************ 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@23 -- # run_test nvmf_invalid /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/invalid.sh --transport=rdma 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:11.530 15:16:13 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:11:11.790 ************************************ 00:11:11.790 START TEST nvmf_invalid 00:11:11.790 ************************************ 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/invalid.sh --transport=rdma 00:11:11.790 * Looking for test storage... 
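(A minimal sketch, not part of the captured log: the jsum pattern that target/rpc.sh@19-20 applies above sums one numeric field across the nvmf_get_stats JSON with jq piped into awk. The stats.json filename below is a stand-in assumption; the test itself feeds the saved $stats variable rather than a file.)

  # jsum '<jq filter>' — sum a numeric field across all poll groups
  jsum() {
      local filter=$1                       # e.g. '.poll_groups[].admin_qpairs'
      jq "$filter" stats.json | awk '{s+=$1} END {print s}'
  }
  # With the stats printed above: admin_qpairs 2+2+1+2 = 7, io_qpairs 27+26+26+26 = 105,
  # matching the (( 7 > 0 )) and (( 105 > 0 )) checks in the log.
  jsum '.poll_groups[].io_qpairs'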
00:11:11.790 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1681 -- # lcov --version 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@333 -- # local ver1 ver1_l 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@334 -- # local ver2 ver2_l 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@336 -- # IFS=.-: 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@336 -- # read -ra ver1 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@337 -- # IFS=.-: 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@337 -- # read -ra ver2 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@338 -- # local 'op=<' 00:11:11.790 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@340 -- # ver1_l=2 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@341 -- # ver2_l=1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@344 -- # case "$op" in 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@345 -- # : 1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@364 -- # (( v = 0 )) 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@365 -- # decimal 1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@353 -- # local d=1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@355 -- # echo 1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@365 -- # ver1[v]=1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@366 -- # decimal 2 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@353 -- # local d=2 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@355 -- # echo 2 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@366 -- # ver2[v]=2 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@368 -- # return 0 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:11:11.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:11.791 --rc genhtml_branch_coverage=1 00:11:11.791 --rc genhtml_function_coverage=1 00:11:11.791 --rc genhtml_legend=1 00:11:11.791 --rc geninfo_all_blocks=1 00:11:11.791 --rc geninfo_unexecuted_blocks=1 00:11:11.791 00:11:11.791 ' 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:11:11.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:11.791 --rc genhtml_branch_coverage=1 00:11:11.791 --rc genhtml_function_coverage=1 00:11:11.791 --rc genhtml_legend=1 00:11:11.791 --rc geninfo_all_blocks=1 00:11:11.791 --rc geninfo_unexecuted_blocks=1 00:11:11.791 00:11:11.791 ' 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:11:11.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:11.791 --rc genhtml_branch_coverage=1 00:11:11.791 --rc genhtml_function_coverage=1 00:11:11.791 --rc genhtml_legend=1 00:11:11.791 --rc geninfo_all_blocks=1 00:11:11.791 --rc geninfo_unexecuted_blocks=1 00:11:11.791 00:11:11.791 ' 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:11:11.791 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:11.791 --rc genhtml_branch_coverage=1 00:11:11.791 --rc genhtml_function_coverage=1 00:11:11.791 --rc genhtml_legend=1 00:11:11.791 --rc geninfo_all_blocks=1 00:11:11.791 --rc geninfo_unexecuted_blocks=1 00:11:11.791 00:11:11.791 ' 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@7 -- # uname -s 00:11:11.791 
15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:11:11.791 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@15 -- # shopt -s extglob 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- paths/export.sh@5 -- # export PATH 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@50 -- # : 0 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:11:12.051 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- 
nvmf/common.sh@35 -- # '[' -n '' ']' 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@54 -- # have_pci_nics=0 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@11 -- # multi_target_rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py 00:11:12.051 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@12 -- # rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@13 -- # nqn=nqn.2016-06.io.spdk:cnode 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@14 -- # target=foobar 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@16 -- # RANDOM=0 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@34 -- # nvmftestinit 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@292 -- # prepare_net_devs 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@254 -- # local -g is_hw=no 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@256 -- # remove_target_ns 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@125 -- # xtrace_disable 00:11:12.052 15:16:13 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@10 -- # set +x 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@131 -- # pci_devs=() 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@131 -- # local -a pci_devs 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@132 -- # pci_net_devs=() 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@133 -- # pci_drivers=() 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@133 -- # local -A pci_drivers 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@135 -- # net_devs=() 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@135 -- # local -ga net_devs 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@136 -- # e810=() 00:11:18.626 15:16:20 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@136 -- # local -ga e810 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@137 -- # x722=() 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@137 -- # local -ga x722 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@138 -- # mlx=() 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@138 -- # local -ga mlx 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:11:18.626 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:18.626 15:16:20 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:11:18.626 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:11:18.626 Found net devices under 0000:18:00.0: mlx_0_0 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:18.626 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:11:18.627 Found net devices under 0000:18:00.1: mlx_0_1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- 
nvmf/common.sh@249 -- # get_rdma_if_list 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@75 -- # rdma_devs=() 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@89 -- # continue 2 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@89 -- # continue 2 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@258 -- # is_hw=yes 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@61 -- # uname 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@65 -- # modprobe ib_cm 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- 
nvmf/common.sh@66 -- # modprobe ib_core 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@67 -- # modprobe ib_umad 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@69 -- # modprobe iw_cm 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@27 -- # local -gA dev_map 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@28 -- # local -g _dev 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@44 -- # ips=() 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@58 -- # key_initiator=target1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@207 -- # val_to_ip 
167772161 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@11 -- # local val=167772161 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:11:18.627 10.0.0.1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@11 -- # local val=167772162 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:11:18.627 10.0.0.2 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:11:18.627 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:11:18.888 15:16:20 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@38 -- # ping_ips 1 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@107 -- # local dev=target0 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:18.888 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@90 -- # local -n 
ns=NVMF_TARGET_NS_CMD 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:18.889 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:11:18.889 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.033 ms 00:11:18.889 00:11:18.889 --- 10.0.0.2 ping statistics --- 00:11:18.889 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:18.889 rtt min/avg/max/mdev = 0.033/0.033/0.033/0.000 ms 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@107 -- # local dev=target0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:18.889 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:11:18.889 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.025 ms 00:11:18.889 00:11:18.889 --- 10.0.0.2 ping statistics --- 00:11:18.889 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:18.889 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@98 -- # (( pair++ )) 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@266 -- # return 0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@107 -- # local dev=target0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:18.889 15:16:20 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@107 -- # local dev=target1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@107 -- # local dev=target0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:11:18.889 15:16:20 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@107 -- # local dev=target1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:11:18.889 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@35 -- # nvmfappstart -m 0xF 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@724 -- # xtrace_disable 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@10 -- # set +x 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@324 -- # nvmfpid=1750386 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@325 -- # waitforlisten 1750386 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@831 -- # '[' -z 1750386 ']' 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:18.890 15:16:20 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:18.890 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@10 -- # set +x 00:11:18.890 15:16:20 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:11:18.890 [2024-09-27 15:16:20.708957] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:11:18.890 [2024-09-27 15:16:20.709020] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:19.150 [2024-09-27 15:16:20.798173] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:11:19.150 [2024-09-27 15:16:20.890883] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:11:19.150 [2024-09-27 15:16:20.890926] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:11:19.150 [2024-09-27 15:16:20.890937] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:11:19.150 [2024-09-27 15:16:20.890946] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:11:19.150 [2024-09-27 15:16:20.890954] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
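Editor's note: the trace above shows nvmf_rdma_init assigning addresses to the two Mellanox ports from an integer pool and verifying reachability before the target starts. The idiom boils down to the short sketch below. It is an illustration only, not the canonical nvmf/setup.sh: the helper names mirror the trace (val_to_ip, set_ip), and the device names, /24 prefix, and ifalias bookkeeping are taken from the log.

#!/usr/bin/env bash
# Sketch of the val_to_ip / set_ip / set_up idiom traced above.

val_to_ip() {
  local val=$1
  # Split a 32-bit value into four octets, most significant first:
  # 167772161 == 0x0a000001 -> 10.0.0.1
  printf '%u.%u.%u.%u\n' \
    $(( (val >> 24) & 0xff )) $(( (val >> 16) & 0xff )) \
    $(( (val >>  8) & 0xff )) $((  val        & 0xff ))
}

set_ip() {
  local dev=$1 ip
  ip=$(val_to_ip "$2")
  ip addr add "$ip/24" dev "$dev"
  # Later helpers read the address back via ifalias
  # (cf. "cat /sys/class/net/<dev>/ifalias" in the trace).
  echo "$ip" | tee "/sys/class/net/$dev/ifalias"
}

set_ip mlx_0_0 167772161   # 10.0.0.1
set_ip mlx_0_1 167772162   # 10.0.0.2
ip link set mlx_0_0 up
ip link set mlx_0_1 up
ping -c 1 10.0.0.2         # reachability check, as in ping_ips above

The pool advances by two addresses per initiator/target pair, which matches the "(( _dev++, ip_pool += 2 ))" step in the trace.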
00:11:19.150 [2024-09-27 15:16:20.891027] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:19.150 [2024-09-27 15:16:20.891125] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:11:19.150 [2024-09-27 15:16:20.891226] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:11:19.150 [2024-09-27 15:16:20.891227] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:11:19.718 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:19.718 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@864 -- # return 0 00:11:19.718 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:11:19.718 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@730 -- # xtrace_disable 00:11:19.718 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@10 -- # set +x 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@37 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; nvmftestfini $1; exit 1' SIGINT SIGTERM EXIT 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@40 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem -t foobar nqn.2016-06.io.spdk:cnode12433 00:11:19.978 [2024-09-27 15:16:21.785130] nvmf_rpc.c: 396:rpc_nvmf_create_subsystem: *ERROR*: Unable to find target foobar 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@40 -- # out='request: 00:11:19.978 { 00:11:19.978 "nqn": "nqn.2016-06.io.spdk:cnode12433", 00:11:19.978 "tgt_name": "foobar", 00:11:19.978 "method": "nvmf_create_subsystem", 00:11:19.978 "req_id": 1 00:11:19.978 } 00:11:19.978 Got JSON-RPC error response 00:11:19.978 response: 00:11:19.978 { 00:11:19.978 "code": -32603, 00:11:19.978 "message": "Unable to find target foobar" 00:11:19.978 }' 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@41 -- # [[ request: 00:11:19.978 { 00:11:19.978 "nqn": "nqn.2016-06.io.spdk:cnode12433", 00:11:19.978 "tgt_name": "foobar", 00:11:19.978 "method": "nvmf_create_subsystem", 00:11:19.978 "req_id": 1 00:11:19.978 } 00:11:19.978 Got JSON-RPC error response 00:11:19.978 response: 00:11:19.978 { 00:11:19.978 "code": -32603, 00:11:19.978 "message": "Unable to find target foobar" 00:11:19.978 } == *\U\n\a\b\l\e\ \t\o\ \f\i\n\d\ \t\a\r\g\e\t* ]] 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@45 -- # echo -e '\x1f' 00:11:19.978 15:16:21 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem -s $'SPDKISFASTANDAWESOME\037' nqn.2016-06.io.spdk:cnode18469 00:11:20.237 [2024-09-27 15:16:21.993902] nvmf_rpc.c: 413:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode18469: invalid serial number 'SPDKISFASTANDAWESOME' 00:11:20.237 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@45 -- # out='request: 00:11:20.237 { 00:11:20.237 "nqn": "nqn.2016-06.io.spdk:cnode18469", 00:11:20.237 "serial_number": "SPDKISFASTANDAWESOME\u001f", 00:11:20.237 "method": "nvmf_create_subsystem", 00:11:20.237 "req_id": 1 00:11:20.237 } 00:11:20.237 Got JSON-RPC 
error response 00:11:20.237 response: 00:11:20.237 { 00:11:20.237 "code": -32602, 00:11:20.237 "message": "Invalid SN SPDKISFASTANDAWESOME\u001f" 00:11:20.237 }' 00:11:20.237 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@46 -- # [[ request: 00:11:20.237 { 00:11:20.237 "nqn": "nqn.2016-06.io.spdk:cnode18469", 00:11:20.237 "serial_number": "SPDKISFASTANDAWESOME\u001f", 00:11:20.237 "method": "nvmf_create_subsystem", 00:11:20.237 "req_id": 1 00:11:20.237 } 00:11:20.237 Got JSON-RPC error response 00:11:20.237 response: 00:11:20.237 { 00:11:20.237 "code": -32602, 00:11:20.237 "message": "Invalid SN SPDKISFASTANDAWESOME\u001f" 00:11:20.237 } == *\I\n\v\a\l\i\d\ \S\N* ]] 00:11:20.237 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@50 -- # echo -e '\x1f' 00:11:20.237 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@50 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem -d $'SPDK_Controller\037' nqn.2016-06.io.spdk:cnode28355 00:11:20.496 [2024-09-27 15:16:22.210550] nvmf_rpc.c: 422:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode28355: invalid model number 'SPDK_Controller' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@50 -- # out='request: 00:11:20.496 { 00:11:20.496 "nqn": "nqn.2016-06.io.spdk:cnode28355", 00:11:20.496 "model_number": "SPDK_Controller\u001f", 00:11:20.496 "method": "nvmf_create_subsystem", 00:11:20.496 "req_id": 1 00:11:20.496 } 00:11:20.496 Got JSON-RPC error response 00:11:20.496 response: 00:11:20.496 { 00:11:20.496 "code": -32602, 00:11:20.496 "message": "Invalid MN SPDK_Controller\u001f" 00:11:20.496 }' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@51 -- # [[ request: 00:11:20.496 { 00:11:20.496 "nqn": "nqn.2016-06.io.spdk:cnode28355", 00:11:20.496 "model_number": "SPDK_Controller\u001f", 00:11:20.496 "method": "nvmf_create_subsystem", 00:11:20.496 "req_id": 1 00:11:20.496 } 00:11:20.496 Got JSON-RPC error response 00:11:20.496 response: 00:11:20.496 { 00:11:20.496 "code": -32602, 00:11:20.496 "message": "Invalid MN SPDK_Controller\u001f" 00:11:20.496 } == *\I\n\v\a\l\i\d\ \M\N* ]] 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@54 -- # gen_random_s 21 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@19 -- # local length=21 ll 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@21 -- # chars=('32' '33' '34' '35' '36' '37' '38' '39' '40' '41' '42' '43' '44' '45' '46' '47' '48' '49' '50' '51' '52' '53' '54' '55' '56' '57' '58' '59' '60' '61' '62' '63' '64' '65' '66' '67' '68' '69' '70' '71' '72' '73' '74' '75' '76' '77' '78' '79' '80' '81' '82' '83' '84' '85' '86' '87' '88' '89' '90' '91' '92' '93' '94' '95' '96' '97' '98' '99' '100' '101' '102' '103' '104' '105' '106' '107' '108' '109' '110' '111' '112' '113' '114' '115' '116' '117' '118' '119' '120' '121' '122' '123' '124' '125' '126' '127') 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@21 -- # local chars 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@22 -- # local string 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll = 0 )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid 
-- target/invalid.sh@25 -- # printf %x 89 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x59' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=Y 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 100 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x64' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=d 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 54 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x36' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=6 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 59 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x3b' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=';' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 118 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x76' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=v 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 45 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x2d' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=- 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 83 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x53' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=S 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 52 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x34' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=4 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 107 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x6b' 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=k 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.496 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 92 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x5c' 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='\' 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 59 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x3b' 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=';' 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.497 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 94 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x5e' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='^' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 97 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x61' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=a 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 39 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x27' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- 
target/invalid.sh@25 -- # string+=\' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 101 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x65' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=e 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 115 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x73' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=s 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 103 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x67' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=g 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 54 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x36' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=6 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 38 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x26' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='&' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 42 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x2a' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='*' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 122 00:11:20.756 15:16:22 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x7a' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=z 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@28 -- # [[ Y == \- ]] 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@31 -- # echo 'Yd6;v-S4k\;^a'\''esg6&*z' 00:11:20.756 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@54 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem -s 'Yd6;v-S4k\;^a'\''esg6&*z' nqn.2016-06.io.spdk:cnode31412 00:11:20.756 [2024-09-27 15:16:22.595780] nvmf_rpc.c: 413:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode31412: invalid serial number 'Yd6;v-S4k\;^a'esg6&*z' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@54 -- # out='request: 00:11:21.017 { 00:11:21.017 "nqn": "nqn.2016-06.io.spdk:cnode31412", 00:11:21.017 "serial_number": "Yd6;v-S4k\\;^a'\''esg6&*z", 00:11:21.017 "method": "nvmf_create_subsystem", 00:11:21.017 "req_id": 1 00:11:21.017 } 00:11:21.017 Got JSON-RPC error response 00:11:21.017 response: 00:11:21.017 { 00:11:21.017 "code": -32602, 00:11:21.017 "message": "Invalid SN Yd6;v-S4k\\;^a'\''esg6&*z" 00:11:21.017 }' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@55 -- # [[ request: 00:11:21.017 { 00:11:21.017 "nqn": "nqn.2016-06.io.spdk:cnode31412", 00:11:21.017 "serial_number": "Yd6;v-S4k\\;^a'esg6&*z", 00:11:21.017 "method": "nvmf_create_subsystem", 00:11:21.017 "req_id": 1 00:11:21.017 } 00:11:21.017 Got JSON-RPC error response 00:11:21.017 response: 00:11:21.017 { 00:11:21.017 "code": -32602, 00:11:21.017 "message": "Invalid SN Yd6;v-S4k\\;^a'esg6&*z" 00:11:21.017 } == *\I\n\v\a\l\i\d\ \S\N* ]] 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@58 -- # gen_random_s 41 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@19 -- # local length=41 ll 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@21 -- # chars=('32' '33' '34' '35' '36' '37' '38' '39' '40' '41' '42' '43' '44' '45' '46' '47' '48' '49' '50' '51' '52' '53' '54' '55' '56' '57' '58' '59' '60' '61' '62' '63' '64' '65' '66' '67' '68' '69' '70' '71' '72' '73' '74' '75' '76' '77' '78' '79' '80' '81' '82' '83' '84' '85' '86' '87' '88' '89' '90' '91' '92' '93' '94' '95' '96' '97' '98' '99' '100' '101' '102' '103' '104' '105' '106' '107' '108' '109' '110' '111' '112' '113' '114' '115' '116' '117' '118' '119' '120' '121' '122' '123' '124' '125' '126' '127') 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@21 -- # local chars 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@22 -- # local string 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll = 0 )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 77 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x4d' 
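Editor's note: by this point the trace has run four negative tests: nvmf_create_subsystem with an unknown target name (foobar), with a serial number containing a control character, with an invalid model number, and with a randomly generated 21-character serial number. Each follows the same pattern: issue the RPC, capture the JSON-RPC error, and glob-match the message. A hedged sketch of that pattern is below; the check_rpc_error helper is hypothetical and only for illustration, and it assumes rpc.py exits nonzero when the RPC returns an error.

rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py

check_rpc_error() {
  # $1 is the expected error substring, the rest is the rpc.py invocation.
  local expected=$1; shift
  local out
  out=$("$rpc" "$@" 2>&1) && { echo "RPC unexpectedly succeeded"; return 1; }
  [[ $out == *"$expected"* ]] || { echo "unexpected error: $out"; return 1; }
}

# Unknown transport target name must be rejected.
check_rpc_error 'Unable to find target' \
  nvmf_create_subsystem -t foobar nqn.2016-06.io.spdk:cnode12433

# A serial number containing a control character (0x1f) must be rejected.
check_rpc_error 'Invalid SN' \
  nvmf_create_subsystem -s $'SPDKISFASTANDAWESOME\037' nqn.2016-06.io.spdk:cnode18469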
00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=M 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 36 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x24' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='$' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 90 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x5a' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=Z 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 84 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x54' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=T 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 91 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x5b' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='[' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 58 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x3a' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=: 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 75 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x4b' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=K 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- 
target/invalid.sh@25 -- # printf %x 61 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x3d' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+== 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 35 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x23' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='#' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 67 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x43' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=C 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 88 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x58' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=X 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 74 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x4a' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=J 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 37 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x25' 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=% 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.017 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 102 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x66' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=f 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 123 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x7b' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='{' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 82 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x52' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=R 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 81 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x51' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=Q 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 78 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x4e' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=N 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 56 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x38' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=8 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 102 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x66' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=f 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 104 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x68' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 
-- # string+=h 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 87 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x57' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=W 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 33 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x21' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='!' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 120 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x78' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=x 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 64 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x40' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=@ 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 41 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x29' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=')' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 61 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x3d' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+== 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 79 00:11:21.018 15:16:22 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x4f' 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=O 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 52 00:11:21.018 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x34' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=4 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 52 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x34' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=4 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 89 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x59' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=Y 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 47 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x2f' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=/ 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 119 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x77' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=w 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 50 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x32' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=2 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < 
length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 50 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x32' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=2 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 102 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x66' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=f 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 36 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x24' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='$' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 80 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x50' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=P 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 33 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x21' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+='!' 
00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 99 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x63' 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=c 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # printf %x 86 00:11:21.278 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # echo -e '\x56' 00:11:21.279 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@25 -- # string+=V 00:11:21.279 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll++ )) 00:11:21.279 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@24 -- # (( ll < length )) 00:11:21.279 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@28 -- # [[ M == \- ]] 00:11:21.279 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@31 -- # echo 'M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV' 00:11:21.279 15:16:22 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@58 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem -d 'M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV' nqn.2016-06.io.spdk:cnode1358 00:11:21.538 [2024-09-27 15:16:23.141578] nvmf_rpc.c: 422:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode1358: invalid model number 'M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV' 00:11:21.538 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@58 -- # out='request: 00:11:21.538 { 00:11:21.538 "nqn": "nqn.2016-06.io.spdk:cnode1358", 00:11:21.538 "model_number": "M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV", 00:11:21.538 "method": "nvmf_create_subsystem", 00:11:21.538 "req_id": 1 00:11:21.538 } 00:11:21.538 Got JSON-RPC error response 00:11:21.538 response: 00:11:21.538 { 00:11:21.538 "code": -32602, 00:11:21.538 "message": "Invalid MN M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV" 00:11:21.538 }' 00:11:21.538 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@59 -- # [[ request: 00:11:21.538 { 00:11:21.538 "nqn": "nqn.2016-06.io.spdk:cnode1358", 00:11:21.538 "model_number": "M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV", 00:11:21.538 "method": "nvmf_create_subsystem", 00:11:21.538 "req_id": 1 00:11:21.538 } 00:11:21.538 Got JSON-RPC error response 00:11:21.538 response: 00:11:21.538 { 00:11:21.538 "code": -32602, 00:11:21.538 "message": "Invalid MN M$ZT[:K=#CXJ%f{RQN8fhW!x@)=O44Y/w22f$P!cV" 00:11:21.538 } == *\I\n\v\a\l\i\d\ \M\N* ]] 00:11:21.538 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@62 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport --trtype rdma 00:11:21.538 [2024-09-27 15:16:23.368589] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1f48dc0/0x1f4d2b0) succeed. 00:11:21.538 [2024-09-27 15:16:23.379179] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1f4a400/0x1f8e950) succeed. 
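The loop traced above is target/invalid.sh assembling a random 41-character model number one character at a time: printf %x picks a code point, echo -e turns it back into a literal character, and the result is handed to nvmf_create_subsystem, which is expected to reject it with an "Invalid MN" error (41 characters is one more than the 40-byte NVMe model-number field allows) before the transport is created. A minimal sketch of that pattern, assuming the rpc.py path from this workspace; the helper name and the way code points are chosen are assumptions, not the script's exact implementation:

    # Build a random printable string of the requested length (ASCII 0x20-0x7e).
    gen_random_string() {
        local length=$1 ll code string=
        for (( ll = 0; ll < length; ll++ )); do
            code=$(( (RANDOM % 95) + 32 ))
            string+=$(echo -e "\\x$(printf %x "$code")")
        done
        echo "$string"
    }

    rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
    model_number=$(gen_random_string 41)
    out=$($rpc nvmf_create_subsystem -d "$model_number" nqn.2016-06.io.spdk:cnode1358 2>&1) || true
    # The negative test only passes if the target refused the bogus model number.
    [[ $out == *"Invalid MN"* ]]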
00:11:21.797 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode -s SPDK001 -a 00:11:22.057 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_listener nqn.2016-06.io.spdk:cnode -t rdma -a 10.0.0.2 -s 4421 00:11:22.316 [2024-09-27 15:16:23.930721] nvmf_rpc.c: 783:nvmf_rpc_listen_paused: *ERROR*: Unable to remove listener, rc -2 00:11:22.316 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@64 -- # out='request: 00:11:22.316 { 00:11:22.316 "nqn": "nqn.2016-06.io.spdk:cnode", 00:11:22.316 "listen_address": { 00:11:22.316 "trtype": "rdma", 00:11:22.316 "traddr": "10.0.0.2", 00:11:22.316 "trsvcid": "4421" 00:11:22.316 }, 00:11:22.316 "method": "nvmf_subsystem_remove_listener", 00:11:22.316 "req_id": 1 00:11:22.316 } 00:11:22.316 Got JSON-RPC error response 00:11:22.316 response: 00:11:22.316 { 00:11:22.316 "code": -32602, 00:11:22.316 "message": "Invalid parameters" 00:11:22.316 }' 00:11:22.316 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@65 -- # [[ request: 00:11:22.316 { 00:11:22.316 "nqn": "nqn.2016-06.io.spdk:cnode", 00:11:22.316 "listen_address": { 00:11:22.316 "trtype": "rdma", 00:11:22.316 "traddr": "10.0.0.2", 00:11:22.316 "trsvcid": "4421" 00:11:22.316 }, 00:11:22.316 "method": "nvmf_subsystem_remove_listener", 00:11:22.316 "req_id": 1 00:11:22.316 } 00:11:22.316 Got JSON-RPC error response 00:11:22.316 response: 00:11:22.316 { 00:11:22.316 "code": -32602, 00:11:22.316 "message": "Invalid parameters" 00:11:22.316 } != *\U\n\a\b\l\e\ \t\o\ \s\t\o\p\ \l\i\s\t\e\n\e\r\.* ]] 00:11:22.316 15:16:23 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@68 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode9039 -i 0 00:11:22.316 [2024-09-27 15:16:24.139446] nvmf_rpc.c: 434:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode9039: invalid cntlid range [0-65519] 00:11:22.574 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@68 -- # out='request: 00:11:22.574 { 00:11:22.574 "nqn": "nqn.2016-06.io.spdk:cnode9039", 00:11:22.574 "min_cntlid": 0, 00:11:22.574 "method": "nvmf_create_subsystem", 00:11:22.574 "req_id": 1 00:11:22.574 } 00:11:22.574 Got JSON-RPC error response 00:11:22.574 response: 00:11:22.574 { 00:11:22.574 "code": -32602, 00:11:22.574 "message": "Invalid cntlid range [0-65519]" 00:11:22.574 }' 00:11:22.574 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@69 -- # [[ request: 00:11:22.574 { 00:11:22.574 "nqn": "nqn.2016-06.io.spdk:cnode9039", 00:11:22.574 "min_cntlid": 0, 00:11:22.574 "method": "nvmf_create_subsystem", 00:11:22.574 "req_id": 1 00:11:22.574 } 00:11:22.574 Got JSON-RPC error response 00:11:22.574 response: 00:11:22.574 { 00:11:22.574 "code": -32602, 00:11:22.574 "message": "Invalid cntlid range [0-65519]" 00:11:22.574 } == *\I\n\v\a\l\i\d\ \c\n\t\l\i\d\ \r\a\n\g\e* ]] 00:11:22.574 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@70 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode18508 -i 65520 00:11:22.574 [2024-09-27 15:16:24.332103] nvmf_rpc.c: 434:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode18508: invalid cntlid range [65520-65519] 
00:11:22.574 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@70 -- # out='request: 00:11:22.574 { 00:11:22.574 "nqn": "nqn.2016-06.io.spdk:cnode18508", 00:11:22.574 "min_cntlid": 65520, 00:11:22.574 "method": "nvmf_create_subsystem", 00:11:22.574 "req_id": 1 00:11:22.574 } 00:11:22.574 Got JSON-RPC error response 00:11:22.574 response: 00:11:22.574 { 00:11:22.574 "code": -32602, 00:11:22.574 "message": "Invalid cntlid range [65520-65519]" 00:11:22.574 }' 00:11:22.574 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@71 -- # [[ request: 00:11:22.574 { 00:11:22.574 "nqn": "nqn.2016-06.io.spdk:cnode18508", 00:11:22.574 "min_cntlid": 65520, 00:11:22.574 "method": "nvmf_create_subsystem", 00:11:22.574 "req_id": 1 00:11:22.574 } 00:11:22.574 Got JSON-RPC error response 00:11:22.574 response: 00:11:22.574 { 00:11:22.574 "code": -32602, 00:11:22.574 "message": "Invalid cntlid range [65520-65519]" 00:11:22.574 } == *\I\n\v\a\l\i\d\ \c\n\t\l\i\d\ \r\a\n\g\e* ]] 00:11:22.574 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@72 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode24647 -I 0 00:11:22.833 [2024-09-27 15:16:24.540850] nvmf_rpc.c: 434:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode24647: invalid cntlid range [1-0] 00:11:22.833 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@72 -- # out='request: 00:11:22.833 { 00:11:22.833 "nqn": "nqn.2016-06.io.spdk:cnode24647", 00:11:22.833 "max_cntlid": 0, 00:11:22.833 "method": "nvmf_create_subsystem", 00:11:22.833 "req_id": 1 00:11:22.833 } 00:11:22.833 Got JSON-RPC error response 00:11:22.833 response: 00:11:22.833 { 00:11:22.833 "code": -32602, 00:11:22.833 "message": "Invalid cntlid range [1-0]" 00:11:22.833 }' 00:11:22.833 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@73 -- # [[ request: 00:11:22.833 { 00:11:22.833 "nqn": "nqn.2016-06.io.spdk:cnode24647", 00:11:22.833 "max_cntlid": 0, 00:11:22.833 "method": "nvmf_create_subsystem", 00:11:22.833 "req_id": 1 00:11:22.833 } 00:11:22.833 Got JSON-RPC error response 00:11:22.833 response: 00:11:22.833 { 00:11:22.833 "code": -32602, 00:11:22.833 "message": "Invalid cntlid range [1-0]" 00:11:22.833 } == *\I\n\v\a\l\i\d\ \c\n\t\l\i\d\ \r\a\n\g\e* ]] 00:11:22.833 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@74 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode22152 -I 65520 00:11:23.092 [2024-09-27 15:16:24.741570] nvmf_rpc.c: 434:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode22152: invalid cntlid range [1-65520] 00:11:23.092 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@74 -- # out='request: 00:11:23.092 { 00:11:23.092 "nqn": "nqn.2016-06.io.spdk:cnode22152", 00:11:23.092 "max_cntlid": 65520, 00:11:23.092 "method": "nvmf_create_subsystem", 00:11:23.092 "req_id": 1 00:11:23.092 } 00:11:23.092 Got JSON-RPC error response 00:11:23.092 response: 00:11:23.092 { 00:11:23.092 "code": -32602, 00:11:23.092 "message": "Invalid cntlid range [1-65520]" 00:11:23.092 }' 00:11:23.092 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@75 -- # [[ request: 00:11:23.092 { 00:11:23.092 "nqn": "nqn.2016-06.io.spdk:cnode22152", 00:11:23.092 "max_cntlid": 65520, 00:11:23.092 "method": "nvmf_create_subsystem", 00:11:23.092 "req_id": 1 00:11:23.092 } 00:11:23.092 Got JSON-RPC 
error response 00:11:23.092 response: 00:11:23.092 { 00:11:23.092 "code": -32602, 00:11:23.092 "message": "Invalid cntlid range [1-65520]" 00:11:23.092 } == *\I\n\v\a\l\i\d\ \c\n\t\l\i\d\ \r\a\n\g\e* ]] 00:11:23.092 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@78 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode15593 -i 6 -I 5 00:11:23.351 [2024-09-27 15:16:24.950305] nvmf_rpc.c: 434:rpc_nvmf_create_subsystem: *ERROR*: Subsystem nqn.2016-06.io.spdk:cnode15593: invalid cntlid range [6-5] 00:11:23.351 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@78 -- # out='request: 00:11:23.351 { 00:11:23.351 "nqn": "nqn.2016-06.io.spdk:cnode15593", 00:11:23.351 "min_cntlid": 6, 00:11:23.351 "max_cntlid": 5, 00:11:23.351 "method": "nvmf_create_subsystem", 00:11:23.351 "req_id": 1 00:11:23.351 } 00:11:23.351 Got JSON-RPC error response 00:11:23.351 response: 00:11:23.351 { 00:11:23.351 "code": -32602, 00:11:23.351 "message": "Invalid cntlid range [6-5]" 00:11:23.351 }' 00:11:23.351 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@79 -- # [[ request: 00:11:23.351 { 00:11:23.351 "nqn": "nqn.2016-06.io.spdk:cnode15593", 00:11:23.351 "min_cntlid": 6, 00:11:23.351 "max_cntlid": 5, 00:11:23.351 "method": "nvmf_create_subsystem", 00:11:23.351 "req_id": 1 00:11:23.351 } 00:11:23.351 Got JSON-RPC error response 00:11:23.351 response: 00:11:23.351 { 00:11:23.351 "code": -32602, 00:11:23.351 "message": "Invalid cntlid range [6-5]" 00:11:23.351 } == *\I\n\v\a\l\i\d\ \c\n\t\l\i\d\ \r\a\n\g\e* ]] 00:11:23.351 15:16:24 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@82 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/multitarget_rpc.py nvmf_delete_target --name foobar 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@82 -- # out='request: 00:11:23.351 { 00:11:23.351 "name": "foobar", 00:11:23.351 "method": "nvmf_delete_target", 00:11:23.351 "req_id": 1 00:11:23.351 } 00:11:23.351 Got JSON-RPC error response 00:11:23.351 response: 00:11:23.351 { 00:11:23.351 "code": -32602, 00:11:23.351 "message": "The specified target doesn'\''t exist, cannot delete it." 00:11:23.351 }' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@83 -- # [[ request: 00:11:23.351 { 00:11:23.351 "name": "foobar", 00:11:23.351 "method": "nvmf_delete_target", 00:11:23.351 "req_id": 1 00:11:23.351 } 00:11:23.351 Got JSON-RPC error response 00:11:23.351 response: 00:11:23.351 { 00:11:23.351 "code": -32602, 00:11:23.351 "message": "The specified target doesn't exist, cannot delete it." 
00:11:23.351 } == *\T\h\e\ \s\p\e\c\i\f\i\e\d\ \t\a\r\g\e\t\ \d\o\e\s\n\'\t\ \e\x\i\s\t\,\ \c\a\n\n\o\t\ \d\e\l\e\t\e\ \i\t\.* ]] 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@85 -- # trap - SIGINT SIGTERM EXIT 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- target/invalid.sh@86 -- # nvmftestfini 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@331 -- # nvmfcleanup 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@99 -- # sync 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@102 -- # set +e 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@103 -- # for i in {1..20} 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:11:23.351 rmmod nvme_rdma 00:11:23.351 rmmod nvme_fabrics 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@106 -- # set -e 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@107 -- # return 0 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@332 -- # '[' -n 1750386 ']' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@333 -- # killprocess 1750386 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@950 -- # '[' -z 1750386 ']' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@954 -- # kill -0 1750386 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@955 -- # uname 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1750386 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1750386' 00:11:23.351 killing process with pid 1750386 00:11:23.351 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@969 -- # kill 1750386 00:11:23.352 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@974 -- # wait 1750386 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@338 -- # nvmf_fini 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@264 -- # local dev 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@267 -- # remove_target_ns 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:23.919 15:16:25 
nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@268 -- # delete_main_bridge 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@130 -- # return 0 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@41 -- # _dev=0 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@41 -- # dev_map=() 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/setup.sh@284 -- # iptr 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@538 -- # iptables-save 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- nvmf/common.sh@538 -- # iptables-restore 00:11:23.919 00:11:23.919 real 0m12.082s 00:11:23.919 user 0m22.798s 00:11:23.919 sys 0m6.538s 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_invalid -- common/autotest_common.sh@10 -- # set +x 00:11:23.919 ************************************ 00:11:23.919 END TEST nvmf_invalid 00:11:23.919 ************************************ 
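The tail of the trace above is the standard teardown for this suite: nvmftestfini unloads the host-side NVMe fabrics modules, kills the nvmf_tgt application started for the test (pid 1750386 here), and nvmf_fini flushes the addresses that setup put on the two Mellanox ports and restores iptables without the SPDK_NVMF rules. A condensed sketch of those steps, assuming the target pid is already in hand; the variable handling below is an assumption:

    # Unload host-side NVMe/RDMA modules; they may already be gone, so tolerate failure.
    sync
    modprobe -v -r nvme-rdma    || true
    modprobe -v -r nvme-fabrics || true

    # Stop the nvmf_tgt reactor started for the test and reap it.
    kill "$nvmf_tgt_pid" && wait "$nvmf_tgt_pid"

    # Undo the interface setup and drop SPDK-owned firewall rules.
    ip addr flush dev mlx_0_1
    ip addr flush dev mlx_0_0
    iptables-save | grep -v SPDK_NVMF | iptables-restore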
00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@24 -- # run_test nvmf_connect_stress /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/connect_stress.sh --transport=rdma 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:11:23.919 ************************************ 00:11:23.919 START TEST nvmf_connect_stress 00:11:23.919 ************************************ 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/connect_stress.sh --transport=rdma 00:11:23.919 * Looking for test storage... 00:11:23.919 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1681 -- # lcov --version 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:11:23.919 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@333 -- # local ver1 ver1_l 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@334 -- # local ver2 ver2_l 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@336 -- # IFS=.-: 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@336 -- # read -ra ver1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@337 -- # IFS=.-: 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@337 -- # read -ra ver2 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@338 -- # local 'op=<' 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@340 -- # ver1_l=2 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@341 -- # ver2_l=1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@344 -- # case "$op" in 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@345 -- # : 1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@364 -- # (( v = 0 )) 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@365 -- # decimal 1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@353 -- # local d=1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@355 -- # echo 1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@365 -- # ver1[v]=1 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@366 -- # decimal 2 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@353 -- # local d=2 00:11:24.179 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@355 -- # echo 2 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@366 -- # ver2[v]=2 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@368 -- # return 0 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:11:24.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:24.180 --rc genhtml_branch_coverage=1 00:11:24.180 --rc genhtml_function_coverage=1 00:11:24.180 --rc genhtml_legend=1 00:11:24.180 --rc geninfo_all_blocks=1 00:11:24.180 --rc geninfo_unexecuted_blocks=1 00:11:24.180 00:11:24.180 ' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:11:24.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:24.180 --rc genhtml_branch_coverage=1 00:11:24.180 --rc genhtml_function_coverage=1 00:11:24.180 --rc genhtml_legend=1 00:11:24.180 --rc geninfo_all_blocks=1 00:11:24.180 --rc geninfo_unexecuted_blocks=1 00:11:24.180 00:11:24.180 ' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:11:24.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:24.180 --rc genhtml_branch_coverage=1 00:11:24.180 --rc genhtml_function_coverage=1 00:11:24.180 --rc genhtml_legend=1 00:11:24.180 --rc geninfo_all_blocks=1 00:11:24.180 --rc geninfo_unexecuted_blocks=1 00:11:24.180 00:11:24.180 ' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:11:24.180 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:24.180 --rc genhtml_branch_coverage=1 00:11:24.180 --rc genhtml_function_coverage=1 00:11:24.180 --rc genhtml_legend=1 00:11:24.180 --rc geninfo_all_blocks=1 00:11:24.180 --rc geninfo_unexecuted_blocks=1 00:11:24.180 00:11:24.180 ' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@10 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@7 -- # uname -s 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@15 -- # shopt -s extglob 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- paths/export.sh@5 -- # export PATH 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@50 -- # : 0 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:11:24.180 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer 
expression expected 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@54 -- # have_pci_nics=0 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@12 -- # nvmftestinit 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@292 -- # prepare_net_devs 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@254 -- # local -g is_hw=no 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@256 -- # remove_target_ns 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@125 -- # xtrace_disable 00:11:24.180 15:16:25 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@131 -- # pci_devs=() 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@131 -- # local -a pci_devs 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@132 -- # pci_net_devs=() 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@133 -- # pci_drivers=() 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@133 -- # local -A pci_drivers 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@135 -- # net_devs=() 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@135 -- # local -ga net_devs 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@136 -- # e810=() 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@136 -- # local -ga e810 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@137 -- # x722=() 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@137 -- # local -ga x722 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@138 -- # mlx=() 00:11:30.757 
15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@138 -- # local -ga mlx 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:11:30.757 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:30.757 15:16:32 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:11:30.757 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:30.757 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:11:30.758 Found net devices under 0000:18:00.0: mlx_0_0 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:11:30.758 Found net devices under 0000:18:00.1: mlx_0_1 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@249 -- # get_rdma_if_list 
00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@75 -- # rdma_devs=() 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@89 -- # continue 2 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@89 -- # continue 2 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@258 -- # is_hw=yes 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@61 -- # uname 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:11:30.758 
15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@65 -- # modprobe ib_cm 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@66 -- # modprobe ib_core 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@67 -- # modprobe ib_umad 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@69 -- # modprobe iw_cm 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@27 -- # local -gA dev_map 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@28 -- # local -g _dev 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@44 -- # ips=() 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:11:30.758 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@58 -- # key_initiator=target1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- 
nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@11 -- # local val=167772161 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:11:31.019 10.0.0.1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@11 -- # local val=167772162 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:11:31.019 10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 
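The set_ip steps above take an integer from the ip_pool (167772161 is 0x0a000001) and print it as a dotted quad before running ip addr add against the Mellanox ports. A minimal stand-alone sketch of that conversion, assuming the helper splits the value into octets with bit shifts (the function body below is illustrative, not the script verbatim):

    val_to_ip() {
        local val=$1
        # split the 32-bit value into four octets, highest byte first
        printf '%u.%u.%u.%u\n' \
            $(( (val >> 24) & 0xff )) \
            $(( (val >> 16) & 0xff )) \
            $(( (val >> 8)  & 0xff )) \
            $((  val        & 0xff ))
    }
    val_to_ip 167772161   # 10.0.0.1, assigned to mlx_0_0 above
    val_to_ip 167772162   # 10.0.0.2, assigned to mlx_0_1 above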
00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@38 -- # ping_ips 1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:31.019 15:16:32 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:31.019 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:11:31.019 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.030 ms 00:11:31.019 00:11:31.019 --- 10.0.0.2 ping statistics --- 00:11:31.019 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:31.019 rtt min/avg/max/mdev = 0.030/0.030/0.030/0.000 ms 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:11:31.019 15:16:32 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:31.019 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:31.019 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:11:31.019 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.028 ms 00:11:31.019 00:11:31.019 --- 10.0.0.2 ping statistics --- 00:11:31.020 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:31.020 rtt min/avg/max/mdev = 0.028/0.028/0.028/0.000 ms 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@98 -- # (( pair++ )) 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@266 -- # return 0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@334 -- # 
NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@107 -- # local dev=target1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@107 -- # local dev=target0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # eval ' cat 
/sys/class/net/mlx_0_1/ifalias' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@107 -- # local dev=target1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@13 -- # nvmfappstart -m 0xE 00:11:31.020 15:16:32 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@724 -- # xtrace_disable 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@324 -- # nvmfpid=1754201 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@325 -- # waitforlisten 1754201 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@831 -- # '[' -z 1754201 ']' 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:31.020 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:31.020 15:16:32 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:31.280 [2024-09-27 15:16:32.883494] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:11:31.280 [2024-09-27 15:16:32.883555] [ DPDK EAL parameters: nvmf -c 0xE --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:31.280 [2024-09-27 15:16:32.969500] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:11:31.280 [2024-09-27 15:16:33.055743] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:11:31.280 [2024-09-27 15:16:33.055790] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:11:31.280 [2024-09-27 15:16:33.055800] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:11:31.280 [2024-09-27 15:16:33.055809] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:11:31.280 [2024-09-27 15:16:33.055816] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
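nvmfappstart in the trace above amounts to launching nvmf_tgt with the requested core mask and blocking until its RPC socket answers. A simplified equivalent, with the binary path and flags copied from the trace; the polling loop is an assumption (the real waitforlisten helper in autotest_common.sh is more involved):

    /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE &
    nvmfpid=$!
    # poll the default RPC socket until the target is ready to accept commands
    until scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done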
00:11:31.280 [2024-09-27 15:16:33.055943] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:11:31.280 [2024-09-27 15:16:33.056031] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:31.280 [2024-09-27 15:16:33.056032] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@864 -- # return 0 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@730 -- # xtrace_disable 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@15 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.218 [2024-09-27 15:16:33.848619] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x102bc10/0x1030100) succeed. 00:11:32.218 [2024-09-27 15:16:33.859607] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x102d1b0/0x10717a0) succeed. 
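The rpc_cmd calls traced here and in the entries that follow correspond to plain rpc.py invocations; collected together as a sketch (arguments copied from the trace, rpc.py assumed to be invoked from the SPDK checkout):

    scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
    scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10
    scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
    scripts/rpc.py bdev_null_create NULL1 1000 512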
00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@16 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@17 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.218 [2024-09-27 15:16:33.969455] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@18 -- # rpc_cmd bdev_null_create NULL1 1000 512 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.218 NULL1 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@21 -- # PERF_PID=1754402 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@23 -- # rpcs=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpc.txt 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@20 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvme/connect_stress/connect_stress -c 0x1 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:cnode1' -t 10 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@25 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpc.txt 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # seq 1 20 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:33 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.218 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@27 -- # for i in $(seq 1 20) 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@28 -- # cat 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.480 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.739 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.739 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:32.739 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:32.739 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.739 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:32.999 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:32.999 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:32.999 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:32.999 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:32.999 15:16:34 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:33.258 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:33.258 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:33.258 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:33.258 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:33.258 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:33.828 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:33.828 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:33.828 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:33.828 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:33.828 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:34.088 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:34.088 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 
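The long run of near-identical kill -0 1754402 entries that begins here is the test's liveness loop: as long as the connect_stress tool (PERF_PID 1754402) is still running, the script keeps replaying the RPCs it queued into rpc.txt via the cat steps above. A simplified sketch of that loop; feeding the batch file to rpc_cmd on stdin is an assumption, since the trace elides the loop body:

    while kill -0 "$PERF_PID" 2>/dev/null; do
        rpc_cmd < "$rpcs"    # $rpcs points at .../test/nvmf/target/rpc.txt built above
    done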
00:11:34.088 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:34.088 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:34.088 15:16:35 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:34.348 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:34.348 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:34.348 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:34.348 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:34.348 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:34.608 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:34.608 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:34.608 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:34.608 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:34.608 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:34.867 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:34.867 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:34.867 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:34.867 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:34.867 15:16:36 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:35.196 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.196 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:35.196 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:35.196 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.196 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:35.765 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:35.765 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:35.765 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:35.765 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:35.765 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:36.025 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:36.025 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill 
-0 1754402 00:11:36.025 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:36.025 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:36.025 15:16:37 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:36.285 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:36.285 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:36.285 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:36.285 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:36.285 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:36.544 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:36.544 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:36.544 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:36.544 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:36.544 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:37.113 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:37.113 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:37.113 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:37.113 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:37.113 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:37.373 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:37.373 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:37.373 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:37.373 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:37.373 15:16:38 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:37.633 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:37.633 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:37.633 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:37.633 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:37.633 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:37.892 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:37.892 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 
-- # kill -0 1754402 00:11:37.892 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:37.892 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:37.892 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:38.152 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:38.152 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:38.152 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:38.152 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:38.152 15:16:39 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:38.721 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:38.721 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:38.721 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:38.721 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:38.721 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:38.981 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:38.981 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:38.981 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:38.981 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:38.981 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:39.240 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:39.240 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:39.240 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:39.240 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:39.240 15:16:40 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:39.499 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:39.499 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:39.499 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:39.499 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:39.499 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:40.067 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:40.067 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- 
target/connect_stress.sh@34 -- # kill -0 1754402 00:11:40.067 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:40.067 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:40.068 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:40.327 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:40.327 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:40.327 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:40.327 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:40.327 15:16:41 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:40.586 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:40.586 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:40.586 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:40.586 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:40.586 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:40.846 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:40.846 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:40.846 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:40.846 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:40.846 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:41.106 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:41.106 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:41.106 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:41.106 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:41.106 15:16:42 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:41.675 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:41.675 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:41.675 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:41.675 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:41.675 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:41.935 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:41.935 15:16:43 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:41.935 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:41.935 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:41.935 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:42.195 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:42.195 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:42.195 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@35 -- # rpc_cmd 00:11:42.195 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:42.195 15:16:43 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:42.506 Testing NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@34 -- # kill -0 1754402 00:11:42.506 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/connect_stress.sh: line 34: kill: (1754402) - No such process 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@38 -- # wait 1754402 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@39 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpc.txt 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@41 -- # trap - SIGINT SIGTERM EXIT 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- target/connect_stress.sh@43 -- # nvmftestfini 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@331 -- # nvmfcleanup 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@99 -- # sync 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@102 -- # set +e 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@103 -- # for i in {1..20} 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:11:42.506 rmmod nvme_rdma 00:11:42.506 rmmod nvme_fabrics 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@106 -- # set -e 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@107 -- # return 0 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@332 -- # '[' -n 1754201 ']' 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@333 -- # killprocess 1754201 00:11:42.506 15:16:44 
nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@950 -- # '[' -z 1754201 ']' 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@954 -- # kill -0 1754201 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@955 -- # uname 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:42.506 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1754201 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1754201' 00:11:42.819 killing process with pid 1754201 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@969 -- # kill 1754201 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@974 -- # wait 1754201 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@338 -- # nvmf_fini 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@264 -- # local dev 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@267 -- # remove_target_ns 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@268 -- # delete_main_bridge 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@130 -- # return 0 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:11:42.819 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- 
nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@41 -- # _dev=0 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@41 -- # dev_map=() 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/setup.sh@284 -- # iptr 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@538 -- # iptables-save 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- nvmf/common.sh@538 -- # iptables-restore 00:11:43.079 00:11:43.079 real 0m19.084s 00:11:43.079 user 0m42.427s 00:11:43.079 sys 0m8.004s 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_connect_stress -- common/autotest_common.sh@10 -- # set +x 00:11:43.079 ************************************ 00:11:43.079 END TEST nvmf_connect_stress 00:11:43.079 ************************************ 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@25 -- # run_test nvmf_fused_ordering /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/fused_ordering.sh --transport=rdma 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:11:43.079 ************************************ 00:11:43.079 START TEST nvmf_fused_ordering 00:11:43.079 ************************************ 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/fused_ordering.sh --transport=rdma 00:11:43.079 * Looking for test storage... 
00:11:43.079 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1681 -- # lcov --version 00:11:43.079 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@333 -- # local ver1 ver1_l 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@334 -- # local ver2 ver2_l 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@336 -- # IFS=.-: 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@336 -- # read -ra ver1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@337 -- # IFS=.-: 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@337 -- # read -ra ver2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@338 -- # local 'op=<' 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@340 -- # ver1_l=2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@341 -- # ver2_l=1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@344 -- # case "$op" in 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@345 -- # : 1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@364 -- # (( v = 0 )) 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@365 -- # decimal 1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@353 -- # local d=1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@355 -- # echo 1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@365 -- # ver1[v]=1 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@366 -- # decimal 2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@353 -- # local d=2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@355 -- # echo 2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@366 -- # ver2[v]=2 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@368 -- # return 0 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:11:43.339 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:11:43.339 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:43.339 --rc genhtml_branch_coverage=1 00:11:43.339 --rc genhtml_function_coverage=1 00:11:43.339 --rc genhtml_legend=1 00:11:43.339 --rc geninfo_all_blocks=1 00:11:43.340 --rc geninfo_unexecuted_blocks=1 00:11:43.340 00:11:43.340 ' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:11:43.340 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:43.340 --rc genhtml_branch_coverage=1 00:11:43.340 --rc genhtml_function_coverage=1 00:11:43.340 --rc genhtml_legend=1 00:11:43.340 --rc geninfo_all_blocks=1 00:11:43.340 --rc geninfo_unexecuted_blocks=1 00:11:43.340 00:11:43.340 ' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:11:43.340 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:43.340 --rc genhtml_branch_coverage=1 00:11:43.340 --rc genhtml_function_coverage=1 00:11:43.340 --rc genhtml_legend=1 00:11:43.340 --rc geninfo_all_blocks=1 00:11:43.340 --rc geninfo_unexecuted_blocks=1 00:11:43.340 00:11:43.340 ' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:11:43.340 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:43.340 --rc genhtml_branch_coverage=1 00:11:43.340 --rc genhtml_function_coverage=1 00:11:43.340 --rc genhtml_legend=1 00:11:43.340 --rc geninfo_all_blocks=1 00:11:43.340 --rc geninfo_unexecuted_blocks=1 00:11:43.340 00:11:43.340 ' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@10 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@7 -- # uname -s 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@15 -- # shopt -s extglob 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- paths/export.sh@5 -- # export PATH 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@50 -- # : 0 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:11:43.340 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer 
expression expected 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@54 -- # have_pci_nics=0 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@12 -- # nvmftestinit 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@292 -- # prepare_net_devs 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@254 -- # local -g is_hw=no 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@256 -- # remove_target_ns 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@125 -- # xtrace_disable 00:11:43.340 15:16:44 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:49.914 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:11:49.914 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@131 -- # pci_devs=() 00:11:49.914 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@131 -- # local -a pci_devs 00:11:49.914 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@132 -- # pci_net_devs=() 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@133 -- # pci_drivers=() 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@133 -- # local -A pci_drivers 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@135 -- # net_devs=() 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@135 -- # local -ga net_devs 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@136 -- # e810=() 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@136 -- # local -ga e810 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@137 -- # x722=() 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@137 -- # local -ga x722 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@138 -- # mlx=() 00:11:49.915 
15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@138 -- # local -ga mlx 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:11:49.915 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:49.915 15:16:51 
nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:11:49.915 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:11:49.915 Found net devices under 0000:18:00.0: mlx_0_0 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:11:49.915 Found net devices under 0000:18:00.1: mlx_0_1 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@249 -- # get_rdma_if_list 
00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@75 -- # rdma_devs=() 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@89 -- # continue 2 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@89 -- # continue 2 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@258 -- # is_hw=yes 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@61 -- # uname 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:11:49.915 
15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@65 -- # modprobe ib_cm 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@66 -- # modprobe ib_core 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@67 -- # modprobe ib_umad 00:11:49.915 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@69 -- # modprobe iw_cm 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@27 -- # local -gA dev_map 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@28 -- # local -g _dev 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@44 -- # ips=() 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@58 -- # key_initiator=target1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- 
nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@11 -- # local val=167772161 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:11:50.177 10.0.0.1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@11 -- # local val=167772162 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:11:50.177 10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 
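The set_ip steps traced above pull an address out of the integer pool (167772161 is 0x0A000001), render it as a dotted quad with printf, assign it to the device, and record it in the interface's ifalias. A minimal stand-alone sketch of that conversion follows, assuming nothing beyond ordinary shell arithmetic; it mirrors the behaviour seen in the trace and is not taken verbatim from the SPDK helpers.

#!/usr/bin/env bash
# Illustrative sketch of the val_to_ip/set_ip behaviour seen in the trace
# (167772161 -> 10.0.0.1, 167772162 -> 10.0.0.2). The device name and /24
# prefix length below are assumptions chosen to match this example.
val_to_ip() {
    local val=$1
    printf '%u.%u.%u.%u\n' \
        $(( (val >> 24) & 0xff )) \
        $(( (val >> 16) & 0xff )) \
        $(( (val >>  8) & 0xff )) \
        $((  val        & 0xff ))
}

dev=mlx_0_0
ip=$(val_to_ip 167772161)                         # -> 10.0.0.1
ip addr add "${ip}/24" dev "$dev"                 # same command the trace runs
echo "$ip" | tee "/sys/class/net/${dev}/ifalias"  # later read back by get_ip_address
ip link set "$dev" up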
00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@38 -- # ping_ips 1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@107 -- # local dev=target0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:50.177 15:16:51 
nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:50.177 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:11:50.177 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.030 ms 00:11:50.177 00:11:50.177 --- 10.0.0.2 ping statistics --- 00:11:50.177 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:50.177 rtt min/avg/max/mdev = 0.030/0.030/0.030/0.000 ms 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@107 -- # local dev=target0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:11:50.177 15:16:51 
nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:50.177 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:11:50.177 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:11:50.177 00:11:50.177 --- 10.0.0.2 ping statistics --- 00:11:50.177 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:50.177 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@98 -- # (( pair++ )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@266 -- # return 0 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:11:50.177 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@107 -- # local dev=target0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@334 -- # 
NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@107 -- # local dev=target1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@107 -- # local dev=target0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # eval ' cat 
/sys/class/net/mlx_0_1/ifalias' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@107 -- # local dev=target1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:11:50.178 15:16:51 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:11:50.178 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@13 -- # nvmfappstart -m 0x2 00:11:50.178 15:16:52 
nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:11:50.178 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@724 -- # xtrace_disable 00:11:50.178 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@324 -- # nvmfpid=1758749 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@325 -- # waitforlisten 1758749 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@831 -- # '[' -z 1758749 ']' 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:50.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:50.438 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:50.438 [2024-09-27 15:16:52.081238] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:11:50.438 [2024-09-27 15:16:52.081299] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:50.438 [2024-09-27 15:16:52.166145] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:50.438 [2024-09-27 15:16:52.251896] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:11:50.438 [2024-09-27 15:16:52.251943] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:11:50.438 [2024-09-27 15:16:52.251954] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:11:50.438 [2024-09-27 15:16:52.251962] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:11:50.438 [2024-09-27 15:16:52.251970] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
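In the trace, nvmfappstart launches build/bin/nvmf_tgt in the background, records its pid in nvmfpid, and waitforlisten then blocks until the target is up and listening on the UNIX-domain RPC socket /var/tmp/spdk.sock. A rough sketch of that start-and-wait pattern is shown below; the relative binary path, retry count, and the simple socket-presence check are assumptions for illustration, not the full waitforlisten logic from autotest_common.sh.

# Illustrative only: the real helper also confirms the RPC server answers,
# not merely that the socket file exists.
app=./build/bin/nvmf_tgt
rpc_sock=/var/tmp/spdk.sock

"$app" -i 0 -m 0x2 &
pid=$!

for _ in $(seq 1 100); do
    # Bail out if the target died during start-up.
    kill -0 "$pid" 2>/dev/null || { echo "nvmf_tgt exited early" >&2; exit 1; }
    # Treat the socket file appearing as the readiness signal for this sketch.
    [ -S "$rpc_sock" ] && break
    sleep 0.1
done
[ -S "$rpc_sock" ] || { echo "timed out waiting for $rpc_sock" >&2; exit 1; }
echo "nvmf_tgt (pid $pid) is listening on $rpc_sock"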
00:11:50.438 [2024-09-27 15:16:52.251995] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@864 -- # return 0 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@730 -- # xtrace_disable 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@15 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:51.390 15:16:52 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 [2024-09-27 15:16:53.003817] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1186540/0x118aa30) succeed. 00:11:51.390 [2024-09-27 15:16:53.012758] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1187a40/0x11cc0d0) succeed. 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@16 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 10 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@17 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 [2024-09-27 15:16:53.088428] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@18 -- # rpc_cmd bdev_null_create NULL1 1000 512 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 NULL1 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- 
target/fused_ordering.sh@19 -- # rpc_cmd bdev_wait_for_examine 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@20 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 NULL1 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@561 -- # xtrace_disable 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:11:51.390 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@22 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvme/fused_ordering/fused_ordering -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:cnode1' 00:11:51.390 [2024-09-27 15:16:53.144251] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:11:51.390 [2024-09-27 15:16:53.144292] [ DPDK EAL parameters: fused_ordering --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1758827 ] 00:11:51.652 Attached to nqn.2016-06.io.spdk:cnode1 00:11:51.652 Namespace ID: 1 size: 1GB 00:11:51.652 fused_ordering(0) 00:11:51.652 fused_ordering(1) 00:11:51.652 fused_ordering(2) 00:11:51.652 fused_ordering(3) 00:11:51.653 fused_ordering(4) 00:11:51.653 fused_ordering(5) 00:11:51.653 fused_ordering(6) 00:11:51.653 fused_ordering(7) 00:11:51.653 fused_ordering(8) 00:11:51.653 fused_ordering(9) 00:11:51.653 fused_ordering(10) 00:11:51.653 fused_ordering(11) 00:11:51.653 fused_ordering(12) 00:11:51.653 fused_ordering(13) 00:11:51.653 fused_ordering(14) 00:11:51.653 fused_ordering(15) 00:11:51.653 fused_ordering(16) 00:11:51.653 fused_ordering(17) 00:11:51.653 fused_ordering(18) 00:11:51.653 fused_ordering(19) 00:11:51.653 fused_ordering(20) 00:11:51.653 fused_ordering(21) 00:11:51.653 fused_ordering(22) 00:11:51.653 fused_ordering(23) 00:11:51.653 fused_ordering(24) 00:11:51.653 fused_ordering(25) 00:11:51.653 fused_ordering(26) 00:11:51.653 fused_ordering(27) 00:11:51.653 fused_ordering(28) 00:11:51.653 fused_ordering(29) 00:11:51.653 fused_ordering(30) 00:11:51.653 fused_ordering(31) 00:11:51.653 fused_ordering(32) 00:11:51.653 fused_ordering(33) 00:11:51.653 fused_ordering(34) 00:11:51.653 fused_ordering(35) 00:11:51.653 fused_ordering(36) 00:11:51.653 fused_ordering(37) 00:11:51.653 fused_ordering(38) 00:11:51.653 fused_ordering(39) 00:11:51.653 fused_ordering(40) 00:11:51.653 fused_ordering(41) 00:11:51.653 fused_ordering(42) 00:11:51.653 fused_ordering(43) 00:11:51.653 fused_ordering(44) 00:11:51.653 fused_ordering(45) 00:11:51.653 fused_ordering(46) 00:11:51.653 fused_ordering(47) 00:11:51.653 fused_ordering(48) 00:11:51.653 fused_ordering(49) 00:11:51.653 fused_ordering(50) 00:11:51.653 fused_ordering(51) 00:11:51.653 fused_ordering(52) 00:11:51.653 fused_ordering(53) 00:11:51.653 fused_ordering(54) 00:11:51.653 fused_ordering(55) 00:11:51.653 
fused_ordering(56) 00:11:51.653 fused_ordering(57) 00:11:51.653 fused_ordering(58) 00:11:51.653 fused_ordering(59) 00:11:51.653 fused_ordering(60) 00:11:51.653 fused_ordering(61) 00:11:51.653 fused_ordering(62) 00:11:51.653 fused_ordering(63) 00:11:51.653 fused_ordering(64) 00:11:51.653 fused_ordering(65) 00:11:51.653 fused_ordering(66) 00:11:51.653 fused_ordering(67) 00:11:51.653 fused_ordering(68) 00:11:51.653 fused_ordering(69) 00:11:51.653 fused_ordering(70) 00:11:51.653 fused_ordering(71) 00:11:51.653 fused_ordering(72) 00:11:51.653 fused_ordering(73) 00:11:51.653 fused_ordering(74) 00:11:51.653 fused_ordering(75) 00:11:51.653 fused_ordering(76) 00:11:51.653 fused_ordering(77) 00:11:51.653 fused_ordering(78) 00:11:51.653 fused_ordering(79) 00:11:51.653 fused_ordering(80) 00:11:51.653 fused_ordering(81) 00:11:51.653 fused_ordering(82) 00:11:51.653 fused_ordering(83) 00:11:51.653 fused_ordering(84) 00:11:51.653 fused_ordering(85) 00:11:51.653 fused_ordering(86) 00:11:51.653 fused_ordering(87) 00:11:51.653 fused_ordering(88) 00:11:51.653 fused_ordering(89) 00:11:51.653 fused_ordering(90) 00:11:51.653 fused_ordering(91) 00:11:51.653 fused_ordering(92) 00:11:51.653 fused_ordering(93) 00:11:51.653 fused_ordering(94) 00:11:51.653 fused_ordering(95) 00:11:51.653 fused_ordering(96) 00:11:51.653 fused_ordering(97) 00:11:51.653 fused_ordering(98) 00:11:51.653 fused_ordering(99) 00:11:51.653 fused_ordering(100) 00:11:51.653 fused_ordering(101) 00:11:51.653 fused_ordering(102) 00:11:51.653 fused_ordering(103) 00:11:51.653 fused_ordering(104) 00:11:51.653 fused_ordering(105) 00:11:51.653 fused_ordering(106) 00:11:51.653 fused_ordering(107) 00:11:51.653 fused_ordering(108) 00:11:51.653 fused_ordering(109) 00:11:51.653 fused_ordering(110) 00:11:51.653 fused_ordering(111) 00:11:51.653 fused_ordering(112) 00:11:51.653 fused_ordering(113) 00:11:51.653 fused_ordering(114) 00:11:51.653 fused_ordering(115) 00:11:51.653 fused_ordering(116) 00:11:51.653 fused_ordering(117) 00:11:51.653 fused_ordering(118) 00:11:51.653 fused_ordering(119) 00:11:51.653 fused_ordering(120) 00:11:51.653 fused_ordering(121) 00:11:51.653 fused_ordering(122) 00:11:51.653 fused_ordering(123) 00:11:51.653 fused_ordering(124) 00:11:51.653 fused_ordering(125) 00:11:51.653 fused_ordering(126) 00:11:51.653 fused_ordering(127) 00:11:51.653 fused_ordering(128) 00:11:51.653 fused_ordering(129) 00:11:51.653 fused_ordering(130) 00:11:51.653 fused_ordering(131) 00:11:51.653 fused_ordering(132) 00:11:51.653 fused_ordering(133) 00:11:51.653 fused_ordering(134) 00:11:51.653 fused_ordering(135) 00:11:51.653 fused_ordering(136) 00:11:51.653 fused_ordering(137) 00:11:51.653 fused_ordering(138) 00:11:51.653 fused_ordering(139) 00:11:51.653 fused_ordering(140) 00:11:51.653 fused_ordering(141) 00:11:51.653 fused_ordering(142) 00:11:51.653 fused_ordering(143) 00:11:51.653 fused_ordering(144) 00:11:51.653 fused_ordering(145) 00:11:51.653 fused_ordering(146) 00:11:51.653 fused_ordering(147) 00:11:51.653 fused_ordering(148) 00:11:51.653 fused_ordering(149) 00:11:51.653 fused_ordering(150) 00:11:51.653 fused_ordering(151) 00:11:51.653 fused_ordering(152) 00:11:51.653 fused_ordering(153) 00:11:51.653 fused_ordering(154) 00:11:51.653 fused_ordering(155) 00:11:51.653 fused_ordering(156) 00:11:51.653 fused_ordering(157) 00:11:51.653 fused_ordering(158) 00:11:51.653 fused_ordering(159) 00:11:51.653 fused_ordering(160) 00:11:51.653 fused_ordering(161) 00:11:51.653 fused_ordering(162) 00:11:51.653 fused_ordering(163) 00:11:51.653 fused_ordering(164) 00:11:51.653 
fused_ordering(165) 00:11:51.653 fused_ordering(166) 00:11:51.653 fused_ordering(167) 00:11:51.653 fused_ordering(168) 00:11:51.653 fused_ordering(169) 00:11:51.653 fused_ordering(170) 00:11:51.653 fused_ordering(171) 00:11:51.653 fused_ordering(172) 00:11:51.653 fused_ordering(173) 00:11:51.653 fused_ordering(174) 00:11:51.653 fused_ordering(175) 00:11:51.653 fused_ordering(176) 00:11:51.653 fused_ordering(177) 00:11:51.653 fused_ordering(178) 00:11:51.653 fused_ordering(179) 00:11:51.653 fused_ordering(180) 00:11:51.653 fused_ordering(181) 00:11:51.653 fused_ordering(182) 00:11:51.653 fused_ordering(183) 00:11:51.653 fused_ordering(184) 00:11:51.653 fused_ordering(185) 00:11:51.653 fused_ordering(186) 00:11:51.653 fused_ordering(187) 00:11:51.653 fused_ordering(188) 00:11:51.653 fused_ordering(189) 00:11:51.653 fused_ordering(190) 00:11:51.653 fused_ordering(191) 00:11:51.653 fused_ordering(192) 00:11:51.653 fused_ordering(193) 00:11:51.653 fused_ordering(194) 00:11:51.653 fused_ordering(195) 00:11:51.653 fused_ordering(196) 00:11:51.653 fused_ordering(197) 00:11:51.653 fused_ordering(198) 00:11:51.653 fused_ordering(199) 00:11:51.653 fused_ordering(200) 00:11:51.653 fused_ordering(201) 00:11:51.653 fused_ordering(202) 00:11:51.653 fused_ordering(203) 00:11:51.653 fused_ordering(204) 00:11:51.653 fused_ordering(205) 00:11:51.653 fused_ordering(206) 00:11:51.653 fused_ordering(207) 00:11:51.653 fused_ordering(208) 00:11:51.653 fused_ordering(209) 00:11:51.653 fused_ordering(210) 00:11:51.653 fused_ordering(211) 00:11:51.653 fused_ordering(212) 00:11:51.653 fused_ordering(213) 00:11:51.653 fused_ordering(214) 00:11:51.653 fused_ordering(215) 00:11:51.653 fused_ordering(216) 00:11:51.653 fused_ordering(217) 00:11:51.653 fused_ordering(218) 00:11:51.653 fused_ordering(219) 00:11:51.653 fused_ordering(220) 00:11:51.653 fused_ordering(221) 00:11:51.653 fused_ordering(222) 00:11:51.653 fused_ordering(223) 00:11:51.653 fused_ordering(224) 00:11:51.653 fused_ordering(225) 00:11:51.653 fused_ordering(226) 00:11:51.653 fused_ordering(227) 00:11:51.653 fused_ordering(228) 00:11:51.653 fused_ordering(229) 00:11:51.653 fused_ordering(230) 00:11:51.653 fused_ordering(231) 00:11:51.653 fused_ordering(232) 00:11:51.653 fused_ordering(233) 00:11:51.653 fused_ordering(234) 00:11:51.653 fused_ordering(235) 00:11:51.653 fused_ordering(236) 00:11:51.653 fused_ordering(237) 00:11:51.653 fused_ordering(238) 00:11:51.653 fused_ordering(239) 00:11:51.653 fused_ordering(240) 00:11:51.653 fused_ordering(241) 00:11:51.653 fused_ordering(242) 00:11:51.653 fused_ordering(243) 00:11:51.653 fused_ordering(244) 00:11:51.653 fused_ordering(245) 00:11:51.653 fused_ordering(246) 00:11:51.653 fused_ordering(247) 00:11:51.653 fused_ordering(248) 00:11:51.653 fused_ordering(249) 00:11:51.653 fused_ordering(250) 00:11:51.653 fused_ordering(251) 00:11:51.653 fused_ordering(252) 00:11:51.653 fused_ordering(253) 00:11:51.653 fused_ordering(254) 00:11:51.653 fused_ordering(255) 00:11:51.653 fused_ordering(256) 00:11:51.653 fused_ordering(257) 00:11:51.653 fused_ordering(258) 00:11:51.653 fused_ordering(259) 00:11:51.653 fused_ordering(260) 00:11:51.653 fused_ordering(261) 00:11:51.653 fused_ordering(262) 00:11:51.653 fused_ordering(263) 00:11:51.653 fused_ordering(264) 00:11:51.653 fused_ordering(265) 00:11:51.653 fused_ordering(266) 00:11:51.653 fused_ordering(267) 00:11:51.653 fused_ordering(268) 00:11:51.653 fused_ordering(269) 00:11:51.653 fused_ordering(270) 00:11:51.653 fused_ordering(271) 00:11:51.653 fused_ordering(272) 
00:11:51.653 fused_ordering(273) 00:11:51.653 fused_ordering(274) 00:11:51.653 fused_ordering(275) 00:11:51.653 fused_ordering(276) 00:11:51.653 fused_ordering(277) 00:11:51.653 fused_ordering(278) 00:11:51.653 fused_ordering(279) 00:11:51.653 fused_ordering(280) 00:11:51.653 fused_ordering(281) 00:11:51.653 fused_ordering(282) 00:11:51.653 fused_ordering(283) 00:11:51.654 fused_ordering(284) 00:11:51.654 fused_ordering(285) 00:11:51.654 fused_ordering(286) 00:11:51.654 fused_ordering(287) 00:11:51.654 fused_ordering(288) 00:11:51.654 fused_ordering(289) 00:11:51.654 fused_ordering(290) 00:11:51.654 fused_ordering(291) 00:11:51.654 fused_ordering(292) 00:11:51.654 fused_ordering(293) 00:11:51.654 fused_ordering(294) 00:11:51.654 fused_ordering(295) 00:11:51.654 fused_ordering(296) 00:11:51.654 fused_ordering(297) 00:11:51.654 fused_ordering(298) 00:11:51.654 fused_ordering(299) 00:11:51.654 fused_ordering(300) 00:11:51.654 fused_ordering(301) 00:11:51.654 fused_ordering(302) 00:11:51.654 fused_ordering(303) 00:11:51.654 fused_ordering(304) 00:11:51.654 fused_ordering(305) 00:11:51.654 fused_ordering(306) 00:11:51.654 fused_ordering(307) 00:11:51.654 fused_ordering(308) 00:11:51.654 fused_ordering(309) 00:11:51.654 fused_ordering(310) 00:11:51.654 fused_ordering(311) 00:11:51.654 fused_ordering(312) 00:11:51.654 fused_ordering(313) 00:11:51.654 fused_ordering(314) 00:11:51.654 fused_ordering(315) 00:11:51.654 fused_ordering(316) 00:11:51.654 fused_ordering(317) 00:11:51.654 fused_ordering(318) 00:11:51.654 fused_ordering(319) 00:11:51.654 fused_ordering(320) 00:11:51.654 fused_ordering(321) 00:11:51.654 fused_ordering(322) 00:11:51.654 fused_ordering(323) 00:11:51.654 fused_ordering(324) 00:11:51.654 fused_ordering(325) 00:11:51.654 fused_ordering(326) 00:11:51.654 fused_ordering(327) 00:11:51.654 fused_ordering(328) 00:11:51.654 fused_ordering(329) 00:11:51.654 fused_ordering(330) 00:11:51.654 fused_ordering(331) 00:11:51.654 fused_ordering(332) 00:11:51.654 fused_ordering(333) 00:11:51.654 fused_ordering(334) 00:11:51.654 fused_ordering(335) 00:11:51.654 fused_ordering(336) 00:11:51.654 fused_ordering(337) 00:11:51.654 fused_ordering(338) 00:11:51.654 fused_ordering(339) 00:11:51.654 fused_ordering(340) 00:11:51.654 fused_ordering(341) 00:11:51.654 fused_ordering(342) 00:11:51.654 fused_ordering(343) 00:11:51.654 fused_ordering(344) 00:11:51.654 fused_ordering(345) 00:11:51.654 fused_ordering(346) 00:11:51.654 fused_ordering(347) 00:11:51.654 fused_ordering(348) 00:11:51.654 fused_ordering(349) 00:11:51.654 fused_ordering(350) 00:11:51.654 fused_ordering(351) 00:11:51.654 fused_ordering(352) 00:11:51.654 fused_ordering(353) 00:11:51.654 fused_ordering(354) 00:11:51.654 fused_ordering(355) 00:11:51.654 fused_ordering(356) 00:11:51.654 fused_ordering(357) 00:11:51.654 fused_ordering(358) 00:11:51.654 fused_ordering(359) 00:11:51.654 fused_ordering(360) 00:11:51.654 fused_ordering(361) 00:11:51.654 fused_ordering(362) 00:11:51.654 fused_ordering(363) 00:11:51.654 fused_ordering(364) 00:11:51.654 fused_ordering(365) 00:11:51.654 fused_ordering(366) 00:11:51.654 fused_ordering(367) 00:11:51.654 fused_ordering(368) 00:11:51.654 fused_ordering(369) 00:11:51.654 fused_ordering(370) 00:11:51.654 fused_ordering(371) 00:11:51.654 fused_ordering(372) 00:11:51.654 fused_ordering(373) 00:11:51.654 fused_ordering(374) 00:11:51.654 fused_ordering(375) 00:11:51.654 fused_ordering(376) 00:11:51.654 fused_ordering(377) 00:11:51.654 fused_ordering(378) 00:11:51.654 fused_ordering(379) 00:11:51.654 
fused_ordering(380) 00:11:51.654 fused_ordering(381) 00:11:51.654 fused_ordering(382) 00:11:51.654 fused_ordering(383) 00:11:51.654 fused_ordering(384) 00:11:51.654 fused_ordering(385) 00:11:51.654 fused_ordering(386) 00:11:51.654 fused_ordering(387) 00:11:51.654 fused_ordering(388) 00:11:51.654 fused_ordering(389) 00:11:51.654 fused_ordering(390) 00:11:51.654 fused_ordering(391) 00:11:51.654 fused_ordering(392) 00:11:51.654 fused_ordering(393) 00:11:51.654 fused_ordering(394) 00:11:51.654 fused_ordering(395) 00:11:51.654 fused_ordering(396) 00:11:51.654 fused_ordering(397) 00:11:51.654 fused_ordering(398) 00:11:51.654 fused_ordering(399) 00:11:51.654 fused_ordering(400) 00:11:51.654 fused_ordering(401) 00:11:51.654 fused_ordering(402) 00:11:51.654 fused_ordering(403) 00:11:51.654 fused_ordering(404) 00:11:51.654 fused_ordering(405) 00:11:51.654 fused_ordering(406) 00:11:51.654 fused_ordering(407) 00:11:51.654 fused_ordering(408) 00:11:51.654 fused_ordering(409) 00:11:51.654 fused_ordering(410) 00:11:51.914 fused_ordering(411) 00:11:51.914 fused_ordering(412) 00:11:51.914 fused_ordering(413) 00:11:51.914 fused_ordering(414) 00:11:51.914 fused_ordering(415) 00:11:51.914 fused_ordering(416) 00:11:51.914 fused_ordering(417) 00:11:51.914 fused_ordering(418) 00:11:51.914 fused_ordering(419) 00:11:51.914 fused_ordering(420) 00:11:51.914 fused_ordering(421) 00:11:51.914 fused_ordering(422) 00:11:51.914 fused_ordering(423) 00:11:51.914 fused_ordering(424) 00:11:51.914 fused_ordering(425) 00:11:51.914 fused_ordering(426) 00:11:51.914 fused_ordering(427) 00:11:51.914 fused_ordering(428) 00:11:51.914 fused_ordering(429) 00:11:51.914 fused_ordering(430) 00:11:51.914 fused_ordering(431) 00:11:51.914 fused_ordering(432) 00:11:51.914 fused_ordering(433) 00:11:51.914 fused_ordering(434) 00:11:51.914 fused_ordering(435) 00:11:51.914 fused_ordering(436) 00:11:51.914 fused_ordering(437) 00:11:51.914 fused_ordering(438) 00:11:51.914 fused_ordering(439) 00:11:51.914 fused_ordering(440) 00:11:51.914 fused_ordering(441) 00:11:51.914 fused_ordering(442) 00:11:51.914 fused_ordering(443) 00:11:51.914 fused_ordering(444) 00:11:51.914 fused_ordering(445) 00:11:51.914 fused_ordering(446) 00:11:51.914 fused_ordering(447) 00:11:51.914 fused_ordering(448) 00:11:51.914 fused_ordering(449) 00:11:51.914 fused_ordering(450) 00:11:51.914 fused_ordering(451) 00:11:51.914 fused_ordering(452) 00:11:51.914 fused_ordering(453) 00:11:51.914 fused_ordering(454) 00:11:51.914 fused_ordering(455) 00:11:51.914 fused_ordering(456) 00:11:51.914 fused_ordering(457) 00:11:51.914 fused_ordering(458) 00:11:51.914 fused_ordering(459) 00:11:51.914 fused_ordering(460) 00:11:51.914 fused_ordering(461) 00:11:51.914 fused_ordering(462) 00:11:51.914 fused_ordering(463) 00:11:51.914 fused_ordering(464) 00:11:51.914 fused_ordering(465) 00:11:51.914 fused_ordering(466) 00:11:51.914 fused_ordering(467) 00:11:51.914 fused_ordering(468) 00:11:51.914 fused_ordering(469) 00:11:51.914 fused_ordering(470) 00:11:51.914 fused_ordering(471) 00:11:51.914 fused_ordering(472) 00:11:51.914 fused_ordering(473) 00:11:51.914 fused_ordering(474) 00:11:51.914 fused_ordering(475) 00:11:51.914 fused_ordering(476) 00:11:51.914 fused_ordering(477) 00:11:51.914 fused_ordering(478) 00:11:51.914 fused_ordering(479) 00:11:51.914 fused_ordering(480) 00:11:51.914 fused_ordering(481) 00:11:51.914 fused_ordering(482) 00:11:51.914 fused_ordering(483) 00:11:51.914 fused_ordering(484) 00:11:51.914 fused_ordering(485) 00:11:51.914 fused_ordering(486) 00:11:51.914 fused_ordering(487) 
00:11:51.914 fused_ordering(488) 00:11:51.914 fused_ordering(489) 00:11:51.914 fused_ordering(490) 00:11:51.914 fused_ordering(491) 00:11:51.914 fused_ordering(492) 00:11:51.914 fused_ordering(493) 00:11:51.914 fused_ordering(494) 00:11:51.914 fused_ordering(495) 00:11:51.914 fused_ordering(496) 00:11:51.914 fused_ordering(497) 00:11:51.914 fused_ordering(498) 00:11:51.914 fused_ordering(499) 00:11:51.914 fused_ordering(500) 00:11:51.914 fused_ordering(501) 00:11:51.914 fused_ordering(502) 00:11:51.914 fused_ordering(503) 00:11:51.914 fused_ordering(504) 00:11:51.914 fused_ordering(505) 00:11:51.914 fused_ordering(506) 00:11:51.914 fused_ordering(507) 00:11:51.914 fused_ordering(508) 00:11:51.914 fused_ordering(509) 00:11:51.914 fused_ordering(510) 00:11:51.914 fused_ordering(511) 00:11:51.914 fused_ordering(512) 00:11:51.914 fused_ordering(513) 00:11:51.914 fused_ordering(514) 00:11:51.914 fused_ordering(515) 00:11:51.914 fused_ordering(516) 00:11:51.914 fused_ordering(517) 00:11:51.914 fused_ordering(518) 00:11:51.914 fused_ordering(519) 00:11:51.914 fused_ordering(520) 00:11:51.914 fused_ordering(521) 00:11:51.914 fused_ordering(522) 00:11:51.914 fused_ordering(523) 00:11:51.914 fused_ordering(524) 00:11:51.914 fused_ordering(525) 00:11:51.914 fused_ordering(526) 00:11:51.914 fused_ordering(527) 00:11:51.914 fused_ordering(528) 00:11:51.914 fused_ordering(529) 00:11:51.914 fused_ordering(530) 00:11:51.914 fused_ordering(531) 00:11:51.914 fused_ordering(532) 00:11:51.914 fused_ordering(533) 00:11:51.914 fused_ordering(534) 00:11:51.914 fused_ordering(535) 00:11:51.914 fused_ordering(536) 00:11:51.914 fused_ordering(537) 00:11:51.914 fused_ordering(538) 00:11:51.914 fused_ordering(539) 00:11:51.914 fused_ordering(540) 00:11:51.915 fused_ordering(541) 00:11:51.915 fused_ordering(542) 00:11:51.915 fused_ordering(543) 00:11:51.915 fused_ordering(544) 00:11:51.915 fused_ordering(545) 00:11:51.915 fused_ordering(546) 00:11:51.915 fused_ordering(547) 00:11:51.915 fused_ordering(548) 00:11:51.915 fused_ordering(549) 00:11:51.915 fused_ordering(550) 00:11:51.915 fused_ordering(551) 00:11:51.915 fused_ordering(552) 00:11:51.915 fused_ordering(553) 00:11:51.915 fused_ordering(554) 00:11:51.915 fused_ordering(555) 00:11:51.915 fused_ordering(556) 00:11:51.915 fused_ordering(557) 00:11:51.915 fused_ordering(558) 00:11:51.915 fused_ordering(559) 00:11:51.915 fused_ordering(560) 00:11:51.915 fused_ordering(561) 00:11:51.915 fused_ordering(562) 00:11:51.915 fused_ordering(563) 00:11:51.915 fused_ordering(564) 00:11:51.915 fused_ordering(565) 00:11:51.915 fused_ordering(566) 00:11:51.915 fused_ordering(567) 00:11:51.915 fused_ordering(568) 00:11:51.915 fused_ordering(569) 00:11:51.915 fused_ordering(570) 00:11:51.915 fused_ordering(571) 00:11:51.915 fused_ordering(572) 00:11:51.915 fused_ordering(573) 00:11:51.915 fused_ordering(574) 00:11:51.915 fused_ordering(575) 00:11:51.915 fused_ordering(576) 00:11:51.915 fused_ordering(577) 00:11:51.915 fused_ordering(578) 00:11:51.915 fused_ordering(579) 00:11:51.915 fused_ordering(580) 00:11:51.915 fused_ordering(581) 00:11:51.915 fused_ordering(582) 00:11:51.915 fused_ordering(583) 00:11:51.915 fused_ordering(584) 00:11:51.915 fused_ordering(585) 00:11:51.915 fused_ordering(586) 00:11:51.915 fused_ordering(587) 00:11:51.915 fused_ordering(588) 00:11:51.915 fused_ordering(589) 00:11:51.915 fused_ordering(590) 00:11:51.915 fused_ordering(591) 00:11:51.915 fused_ordering(592) 00:11:51.915 fused_ordering(593) 00:11:51.915 fused_ordering(594) 00:11:51.915 
fused_ordering(595) 00:11:51.915 fused_ordering(596) 00:11:51.915 fused_ordering(597) 00:11:51.915 fused_ordering(598) 00:11:51.915 fused_ordering(599) 00:11:51.915 fused_ordering(600) 00:11:51.915 fused_ordering(601) 00:11:51.915 fused_ordering(602) 00:11:51.915 fused_ordering(603) 00:11:51.915 fused_ordering(604) 00:11:51.915 fused_ordering(605) 00:11:51.915 fused_ordering(606) 00:11:51.915 fused_ordering(607) 00:11:51.915 fused_ordering(608) 00:11:51.915 fused_ordering(609) 00:11:51.915 fused_ordering(610) 00:11:51.915 fused_ordering(611) 00:11:51.915 fused_ordering(612) 00:11:51.915 fused_ordering(613) 00:11:51.915 fused_ordering(614) 00:11:51.915 fused_ordering(615) 00:11:51.915 fused_ordering(616) 00:11:51.915 fused_ordering(617) 00:11:51.915 fused_ordering(618) 00:11:51.915 fused_ordering(619) 00:11:51.915 fused_ordering(620) 00:11:51.915 fused_ordering(621) 00:11:51.915 fused_ordering(622) 00:11:51.915 fused_ordering(623) 00:11:51.915 fused_ordering(624) 00:11:51.915 fused_ordering(625) 00:11:51.915 fused_ordering(626) 00:11:51.915 fused_ordering(627) 00:11:51.915 fused_ordering(628) 00:11:51.915 fused_ordering(629) 00:11:51.915 fused_ordering(630) 00:11:51.915 fused_ordering(631) 00:11:51.915 fused_ordering(632) 00:11:51.915 fused_ordering(633) 00:11:51.915 fused_ordering(634) 00:11:51.915 fused_ordering(635) 00:11:51.915 fused_ordering(636) 00:11:51.915 fused_ordering(637) 00:11:51.915 fused_ordering(638) 00:11:51.915 fused_ordering(639) 00:11:51.915 fused_ordering(640) 00:11:51.915 fused_ordering(641) 00:11:51.915 fused_ordering(642) 00:11:51.915 fused_ordering(643) 00:11:51.915 fused_ordering(644) 00:11:51.915 fused_ordering(645) 00:11:51.915 fused_ordering(646) 00:11:51.915 fused_ordering(647) 00:11:51.915 fused_ordering(648) 00:11:51.915 fused_ordering(649) 00:11:51.915 fused_ordering(650) 00:11:51.915 fused_ordering(651) 00:11:51.915 fused_ordering(652) 00:11:51.915 fused_ordering(653) 00:11:51.915 fused_ordering(654) 00:11:51.915 fused_ordering(655) 00:11:51.915 fused_ordering(656) 00:11:51.915 fused_ordering(657) 00:11:51.915 fused_ordering(658) 00:11:51.915 fused_ordering(659) 00:11:51.915 fused_ordering(660) 00:11:51.915 fused_ordering(661) 00:11:51.915 fused_ordering(662) 00:11:51.915 fused_ordering(663) 00:11:51.915 fused_ordering(664) 00:11:51.915 fused_ordering(665) 00:11:51.915 fused_ordering(666) 00:11:51.915 fused_ordering(667) 00:11:51.915 fused_ordering(668) 00:11:51.915 fused_ordering(669) 00:11:51.915 fused_ordering(670) 00:11:51.915 fused_ordering(671) 00:11:51.915 fused_ordering(672) 00:11:51.915 fused_ordering(673) 00:11:51.915 fused_ordering(674) 00:11:51.915 fused_ordering(675) 00:11:51.915 fused_ordering(676) 00:11:51.915 fused_ordering(677) 00:11:51.915 fused_ordering(678) 00:11:51.915 fused_ordering(679) 00:11:51.915 fused_ordering(680) 00:11:51.915 fused_ordering(681) 00:11:51.915 fused_ordering(682) 00:11:51.915 fused_ordering(683) 00:11:51.915 fused_ordering(684) 00:11:51.915 fused_ordering(685) 00:11:51.915 fused_ordering(686) 00:11:51.915 fused_ordering(687) 00:11:51.915 fused_ordering(688) 00:11:51.915 fused_ordering(689) 00:11:51.915 fused_ordering(690) 00:11:51.915 fused_ordering(691) 00:11:51.915 fused_ordering(692) 00:11:51.915 fused_ordering(693) 00:11:51.915 fused_ordering(694) 00:11:51.915 fused_ordering(695) 00:11:51.915 fused_ordering(696) 00:11:51.915 fused_ordering(697) 00:11:51.915 fused_ordering(698) 00:11:51.915 fused_ordering(699) 00:11:51.915 fused_ordering(700) 00:11:51.915 fused_ordering(701) 00:11:51.915 fused_ordering(702) 
00:11:51.915 fused_ordering(703) 00:11:51.915 fused_ordering(704) 00:11:51.915 fused_ordering(705) 00:11:51.915 fused_ordering(706) 00:11:51.915 fused_ordering(707) 00:11:51.915 fused_ordering(708) 00:11:51.915 fused_ordering(709) 00:11:51.915 fused_ordering(710) 00:11:51.915 fused_ordering(711) 00:11:51.915 fused_ordering(712) 00:11:51.915 fused_ordering(713) 00:11:51.915 fused_ordering(714) 00:11:51.915 fused_ordering(715) 00:11:51.915 fused_ordering(716) 00:11:51.915 fused_ordering(717) 00:11:51.915 fused_ordering(718) 00:11:51.915 fused_ordering(719) 00:11:51.915 fused_ordering(720) 00:11:51.915 fused_ordering(721) 00:11:51.915 fused_ordering(722) 00:11:51.915 fused_ordering(723) 00:11:51.915 fused_ordering(724) 00:11:51.915 fused_ordering(725) 00:11:51.915 fused_ordering(726) 00:11:51.915 fused_ordering(727) 00:11:51.915 fused_ordering(728) 00:11:51.915 fused_ordering(729) 00:11:51.915 fused_ordering(730) 00:11:51.915 fused_ordering(731) 00:11:51.915 fused_ordering(732) 00:11:51.915 fused_ordering(733) 00:11:51.915 fused_ordering(734) 00:11:51.915 fused_ordering(735) 00:11:51.915 fused_ordering(736) 00:11:51.915 fused_ordering(737) 00:11:51.915 fused_ordering(738) 00:11:51.915 fused_ordering(739) 00:11:51.915 fused_ordering(740) 00:11:51.915 fused_ordering(741) 00:11:51.915 fused_ordering(742) 00:11:51.915 fused_ordering(743) 00:11:51.915 fused_ordering(744) 00:11:51.915 fused_ordering(745) 00:11:51.915 fused_ordering(746) 00:11:51.915 fused_ordering(747) 00:11:51.915 fused_ordering(748) 00:11:51.915 fused_ordering(749) 00:11:51.915 fused_ordering(750) 00:11:51.915 fused_ordering(751) 00:11:51.915 fused_ordering(752) 00:11:51.915 fused_ordering(753) 00:11:51.915 fused_ordering(754) 00:11:51.915 fused_ordering(755) 00:11:51.915 fused_ordering(756) 00:11:51.915 fused_ordering(757) 00:11:51.915 fused_ordering(758) 00:11:51.915 fused_ordering(759) 00:11:51.915 fused_ordering(760) 00:11:51.915 fused_ordering(761) 00:11:51.915 fused_ordering(762) 00:11:51.915 fused_ordering(763) 00:11:51.915 fused_ordering(764) 00:11:51.915 fused_ordering(765) 00:11:51.915 fused_ordering(766) 00:11:51.915 fused_ordering(767) 00:11:51.915 fused_ordering(768) 00:11:51.915 fused_ordering(769) 00:11:51.915 fused_ordering(770) 00:11:51.915 fused_ordering(771) 00:11:51.915 fused_ordering(772) 00:11:51.915 fused_ordering(773) 00:11:51.915 fused_ordering(774) 00:11:51.915 fused_ordering(775) 00:11:51.915 fused_ordering(776) 00:11:51.915 fused_ordering(777) 00:11:51.915 fused_ordering(778) 00:11:51.915 fused_ordering(779) 00:11:51.915 fused_ordering(780) 00:11:51.915 fused_ordering(781) 00:11:51.915 fused_ordering(782) 00:11:51.915 fused_ordering(783) 00:11:51.915 fused_ordering(784) 00:11:51.915 fused_ordering(785) 00:11:51.915 fused_ordering(786) 00:11:51.915 fused_ordering(787) 00:11:51.915 fused_ordering(788) 00:11:51.915 fused_ordering(789) 00:11:51.915 fused_ordering(790) 00:11:51.915 fused_ordering(791) 00:11:51.915 fused_ordering(792) 00:11:51.915 fused_ordering(793) 00:11:51.915 fused_ordering(794) 00:11:51.915 fused_ordering(795) 00:11:51.915 fused_ordering(796) 00:11:51.915 fused_ordering(797) 00:11:51.915 fused_ordering(798) 00:11:51.915 fused_ordering(799) 00:11:51.915 fused_ordering(800) 00:11:51.915 fused_ordering(801) 00:11:51.915 fused_ordering(802) 00:11:51.915 fused_ordering(803) 00:11:51.915 fused_ordering(804) 00:11:51.915 fused_ordering(805) 00:11:51.915 fused_ordering(806) 00:11:51.915 fused_ordering(807) 00:11:51.915 fused_ordering(808) 00:11:51.915 fused_ordering(809) 00:11:51.915 
fused_ordering(810) 00:11:51.915 fused_ordering(811) 00:11:51.915 fused_ordering(812) 00:11:51.915 fused_ordering(813) 00:11:51.915 fused_ordering(814) 00:11:51.915 fused_ordering(815) 00:11:51.915 fused_ordering(816) 00:11:51.915 fused_ordering(817) 00:11:51.915 fused_ordering(818) 00:11:51.915 fused_ordering(819) 00:11:51.915 fused_ordering(820) 00:11:52.176 fused_ordering(821) 00:11:52.176 fused_ordering(822) 00:11:52.176 fused_ordering(823) 00:11:52.176 fused_ordering(824) 00:11:52.176 fused_ordering(825) 00:11:52.176 fused_ordering(826) 00:11:52.176 fused_ordering(827) 00:11:52.176 fused_ordering(828) 00:11:52.176 fused_ordering(829) 00:11:52.176 fused_ordering(830) 00:11:52.176 fused_ordering(831) 00:11:52.176 fused_ordering(832) 00:11:52.176 fused_ordering(833) 00:11:52.176 fused_ordering(834) 00:11:52.176 fused_ordering(835) 00:11:52.176 fused_ordering(836) 00:11:52.176 fused_ordering(837) 00:11:52.176 fused_ordering(838) 00:11:52.176 fused_ordering(839) 00:11:52.176 fused_ordering(840) 00:11:52.176 fused_ordering(841) 00:11:52.176 fused_ordering(842) 00:11:52.176 fused_ordering(843) 00:11:52.176 fused_ordering(844) 00:11:52.176 fused_ordering(845) 00:11:52.176 fused_ordering(846) 00:11:52.176 fused_ordering(847) 00:11:52.176 fused_ordering(848) 00:11:52.176 fused_ordering(849) 00:11:52.176 fused_ordering(850) 00:11:52.176 fused_ordering(851) 00:11:52.176 fused_ordering(852) 00:11:52.176 fused_ordering(853) 00:11:52.176 fused_ordering(854) 00:11:52.176 fused_ordering(855) 00:11:52.176 fused_ordering(856) 00:11:52.176 fused_ordering(857) 00:11:52.176 fused_ordering(858) 00:11:52.176 fused_ordering(859) 00:11:52.176 fused_ordering(860) 00:11:52.176 fused_ordering(861) 00:11:52.176 fused_ordering(862) 00:11:52.176 fused_ordering(863) 00:11:52.176 fused_ordering(864) 00:11:52.176 fused_ordering(865) 00:11:52.176 fused_ordering(866) 00:11:52.176 fused_ordering(867) 00:11:52.176 fused_ordering(868) 00:11:52.176 fused_ordering(869) 00:11:52.176 fused_ordering(870) 00:11:52.176 fused_ordering(871) 00:11:52.176 fused_ordering(872) 00:11:52.176 fused_ordering(873) 00:11:52.176 fused_ordering(874) 00:11:52.176 fused_ordering(875) 00:11:52.176 fused_ordering(876) 00:11:52.176 fused_ordering(877) 00:11:52.176 fused_ordering(878) 00:11:52.176 fused_ordering(879) 00:11:52.176 fused_ordering(880) 00:11:52.176 fused_ordering(881) 00:11:52.176 fused_ordering(882) 00:11:52.176 fused_ordering(883) 00:11:52.176 fused_ordering(884) 00:11:52.176 fused_ordering(885) 00:11:52.176 fused_ordering(886) 00:11:52.176 fused_ordering(887) 00:11:52.176 fused_ordering(888) 00:11:52.176 fused_ordering(889) 00:11:52.176 fused_ordering(890) 00:11:52.176 fused_ordering(891) 00:11:52.176 fused_ordering(892) 00:11:52.176 fused_ordering(893) 00:11:52.176 fused_ordering(894) 00:11:52.176 fused_ordering(895) 00:11:52.176 fused_ordering(896) 00:11:52.176 fused_ordering(897) 00:11:52.176 fused_ordering(898) 00:11:52.176 fused_ordering(899) 00:11:52.176 fused_ordering(900) 00:11:52.176 fused_ordering(901) 00:11:52.176 fused_ordering(902) 00:11:52.176 fused_ordering(903) 00:11:52.176 fused_ordering(904) 00:11:52.176 fused_ordering(905) 00:11:52.176 fused_ordering(906) 00:11:52.176 fused_ordering(907) 00:11:52.176 fused_ordering(908) 00:11:52.176 fused_ordering(909) 00:11:52.176 fused_ordering(910) 00:11:52.176 fused_ordering(911) 00:11:52.176 fused_ordering(912) 00:11:52.176 fused_ordering(913) 00:11:52.176 fused_ordering(914) 00:11:52.176 fused_ordering(915) 00:11:52.176 fused_ordering(916) 00:11:52.176 fused_ordering(917) 
00:11:52.176 fused_ordering(918) 00:11:52.176 fused_ordering(919) 00:11:52.176 fused_ordering(920) 00:11:52.176 fused_ordering(921) 00:11:52.176 fused_ordering(922) 00:11:52.176 fused_ordering(923) 00:11:52.176 fused_ordering(924) 00:11:52.176 fused_ordering(925) 00:11:52.176 fused_ordering(926) 00:11:52.176 fused_ordering(927) 00:11:52.176 fused_ordering(928) 00:11:52.176 fused_ordering(929) 00:11:52.176 fused_ordering(930) 00:11:52.176 fused_ordering(931) 00:11:52.176 fused_ordering(932) 00:11:52.176 fused_ordering(933) 00:11:52.176 fused_ordering(934) 00:11:52.176 fused_ordering(935) 00:11:52.176 fused_ordering(936) 00:11:52.176 fused_ordering(937) 00:11:52.176 fused_ordering(938) 00:11:52.176 fused_ordering(939) 00:11:52.176 fused_ordering(940) 00:11:52.176 fused_ordering(941) 00:11:52.176 fused_ordering(942) 00:11:52.176 fused_ordering(943) 00:11:52.176 fused_ordering(944) 00:11:52.176 fused_ordering(945) 00:11:52.176 fused_ordering(946) 00:11:52.176 fused_ordering(947) 00:11:52.176 fused_ordering(948) 00:11:52.176 fused_ordering(949) 00:11:52.176 fused_ordering(950) 00:11:52.176 fused_ordering(951) 00:11:52.176 fused_ordering(952) 00:11:52.176 fused_ordering(953) 00:11:52.176 fused_ordering(954) 00:11:52.176 fused_ordering(955) 00:11:52.176 fused_ordering(956) 00:11:52.176 fused_ordering(957) 00:11:52.176 fused_ordering(958) 00:11:52.176 fused_ordering(959) 00:11:52.176 fused_ordering(960) 00:11:52.176 fused_ordering(961) 00:11:52.176 fused_ordering(962) 00:11:52.176 fused_ordering(963) 00:11:52.176 fused_ordering(964) 00:11:52.176 fused_ordering(965) 00:11:52.176 fused_ordering(966) 00:11:52.176 fused_ordering(967) 00:11:52.176 fused_ordering(968) 00:11:52.176 fused_ordering(969) 00:11:52.176 fused_ordering(970) 00:11:52.176 fused_ordering(971) 00:11:52.176 fused_ordering(972) 00:11:52.176 fused_ordering(973) 00:11:52.176 fused_ordering(974) 00:11:52.176 fused_ordering(975) 00:11:52.176 fused_ordering(976) 00:11:52.176 fused_ordering(977) 00:11:52.176 fused_ordering(978) 00:11:52.176 fused_ordering(979) 00:11:52.176 fused_ordering(980) 00:11:52.176 fused_ordering(981) 00:11:52.176 fused_ordering(982) 00:11:52.176 fused_ordering(983) 00:11:52.176 fused_ordering(984) 00:11:52.176 fused_ordering(985) 00:11:52.176 fused_ordering(986) 00:11:52.176 fused_ordering(987) 00:11:52.176 fused_ordering(988) 00:11:52.176 fused_ordering(989) 00:11:52.176 fused_ordering(990) 00:11:52.176 fused_ordering(991) 00:11:52.176 fused_ordering(992) 00:11:52.176 fused_ordering(993) 00:11:52.176 fused_ordering(994) 00:11:52.176 fused_ordering(995) 00:11:52.176 fused_ordering(996) 00:11:52.176 fused_ordering(997) 00:11:52.176 fused_ordering(998) 00:11:52.176 fused_ordering(999) 00:11:52.176 fused_ordering(1000) 00:11:52.176 fused_ordering(1001) 00:11:52.176 fused_ordering(1002) 00:11:52.176 fused_ordering(1003) 00:11:52.176 fused_ordering(1004) 00:11:52.176 fused_ordering(1005) 00:11:52.176 fused_ordering(1006) 00:11:52.176 fused_ordering(1007) 00:11:52.176 fused_ordering(1008) 00:11:52.176 fused_ordering(1009) 00:11:52.176 fused_ordering(1010) 00:11:52.176 fused_ordering(1011) 00:11:52.176 fused_ordering(1012) 00:11:52.176 fused_ordering(1013) 00:11:52.176 fused_ordering(1014) 00:11:52.176 fused_ordering(1015) 00:11:52.176 fused_ordering(1016) 00:11:52.176 fused_ordering(1017) 00:11:52.176 fused_ordering(1018) 00:11:52.176 fused_ordering(1019) 00:11:52.176 fused_ordering(1020) 00:11:52.176 fused_ordering(1021) 00:11:52.176 fused_ordering(1022) 00:11:52.176 fused_ordering(1023) 00:11:52.177 15:16:53 
nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@23 -- # trap - SIGINT SIGTERM EXIT 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- target/fused_ordering.sh@25 -- # nvmftestfini 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@331 -- # nvmfcleanup 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@99 -- # sync 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@102 -- # set +e 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@103 -- # for i in {1..20} 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:11:52.177 rmmod nvme_rdma 00:11:52.177 rmmod nvme_fabrics 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@106 -- # set -e 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@107 -- # return 0 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@332 -- # '[' -n 1758749 ']' 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@333 -- # killprocess 1758749 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@950 -- # '[' -z 1758749 ']' 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@954 -- # kill -0 1758749 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@955 -- # uname 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1758749 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1758749' 00:11:52.177 killing process with pid 1758749 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@969 -- # kill 1758749 00:11:52.177 15:16:53 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@974 -- # wait 1758749 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@338 -- # nvmf_fini 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@264 -- # local dev 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@267 -- # remove_target_ns 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd 
_remove_target_ns 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@268 -- # delete_main_bridge 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@130 -- # return 0 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:11:52.436 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@41 -- # _dev=0 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@41 -- # dev_map=() 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/setup.sh@284 -- # iptr 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@538 -- # iptables-save 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- nvmf/common.sh@538 -- # iptables-restore 00:11:52.437 00:11:52.437 real 0m9.460s 00:11:52.437 user 0m5.052s 00:11:52.437 sys 0m5.812s 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@1126 -- # xtrace_disable 
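The teardown traced above reduces to a short command sequence; a condensed sketch using the pid and device names from this particular run (both are run-specific and would differ on another node):

    modprobe -v -r nvme-rdma        # unloads nvme_rdma and nvme_fabrics, as logged above
    modprobe -v -r nvme-fabrics
    kill 1758749                    # stop the nvmf_tgt started earlier in this test
    ip addr flush dev mlx_0_1       # drop the 10.0.0.x test addresses from both ports
    ip addr flush dev mlx_0_0
    iptables-save | grep -v SPDK_NVMF | iptables-restore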
00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_fused_ordering -- common/autotest_common.sh@10 -- # set +x 00:11:52.437 ************************************ 00:11:52.437 END TEST nvmf_fused_ordering 00:11:52.437 ************************************ 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@26 -- # run_test nvmf_ns_masking test/nvmf/target/ns_masking.sh --transport=rdma 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:52.437 15:16:54 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:11:52.696 ************************************ 00:11:52.696 START TEST nvmf_ns_masking 00:11:52.696 ************************************ 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1125 -- # test/nvmf/target/ns_masking.sh --transport=rdma 00:11:52.696 * Looking for test storage... 00:11:52.696 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1681 -- # lcov --version 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@333 -- # local ver1 ver1_l 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@334 -- # local ver2 ver2_l 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@336 -- # IFS=.-: 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@336 -- # read -ra ver1 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@337 -- # IFS=.-: 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@337 -- # read -ra ver2 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@338 -- # local 'op=<' 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@340 -- # ver1_l=2 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@341 -- # ver2_l=1 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@344 -- # case "$op" in 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@345 -- # : 1 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@364 -- # (( v = 0 )) 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@365 -- # decimal 1 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@353 -- # local d=1 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:11:52.696 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@355 -- # echo 1 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@365 -- # ver1[v]=1 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@366 -- # decimal 2 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@353 -- # local d=2 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@355 -- # echo 2 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@366 -- # ver2[v]=2 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@368 -- # return 0 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:11:52.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:52.697 --rc genhtml_branch_coverage=1 00:11:52.697 --rc genhtml_function_coverage=1 00:11:52.697 --rc genhtml_legend=1 00:11:52.697 --rc geninfo_all_blocks=1 00:11:52.697 --rc geninfo_unexecuted_blocks=1 00:11:52.697 00:11:52.697 ' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:11:52.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:52.697 --rc genhtml_branch_coverage=1 00:11:52.697 --rc genhtml_function_coverage=1 00:11:52.697 --rc genhtml_legend=1 00:11:52.697 --rc geninfo_all_blocks=1 00:11:52.697 --rc geninfo_unexecuted_blocks=1 00:11:52.697 00:11:52.697 ' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:11:52.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:52.697 --rc genhtml_branch_coverage=1 00:11:52.697 --rc genhtml_function_coverage=1 00:11:52.697 --rc genhtml_legend=1 00:11:52.697 --rc geninfo_all_blocks=1 00:11:52.697 --rc geninfo_unexecuted_blocks=1 00:11:52.697 00:11:52.697 ' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:11:52.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:52.697 --rc genhtml_branch_coverage=1 00:11:52.697 --rc genhtml_function_coverage=1 00:11:52.697 --rc genhtml_legend=1 00:11:52.697 --rc geninfo_all_blocks=1 00:11:52.697 --rc geninfo_unexecuted_blocks=1 00:11:52.697 00:11:52.697 ' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@8 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:11:52.697 15:16:54 
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@7 -- # uname -s 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@15 -- # shopt -s extglob 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- paths/export.sh@5 -- # export PATH 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@50 -- # : 0 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:11:52.697 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:11:52.697 15:16:54 
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@54 -- # have_pci_nics=0 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@10 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@11 -- # hostsock=/var/tmp/host.sock 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@12 -- # loops=5 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@13 -- # uuidgen 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@13 -- # ns1uuid=f463ca91-4bad-45f1-89a1-e067bfceda08 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@14 -- # uuidgen 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@14 -- # ns2uuid=2698ff05-9cbd-46e3-9b71-8c9723b34d22 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@16 -- # SUBSYSNQN=nqn.2016-06.io.spdk:cnode1 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@17 -- # HOSTNQN1=nqn.2016-06.io.spdk:host1 00:11:52.697 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@18 -- # HOSTNQN2=nqn.2016-06.io.spdk:host2 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@19 -- # uuidgen 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@19 -- # HOSTID=39b73296-d8e0-4421-ac5b-9ffa9758f223 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@50 -- # nvmftestinit 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@292 -- # prepare_net_devs 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@254 -- # local -g is_hw=no 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@256 -- # remove_target_ns 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@22 -- # _remove_target_ns 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@125 -- # xtrace_disable 00:11:52.957 15:16:54 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@10 -- # set +x 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:11:59.526 15:17:01 
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@131 -- # pci_devs=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@131 -- # local -a pci_devs 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@132 -- # pci_net_devs=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@133 -- # pci_drivers=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@133 -- # local -A pci_drivers 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@135 -- # net_devs=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@135 -- # local -ga net_devs 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@136 -- # e810=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@136 -- # local -ga e810 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@137 -- # x722=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@137 -- # local -ga x722 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@138 -- # mlx=() 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@138 -- # local -ga mlx 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:11:59.526 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@167 -- # [[ mlx5 == 
mlx5 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:11:59.527 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:11:59.527 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:11:59.527 Found net devices under 0000:18:00.0: mlx_0_0 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@222 -- # for pci 
in "${pci_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:11:59.527 Found net devices under 0000:18:00.1: mlx_0_1 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@249 -- # get_rdma_if_list 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@75 -- # rdma_devs=() 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@89 -- # continue 2 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@89 -- # continue 2 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- 
nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@258 -- # is_hw=yes 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@61 -- # uname 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@65 -- # modprobe ib_cm 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@66 -- # modprobe ib_core 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@67 -- # modprobe ib_umad 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@69 -- # modprobe iw_cm 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@27 -- # local -gA dev_map 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@28 -- # local -g _dev 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@44 -- # ips=() 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 
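[Annotation] The trace above has just loaded the kernel-side RDMA stack before any addresses are assigned; the module list comes straight from the `load_ib_rdma_modules` calls logged here. A minimal standalone sketch of that step, assuming root privileges and in-tree mlx5 drivers (everything outside the modprobe list is an assumption, not part of the harness):

```bash
#!/usr/bin/env bash
# Load the InfiniBand/RDMA core modules the nvmf test bed relies on,
# in the order the trace shows them being probed.
set -e
for mod in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm; do
    modprobe "$mod"
done
# Sanity check: uverbs character devices appear once ib_uverbs is loaded.
ls /dev/infiniband/ 2>/dev/null || echo "no RDMA devices exposed yet"
```

The setup that follows in the trace then assigns 10.0.0.1/24 and 10.0.0.2/24 to the two mlx5 ports, brings them up with `ip link set ... up`, and verifies reachability with single-packet pings.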
00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@58 -- # key_initiator=target1 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:59.527 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:11:59.528 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@11 -- # local val=167772161 00:11:59.528 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:11:59.528 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:11:59.528 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:11:59.528 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:11:59.788 10.0.0.1 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@11 -- # local val=167772162 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee 
/sys/class/net/mlx_0_1/ifalias' 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:11:59.788 10.0.0.2 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@38 -- # ping_ips 1 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@107 -- 
# local dev=target0 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:59.788 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:59.789 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:11:59.789 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.031 ms 00:11:59.789 00:11:59.789 --- 10.0.0.2 ping statistics --- 00:11:59.789 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:59.789 rtt min/avg/max/mdev = 0.031/0.031/0.031/0.000 ms 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@107 -- # local dev=target0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:11:59.789 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:11:59.789 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.024 ms 00:11:59.789 00:11:59.789 --- 10.0.0.2 ping statistics --- 00:11:59.789 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:11:59.789 rtt min/avg/max/mdev = 0.024/0.024/0.024/0.000 ms 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@98 -- # (( pair++ )) 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@266 -- # return 0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@107 -- # local dev=target0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@179 -- 
# get_ip_address target1 '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@107 -- # local dev=target1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # get_net_dev target0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@107 -- # local dev=target0 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:11:59.789 
15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:11:59.789 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # get_net_dev target1 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@107 -- # local dev=target1 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@51 -- # nvmfappstart 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@724 -- # xtrace_disable 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@10 -- # set +x 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@324 -- # nvmfpid=1761941 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@325 -- # waitforlisten 1761941 00:11:59.790 15:17:01 
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@831 -- # '[' -z 1761941 ']' 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:59.790 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:59.790 15:17:01 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@10 -- # set +x 00:12:00.049 [2024-09-27 15:17:01.640835] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:12:00.049 [2024-09-27 15:17:01.640900] [ DPDK EAL parameters: nvmf -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:00.049 [2024-09-27 15:17:01.727636] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:00.049 [2024-09-27 15:17:01.817106] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:12:00.049 [2024-09-27 15:17:01.817147] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:12:00.049 [2024-09-27 15:17:01.817156] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:12:00.049 [2024-09-27 15:17:01.817165] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:12:00.049 [2024-09-27 15:17:01.817172] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:12:00.049 [2024-09-27 15:17:01.817201] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@864 -- # return 0 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@730 -- # xtrace_disable 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@10 -- # set +x 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@53 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:12:00.986 [2024-09-27 15:17:02.741701] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x19592a0/0x195d790) succeed. 00:12:00.986 [2024-09-27 15:17:02.750805] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x195a7a0/0x199ee30) succeed. 
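[Annotation] At this point the target application is up (pid 1761941 in this run) and an RDMA transport has been created, which is what produces the two `create_ib_device ... succeed` notices for mlx5_0 and mlx5_1. A condensed sketch of that bring-up, using the workspace paths shown in the log (running it outside the autotest wrappers, and the explicit `rpc_get_methods` readiness poll, are assumptions, not the harness' own code):

```bash
SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk

# Start the NVMe-oF target: shm id 0, all tracepoint groups enabled.
"$SPDK/build/bin/nvmf_tgt" -i 0 -e 0xFFFF &

# The harness' waitforlisten helper polls the RPC socket; a plain loop
# over rpc_get_methods approximates that.
until "$SPDK/scripts/rpc.py" rpc_get_methods >/dev/null 2>&1; do sleep 0.5; done

# Create the RDMA transport with the same options the test passes.
"$SPDK/scripts/rpc.py" nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
```

The trace that follows builds on this target: it creates two 64 MB malloc bdevs (512-byte blocks), subsystem `nqn.2016-06.io.spdk:cnode1`, and an RDMA listener on 10.0.0.2:4420, then exercises the namespace-masking RPCs (`nvmf_subsystem_add_ns ... --no-auto-visible`, `nvmf_ns_add_host`, `nvmf_ns_remove_host`) while checking visibility from the initiator via `nvme list-ns` and the NGUID reported by `nvme id-ns`.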
00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@55 -- # MALLOC_BDEV_SIZE=64 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@56 -- # MALLOC_BLOCK_SIZE=512 00:12:00.986 15:17:02 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@58 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 -b Malloc1 00:12:01.246 Malloc1 00:12:01.246 15:17:03 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 -b Malloc2 00:12:01.505 Malloc2 00:12:01.505 15:17:03 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@62 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME 00:12:01.764 15:17:03 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@63 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 1 00:12:02.023 15:17:03 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:12:02.023 [2024-09-27 15:17:03.841967] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:12:02.023 15:17:03 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@67 -- # connect 00:12:02.282 15:17:03 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@22 -- # nvme connect -t rdma -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host1 -I 39b73296-d8e0-4421-ac5b-9ffa9758f223 -a 10.0.0.2 -s 4420 -i 4 00:12:02.541 15:17:04 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@24 -- # waitforserial SPDKISFASTANDAWESOME 00:12:02.541 15:17:04 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1198 -- # local i=0 00:12:02.541 15:17:04 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:12:02.541 15:17:04 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1200 -- # [[ -n '' ]] 00:12:02.541 15:17:04 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1205 -- # sleep 2 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1208 -- # return 0 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # nvme list-subsys -o json 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # jq -r '.[].Subsystems[] | select(.NQN=="nqn.2016-06.io.spdk:cnode1") | 
.Paths[0].Name' 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # ctrl_id=nvme0 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@27 -- # [[ -z nvme0 ]] 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@68 -- # ns_is_visible 0x1 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:04.446 [ 0]:0x1 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:04.446 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:04.704 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=459970b1ffd24ad18e18be56e951b6f9 00:12:04.704 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 459970b1ffd24ad18e18be56e951b6f9 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:04.704 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@71 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc2 -n 2 00:12:04.704 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@72 -- # ns_is_visible 0x1 00:12:04.704 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:04.704 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:04.704 [ 0]:0x1 00:12:04.705 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:04.705 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=459970b1ffd24ad18e18be56e951b6f9 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 459970b1ffd24ad18e18be56e951b6f9 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@73 -- # ns_is_visible 0x2 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:04.963 [ 1]:0x2 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:04.963 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:04.964 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:04.964 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@75 -- # disconnect 00:12:04.964 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@38 -- # nvme disconnect -n 
nqn.2016-06.io.spdk:cnode1 00:12:05.223 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:12:05.223 15:17:06 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@79 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:12:05.482 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@80 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 1 --no-auto-visible 00:12:05.742 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@83 -- # connect 1 00:12:05.742 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@22 -- # nvme connect -t rdma -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host1 -I 39b73296-d8e0-4421-ac5b-9ffa9758f223 -a 10.0.0.2 -s 4420 -i 4 00:12:06.000 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@24 -- # waitforserial SPDKISFASTANDAWESOME 1 00:12:06.000 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1198 -- # local i=0 00:12:06.000 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:12:06.000 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1200 -- # [[ -n 1 ]] 00:12:06.001 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1201 -- # nvme_device_counter=1 00:12:06.001 15:17:07 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1205 -- # sleep 2 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # nvme_devices=1 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1208 -- # return 0 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # nvme list-subsys -o json 00:12:07.906 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # jq -r '.[].Subsystems[] | select(.NQN=="nqn.2016-06.io.spdk:cnode1") | .Paths[0].Name' 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # ctrl_id=nvme0 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@27 -- # [[ -z nvme0 ]] 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@84 -- # NOT ns_is_visible 0x1 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@650 -- # local es=0 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@652 -- # valid_exec_arg ns_is_visible 0x1 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@638 -- # local arg=ns_is_visible 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 
-- # case "$(type -t "$arg")" in 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # type -t ns_is_visible 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # ns_is_visible 0x1 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=00000000000000000000000000000000 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 00000000000000000000000000000000 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # es=1 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@85 -- # ns_is_visible 0x2 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:08.165 [ 0]:0x2 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:08.165 15:17:09 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@88 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_add_host nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@89 -- # ns_is_visible 0x1 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:08.424 [ 0]:0x1 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:08.424 15:17:10 
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=459970b1ffd24ad18e18be56e951b6f9 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 459970b1ffd24ad18e18be56e951b6f9 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@90 -- # ns_is_visible 0x2 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:08.424 [ 1]:0x2 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:08.424 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@93 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@94 -- # NOT ns_is_visible 0x1 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@650 -- # local es=0 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@652 -- # valid_exec_arg ns_is_visible 0x1 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@638 -- # local arg=ns_is_visible 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # type -t ns_is_visible 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # ns_is_visible 0x1 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=00000000000000000000000000000000 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 00000000000000000000000000000000 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # es=1 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@661 -- # (( 
es > 128 )) 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@95 -- # ns_is_visible 0x2 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:08.683 [ 0]:0x2 00:12:08.683 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:08.684 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:08.943 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:08.943 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:08.943 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@97 -- # disconnect 00:12:08.943 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@38 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:12:09.203 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:12:09.203 15:17:10 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@100 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_add_host nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1 00:12:09.462 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@101 -- # connect 2 00:12:09.462 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@22 -- # nvme connect -t rdma -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host1 -I 39b73296-d8e0-4421-ac5b-9ffa9758f223 -a 10.0.0.2 -s 4420 -i 4 00:12:09.721 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@24 -- # waitforserial SPDKISFASTANDAWESOME 2 00:12:09.721 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1198 -- # local i=0 00:12:09.721 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:12:09.721 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1200 -- # [[ -n 2 ]] 00:12:09.722 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1201 -- # nvme_device_counter=2 00:12:09.722 15:17:11 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1205 -- # sleep 2 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1207 -- # nvme_devices=2 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:12:11.628 15:17:13 
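What the RPC calls above demonstrate is per-host namespace masking: a namespace created with --no-auto-visible stays hidden until nvmf_ns_add_host grants a specific host NQN access, and nvmf_ns_remove_host hides it again. Condensed from the trace (rpc.py abbreviates the full scripts/rpc.py path used in the log):

  rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 1 --no-auto-visible
  rpc.py nvmf_ns_add_host    nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1   # host1 now sees nsid 1
  rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1   # masked again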
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1208 -- # return 0 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # jq -r '.[].Subsystems[] | select(.NQN=="nqn.2016-06.io.spdk:cnode1") | .Paths[0].Name' 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # nvme list-subsys -o json 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@26 -- # ctrl_id=nvme0 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@27 -- # [[ -z nvme0 ]] 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@102 -- # ns_is_visible 0x1 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:11.628 [ 0]:0x1 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:11.628 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=459970b1ffd24ad18e18be56e951b6f9 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 459970b1ffd24ad18e18be56e951b6f9 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@103 -- # ns_is_visible 0x2 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:11.888 [ 1]:0x2 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:11.888 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@106 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@107 -- # NOT ns_is_visible 0x1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@650 -- # local es=0 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@652 -- # valid_exec_arg ns_is_visible 0x1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@638 -- # local arg=ns_is_visible 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # type -t ns_is_visible 00:12:12.147 15:17:13 
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # ns_is_visible 0x1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=00000000000000000000000000000000 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 00000000000000000000000000000000 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # es=1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@108 -- # ns_is_visible 0x2 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:12.147 [ 0]:0x2 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@111 -- # NOT /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 2 nqn.2016-06.io.spdk:host1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@650 -- # local es=0 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 2 nqn.2016-06.io.spdk:host1 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py ]] 00:12:12.147 15:17:13 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 2 nqn.2016-06.io.spdk:host1 00:12:12.406 [2024-09-27 15:17:14.031475] nvmf_rpc.c:1870:nvmf_rpc_ns_visible_paused: *ERROR*: Unable to add/remove nqn.2016-06.io.spdk:host1 to namespace ID 2 00:12:12.406 request: 00:12:12.406 { 00:12:12.406 "nqn": "nqn.2016-06.io.spdk:cnode1", 00:12:12.406 "nsid": 2, 00:12:12.406 "host": "nqn.2016-06.io.spdk:host1", 00:12:12.406 "method": "nvmf_ns_remove_host", 00:12:12.406 "req_id": 1 00:12:12.406 } 00:12:12.406 Got JSON-RPC error response 00:12:12.406 response: 00:12:12.406 { 00:12:12.406 "code": -32602, 00:12:12.406 "message": "Invalid parameters" 00:12:12.406 } 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # es=1 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@112 -- # NOT ns_is_visible 0x1 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@650 -- # local es=0 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@652 -- # valid_exec_arg ns_is_visible 0x1 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@638 -- # local arg=ns_is_visible 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # type -t ns_is_visible 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # ns_is_visible 0x1 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x1 00:12:12.406 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x1 -o json 00:12:12.406 15:17:14 
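The JSON-RPC error above is the expected outcome: the test wraps the call in NOT, presumably because nsid 2 was not created with --no-auto-visible, so the masking RPC is rejected with -32602. Checked directly, that would look roughly like:

  # the rpc.py call exits non-zero and prints the JSON-RPC error shown in the trace
  if ! rpc.py nvmf_ns_remove_host nqn.2016-06.io.spdk:cnode1 2 nqn.2016-06.io.spdk:host1; then
      echo "expected failure: per-host masking does not apply to nsid 2"
  fi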
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=00000000000000000000000000000000 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ 00000000000000000000000000000000 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@653 -- # es=1 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@113 -- # ns_is_visible 0x2 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # nvme list-ns /dev/nvme0 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@43 -- # grep 0x2 00:12:12.407 [ 0]:0x2 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nvme id-ns /dev/nvme0 -n 0x2 -o json 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # jq -r .nguid 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@44 -- # nguid=ecdb546dbc88479ea4f45c19c48cb3b3 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@45 -- # [[ ecdb546dbc88479ea4f45c19c48cb3b3 != \0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0 ]] 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@114 -- # disconnect 00:12:12.407 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@38 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:12:12.667 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@118 -- # hostpid=1763814 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@117 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -r /var/tmp/host.sock -m 2 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@119 -- # trap 'killprocess $hostpid; nvmftestfini' SIGINT SIGTERM EXIT 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@121 -- # waitforlisten 1763814 /var/tmp/host.sock 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@831 -- # '[' -z 1763814 ']' 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/host.sock 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/host.sock...' 00:12:12.667 Waiting for process to start up and listen on UNIX domain socket /var/tmp/host.sock... 
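The trace above launches a second SPDK application to play the host role, pinned to its own RPC socket so it does not clash with the target's default socket. A rough equivalent of the launch plus wait; waitforlisten in common/autotest_common.sh is more thorough, and polling rpc_get_methods here is just one simple, assumed way to know the socket is answering:

  # run from the SPDK repo root; -r picks the RPC socket, -m 2 the core mask
  ./build/bin/spdk_tgt -r /var/tmp/host.sock -m 2 &
  hostpid=$!
  until rpc.py -s /var/tmp/host.sock rpc_get_methods >/dev/null 2>&1; do
      sleep 0.5
  done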
00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:12.667 15:17:14 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@10 -- # set +x 00:12:12.926 [2024-09-27 15:17:14.555057] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:12:12.926 [2024-09-27 15:17:14.555124] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1763814 ] 00:12:12.926 [2024-09-27 15:17:14.640943] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:12.926 [2024-09-27 15:17:14.728118] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:12:13.864 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:13.865 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@864 -- # return 0 00:12:13.865 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@122 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 1 00:12:13.865 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@123 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_ns nqn.2016-06.io.spdk:cnode1 2 00:12:14.123 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@124 -- # uuid2nguid f463ca91-4bad-45f1-89a1-e067bfceda08 00:12:14.123 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@534 -- # tr -d - 00:12:14.123 15:17:15 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@124 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 1 -g F463CA914BAD45F189A1E067BFCEDA08 -i 00:12:14.383 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@125 -- # uuid2nguid 2698ff05-9cbd-46e3-9b71-8c9723b34d22 00:12:14.383 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@534 -- # tr -d - 00:12:14.383 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc2 -n 2 -g 2698FF059CBD46E39B718C9723B34D22 -i 00:12:14.642 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@126 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_add_host nqn.2016-06.io.spdk:cnode1 1 nqn.2016-06.io.spdk:host1 00:12:14.642 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@127 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_ns_add_host nqn.2016-06.io.spdk:cnode1 2 nqn.2016-06.io.spdk:host2 00:12:14.901 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@129 -- # hostrpc bdev_nvme_attach_controller -t rdma -a 10.0.0.2 -f ipv4 -s 4420 -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host1 -b nvme0 00:12:14.901 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -a 10.0.0.2 -f ipv4 -s 4420 -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host1 -b nvme0 
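uuid2nguid in nvmf/common.sh turns a UUID into the 32-hex-digit NGUID expected by -g: the trace shows the dashes being stripped with 'tr -d -', and the value passed to the RPC is upper-cased (the upper-casing step is assumed here from the output). Sketch using the first UUID from the log:

  uuid=f463ca91-4bad-45f1-89a1-e067bfceda08
  nguid=$(tr -d - <<< "$uuid" | tr '[:lower:]' '[:upper:]')   # F463CA914BAD45F189A1E067BFCEDA08
  rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 1 -g "$nguid"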
00:12:15.159 nvme0n1 00:12:15.159 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@131 -- # hostrpc bdev_nvme_attach_controller -t rdma -a 10.0.0.2 -f ipv4 -s 4420 -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host2 -b nvme1 00:12:15.159 15:17:16 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -a 10.0.0.2 -f ipv4 -s 4420 -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host2 -b nvme1 00:12:15.419 nvme1n2 00:12:15.419 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@134 -- # hostrpc bdev_get_bdevs 00:12:15.419 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@134 -- # jq -r '.[].name' 00:12:15.419 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_get_bdevs 00:12:15.419 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@134 -- # sort 00:12:15.419 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@134 -- # xargs 00:12:15.678 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@134 -- # [[ nvme0n1 nvme1n2 == \n\v\m\e\0\n\1\ \n\v\m\e\1\n\2 ]] 00:12:15.678 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@135 -- # hostrpc bdev_get_bdevs -b nvme0n1 00:12:15.678 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@135 -- # jq -r '.[].uuid' 00:12:15.678 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_get_bdevs -b nvme0n1 00:12:15.937 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@135 -- # [[ f463ca91-4bad-45f1-89a1-e067bfceda08 == \f\4\6\3\c\a\9\1\-\4\b\a\d\-\4\5\f\1\-\8\9\a\1\-\e\0\6\7\b\f\c\e\d\a\0\8 ]] 00:12:15.937 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@136 -- # hostrpc bdev_get_bdevs -b nvme1n2 00:12:15.937 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@136 -- # jq -r '.[].uuid' 00:12:15.937 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_get_bdevs -b nvme1n2 00:12:16.196 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@136 -- # [[ 2698ff05-9cbd-46e3-9b71-8c9723b34d22 == \2\6\9\8\f\f\0\5\-\9\c\b\d\-\4\6\e\3\-\9\b\7\1\-\8\c\9\7\2\3\b\3\4\d\2\2 ]] 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@138 -- # killprocess 1763814 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@950 -- # '[' -z 1763814 ']' 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@954 -- # kill -0 1763814 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@955 -- # uname 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1763814 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- 
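The hostrpc wrapper used above simply forwards to rpc.py with -s /var/tmp/host.sock, so the host-side SPDK app attaches to the target once per host NQN and then lists what it can see. The same steps, with flags taken from the trace:

  rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -b nvme0 -t rdma -a 10.0.0.2 -f ipv4 -s 4420 \
      -n nqn.2016-06.io.spdk:cnode1 -q nqn.2016-06.io.spdk:host1
  rpc.py -s /var/tmp/host.sock bdev_get_bdevs | jq -r '.[].name'              # nvme0n1, then nvme1n2 after the host2 attach
  rpc.py -s /var/tmp/host.sock bdev_get_bdevs -b nvme0n1 | jq -r '.[].uuid'   # compared against the nsid-1 UUID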
common/autotest_common.sh@956 -- # process_name=reactor_1 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1763814' 00:12:16.197 killing process with pid 1763814 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@969 -- # kill 1763814 00:12:16.197 15:17:17 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@974 -- # wait 1763814 00:12:16.456 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@139 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@141 -- # trap - SIGINT SIGTERM EXIT 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- target/ns_masking.sh@142 -- # nvmftestfini 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@331 -- # nvmfcleanup 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@99 -- # sync 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@102 -- # set +e 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@103 -- # for i in {1..20} 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:12:16.714 rmmod nvme_rdma 00:12:16.714 rmmod nvme_fabrics 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@106 -- # set -e 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@107 -- # return 0 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@332 -- # '[' -n 1761941 ']' 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@333 -- # killprocess 1761941 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@950 -- # '[' -z 1761941 ']' 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@954 -- # kill -0 1761941 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@955 -- # uname 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1761941 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1761941' 00:12:16.714 killing process with pid 1761941 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- 
common/autotest_common.sh@969 -- # kill 1761941 00:12:16.714 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@974 -- # wait 1761941 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@338 -- # nvmf_fini 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@264 -- # local dev 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@267 -- # remove_target_ns 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@22 -- # _remove_target_ns 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@268 -- # delete_main_bridge 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@130 -- # return 0 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:12:16.973 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@41 -- # _dev=0 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@41 -- # dev_map=() 00:12:17.233 15:17:18 
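nvmf_fini above walks the dev_map entries and clears whatever addresses the test assigned to the mlx ports; reduced to the two interfaces seen in this run:

  for dev in mlx_0_0 mlx_0_1; do
      # only flush interfaces that actually exist on this node
      [[ -e /sys/class/net/$dev/address ]] && ip addr flush dev "$dev"
  done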
nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/setup.sh@284 -- # iptr 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@538 -- # iptables-save 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- nvmf/common.sh@538 -- # iptables-restore 00:12:17.233 00:12:17.233 real 0m24.544s 00:12:17.233 user 0m28.192s 00:12:17.233 sys 0m7.836s 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_ns_masking -- common/autotest_common.sh@10 -- # set +x 00:12:17.233 ************************************ 00:12:17.233 END TEST nvmf_ns_masking 00:12:17.233 ************************************ 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@27 -- # [[ 1 -eq 1 ]] 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@28 -- # run_test nvmf_nvme_cli /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvme_cli.sh --transport=rdma 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:12:17.233 ************************************ 00:12:17.233 START TEST nvmf_nvme_cli 00:12:17.233 ************************************ 00:12:17.233 15:17:18 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/nvme_cli.sh --transport=rdma 00:12:17.233 * Looking for test storage... 
00:12:17.233 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:12:17.233 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:12:17.233 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1681 -- # lcov --version 00:12:17.233 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@333 -- # local ver1 ver1_l 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@334 -- # local ver2 ver2_l 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@336 -- # IFS=.-: 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@336 -- # read -ra ver1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@337 -- # IFS=.-: 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@337 -- # read -ra ver2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@338 -- # local 'op=<' 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@340 -- # ver1_l=2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@341 -- # ver2_l=1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@344 -- # case "$op" in 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@345 -- # : 1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@364 -- # (( v = 0 )) 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@365 -- # decimal 1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@353 -- # local d=1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@355 -- # echo 1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@365 -- # ver1[v]=1 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@366 -- # decimal 2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@353 -- # local d=2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@355 -- # echo 2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@366 -- # ver2[v]=2 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:12:17.493 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@368 -- # return 0 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:12:17.494 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:17.494 --rc genhtml_branch_coverage=1 00:12:17.494 --rc genhtml_function_coverage=1 00:12:17.494 --rc genhtml_legend=1 00:12:17.494 --rc geninfo_all_blocks=1 00:12:17.494 --rc geninfo_unexecuted_blocks=1 00:12:17.494 00:12:17.494 ' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:12:17.494 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:17.494 --rc genhtml_branch_coverage=1 00:12:17.494 --rc genhtml_function_coverage=1 00:12:17.494 --rc genhtml_legend=1 00:12:17.494 --rc geninfo_all_blocks=1 00:12:17.494 --rc geninfo_unexecuted_blocks=1 00:12:17.494 00:12:17.494 ' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:12:17.494 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:17.494 --rc genhtml_branch_coverage=1 00:12:17.494 --rc genhtml_function_coverage=1 00:12:17.494 --rc genhtml_legend=1 00:12:17.494 --rc geninfo_all_blocks=1 00:12:17.494 --rc geninfo_unexecuted_blocks=1 00:12:17.494 00:12:17.494 ' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:12:17.494 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:17.494 --rc genhtml_branch_coverage=1 00:12:17.494 --rc genhtml_function_coverage=1 00:12:17.494 --rc genhtml_legend=1 00:12:17.494 --rc geninfo_all_blocks=1 00:12:17.494 --rc geninfo_unexecuted_blocks=1 00:12:17.494 00:12:17.494 ' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@7 -- # 
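The lt/cmp_versions dance above decides whether the installed lcov is older than 2 so the matching coverage options get exported. scripts/common.sh splits on '.', '-' and ':' and compares field by field; a simplified dot-only sketch of the same idea:

  ver_lt() {
      local -a a b; local i
      IFS=. read -ra a <<< "$1"
      IFS=. read -ra b <<< "$2"
      for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
          (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
          (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
      done
      return 1   # equal versions are not less-than
  }
  ver_lt 1.15 2 && echo "lcov < 2: use the legacy --rc lcov_* option spelling"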
uname -s 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@15 -- # shopt -s extglob 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- paths/export.sh@5 -- # export PATH 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@50 -- # : 0 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:12:17.494 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli 
-- nvmf/common.sh@35 -- # '[' -n '' ']' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@54 -- # have_pci_nics=0 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@11 -- # MALLOC_BDEV_SIZE=64 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@14 -- # devs=() 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@16 -- # nvmftestinit 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@292 -- # prepare_net_devs 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@254 -- # local -g is_hw=no 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@256 -- # remove_target_ns 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@22 -- # _remove_target_ns 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@125 -- # xtrace_disable 00:12:17.494 15:17:19 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@131 -- # pci_devs=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@131 -- # local -a pci_devs 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@132 -- # pci_net_devs=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@133 -- # pci_drivers=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@133 -- # local -A pci_drivers 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@135 -- # net_devs=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@135 -- # local -ga net_devs 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@136 -- # e810=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@136 -- # local -ga e810 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@137 -- # x722=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@137 -- # local -ga x722 00:12:24.206 15:17:25 
nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@138 -- # mlx=() 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@138 -- # local -ga mlx 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:12:24.206 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:12:24.207 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@180 -- # for 
pci in "${pci_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:12:24.207 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:12:24.207 Found net devices under 0000:18:00.0: mlx_0_0 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:12:24.207 Found net devices under 0000:18:00.1: mlx_0_1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@249 -- # get_rdma_if_list 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@75 -- # rdma_devs=() 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs 
rdma_devs 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@89 -- # continue 2 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@89 -- # continue 2 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@258 -- # is_hw=yes 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@61 -- # uname 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@65 -- # modprobe ib_cm 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@66 -- # modprobe ib_core 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@67 -- # modprobe ib_umad 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@68 -- # modprobe 
ib_uverbs 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@69 -- # modprobe iw_cm 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@27 -- # local -gA dev_map 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@28 -- # local -g _dev 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@44 -- # ips=() 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@58 -- # key_initiator=target1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@11 -- # local val=167772161 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@13 -- # printf 
'%u.%u.%u.%u\n' 10 0 0 1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:12:24.207 10.0.0.1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@11 -- # local val=167772162 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:12:24.207 10.0.0.2 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 
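[Annotation] The interface bring-up traced above reduces to a handful of commands. This is a condensed restatement assembled from the eval'd lines already visible in the trace (device names and addresses copied from the log); the authoritative logic lives in the setup_interface_pair/set_ip/set_up helpers of test/nvmf/setup.sh, and this sketch omits their namespace handling:
    ip addr add 10.0.0.1/24 dev mlx_0_0                   # initiator-side port
    echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias    # record the test IP as the ifalias
    ip addr add 10.0.0.2/24 dev mlx_0_1                   # target-side port
    echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
    ip link set mlx_0_0 up
    ip link set mlx_0_1 up
The ifalias files are what the get_ip_address helpers read back later in the trace when they resolve NVMF_FIRST_TARGET_IP and friends.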
00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:12:24.207 15:17:25 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@38 -- # ping_ips 1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@107 -- # local dev=target0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:12:24.207 15:17:26 
nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:12:24.207 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:12:24.207 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.037 ms 00:12:24.207 00:12:24.207 --- 10.0.0.2 ping statistics --- 00:12:24.207 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:12:24.207 rtt min/avg/max/mdev = 0.037/0.037/0.037/0.000 ms 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@107 -- # local dev=target0 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:12:24.207 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:12:24.207 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:12:24.207 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.029 ms 00:12:24.207 00:12:24.207 --- 10.0.0.2 ping statistics --- 00:12:24.207 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:12:24.207 rtt min/avg/max/mdev = 0.029/0.029/0.029/0.000 ms 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@98 -- # (( pair++ )) 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@266 -- # return 0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@107 -- # local dev=target0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:12:24.467 15:17:26 
nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # get_net_dev target1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@107 -- # local dev=target1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@107 -- # local dev=target0 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 
00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # get_net_dev target1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@107 -- # local dev=target1 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:12:24.467 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@17 -- # nvmfappstart -m 0xF 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@724 -- # xtrace_disable 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@324 -- # nvmfpid=1767418 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@325 -- # waitforlisten 1767418 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@831 -- # '[' -z 1767418 ']' 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- 
common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:24.468 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:24.468 15:17:26 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:24.468 [2024-09-27 15:17:26.227733] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:12:24.468 [2024-09-27 15:17:26.227796] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:24.468 [2024-09-27 15:17:26.313917] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:12:24.727 [2024-09-27 15:17:26.402848] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:12:24.727 [2024-09-27 15:17:26.402888] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:12:24.727 [2024-09-27 15:17:26.402898] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:12:24.727 [2024-09-27 15:17:26.402906] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:12:24.727 [2024-09-27 15:17:26.402913] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:12:24.727 [2024-09-27 15:17:26.402968] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:12:24.727 [2024-09-27 15:17:26.403071] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:12:24.727 [2024-09-27 15:17:26.403171] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.727 [2024-09-27 15:17:26.403172] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@864 -- # return 0 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@730 -- # xtrace_disable 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@19 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.293 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 [2024-09-27 15:17:27.167965] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x193c4a0/0x1940990) succeed. 00:12:25.551 [2024-09-27 15:17:27.178443] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x193dae0/0x1982030) succeed. 
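[Annotation] The target bring-up traced above (nvmfappstart followed by the first rpc_cmd call) corresponds roughly to the following, with the binary path and arguments taken verbatim from the log. Assumption: rpc_cmd is the test framework's wrapper around scripts/rpc.py talking to /var/tmp/spdk.sock; the wrapper itself is defined outside this excerpt:
    # start the NVMe-oF target application (flags as recorded in the trace)
    /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
    # wait until it listens on /var/tmp/spdk.sock, then create the RDMA transport
    scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
The bdev_malloc_create, nvmf_create_subsystem, nvmf_subsystem_add_ns and nvmf_subsystem_add_listener calls that follow in the trace go through the same rpc_cmd mechanism.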
00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@21 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 Malloc0 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@22 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc1 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 Malloc1 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@24 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDKISFASTANDAWESOME -d SPDK_Controller1 -i 291 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@26 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@27 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.551 [2024-09-27 15:17:27.392801] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@28 -- # rpc_cmd nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli 
-- common/autotest_common.sh@561 -- # xtrace_disable 00:12:25.551 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@30 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -a 10.0.0.2 -s 4420 00:12:25.810 00:12:25.810 Discovery Log Number of Records 2, Generation counter 2 00:12:25.810 =====Discovery Log Entry 0====== 00:12:25.810 trtype: rdma 00:12:25.810 adrfam: ipv4 00:12:25.810 subtype: current discovery subsystem 00:12:25.810 treq: not required 00:12:25.810 portid: 0 00:12:25.810 trsvcid: 4420 00:12:25.810 subnqn: nqn.2014-08.org.nvmexpress.discovery 00:12:25.810 traddr: 10.0.0.2 00:12:25.810 eflags: explicit discovery connections, duplicate discovery information 00:12:25.810 rdma_prtype: not specified 00:12:25.810 rdma_qptype: connected 00:12:25.810 rdma_cms: rdma-cm 00:12:25.810 rdma_pkey: 0x0000 00:12:25.810 =====Discovery Log Entry 1====== 00:12:25.810 trtype: rdma 00:12:25.810 adrfam: ipv4 00:12:25.810 subtype: nvme subsystem 00:12:25.810 treq: not required 00:12:25.810 portid: 0 00:12:25.810 trsvcid: 4420 00:12:25.810 subnqn: nqn.2016-06.io.spdk:cnode1 00:12:25.810 traddr: 10.0.0.2 00:12:25.810 eflags: none 00:12:25.810 rdma_prtype: not specified 00:12:25.810 rdma_qptype: connected 00:12:25.810 rdma_cms: rdma-cm 00:12:25.810 rdma_pkey: 0x0000 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@31 -- # devs=($(get_nvme_devs)) 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@31 -- # get_nvme_devs 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@358 -- # local dev _ 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@357 -- # nvme list 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ Node == /dev/nvme* ]] 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ --------------------- == /dev/nvme* ]] 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@31 -- # nvme_num_before_connection=0 00:12:25.810 15:17:27 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@32 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:12:26.747 15:17:28 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@34 -- # waitforserial SPDKISFASTANDAWESOME 2 00:12:26.747 15:17:28 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1198 -- # local i=0 00:12:26.747 15:17:28 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1199 -- # local nvme_device_counter=1 nvme_devices=0 00:12:26.747 15:17:28 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1200 -- # [[ -n 2 ]] 00:12:26.747 
15:17:28 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1201 -- # nvme_device_counter=2 00:12:26.747 15:17:28 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1205 -- # sleep 2 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1206 -- # (( i++ <= 15 )) 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1207 -- # lsblk -l -o NAME,SERIAL 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1207 -- # grep -c SPDKISFASTANDAWESOME 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1207 -- # nvme_devices=2 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1208 -- # (( nvme_devices == nvme_device_counter )) 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1208 -- # return 0 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@35 -- # get_nvme_devs 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@358 -- # local dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@357 -- # nvme list 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ Node == /dev/nvme* ]] 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ --------------------- == /dev/nvme* ]] 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ /dev/nvme0n1 == /dev/nvme* ]] 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@362 -- # echo /dev/nvme0n1 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ /dev/nvme0n2 == /dev/nvme* ]] 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@362 -- # echo /dev/nvme0n2 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@35 -- # [[ -z /dev/nvme0n1 00:12:29.282 /dev/nvme0n2 ]] 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@59 -- # devs=($(get_nvme_devs)) 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@59 -- # get_nvme_devs 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@358 -- # local dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.282 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@357 -- # nvme list 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ Node == /dev/nvme* ]] 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ --------------------- == 
/dev/nvme* ]] 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ /dev/nvme0n1 == /dev/nvme* ]] 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@362 -- # echo /dev/nvme0n1 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@361 -- # [[ /dev/nvme0n2 == /dev/nvme* ]] 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@362 -- # echo /dev/nvme0n2 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@360 -- # read -r dev _ 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@59 -- # nvme_num=2 00:12:29.283 15:17:30 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@60 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:12:29.851 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@61 -- # waitforserial_disconnect SPDKISFASTANDAWESOME 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1219 -- # local i=0 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1220 -- # grep -q -w SPDKISFASTANDAWESOME 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1227 -- # grep -q -w SPDKISFASTANDAWESOME 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1231 -- # return 0 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@62 -- # (( nvme_num <= nvme_num_before_connection )) 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@67 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@68 -- # trap - SIGINT SIGTERM EXIT 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- target/nvme_cli.sh@70 -- # nvmftestfini 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@331 -- # nvmfcleanup 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@99 -- # sync 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@102 -- # set +e 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@103 -- # for i in {1..20} 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- 
nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:12:29.851 rmmod nvme_rdma 00:12:29.851 rmmod nvme_fabrics 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@106 -- # set -e 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@107 -- # return 0 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@332 -- # '[' -n 1767418 ']' 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@333 -- # killprocess 1767418 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@950 -- # '[' -z 1767418 ']' 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@954 -- # kill -0 1767418 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@955 -- # uname 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:29.851 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1767418 00:12:30.111 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:30.111 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:30.111 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1767418' 00:12:30.111 killing process with pid 1767418 00:12:30.111 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@969 -- # kill 1767418 00:12:30.111 15:17:31 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@974 -- # wait 1767418 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@338 -- # nvmf_fini 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@264 -- # local dev 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@267 -- # remove_target_ns 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@22 -- # _remove_target_ns 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@268 -- # delete_main_bridge 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@130 -- # return 0 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:12:30.370 15:17:32 
nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@41 -- # _dev=0 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@41 -- # dev_map=() 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/setup.sh@284 -- # iptr 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@538 -- # iptables-save 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- nvmf/common.sh@538 -- # iptables-restore 00:12:30.370 00:12:30.370 real 0m13.172s 00:12:30.370 user 0m24.670s 00:12:30.370 sys 0m6.000s 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_nvme_cli -- common/autotest_common.sh@10 -- # set +x 00:12:30.370 ************************************ 00:12:30.370 END TEST nvmf_nvme_cli 00:12:30.370 ************************************ 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@30 -- # [[ 0 -eq 1 ]] 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@37 -- # run_test nvmf_auth_target /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/auth.sh --transport=rdma 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:12:30.370 ************************************ 00:12:30.370 START TEST nvmf_auth_target 00:12:30.370 ************************************ 00:12:30.370 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/auth.sh --transport=rdma 00:12:30.630 * Looking for test storage... 
00:12:30.630 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1681 -- # lcov --version 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@333 -- # local ver1 ver1_l 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@334 -- # local ver2 ver2_l 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@336 -- # IFS=.-: 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@336 -- # read -ra ver1 00:12:30.630 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@337 -- # IFS=.-: 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@337 -- # read -ra ver2 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@338 -- # local 'op=<' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@340 -- # ver1_l=2 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@341 -- # ver2_l=1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@344 -- # case "$op" in 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@345 -- # : 1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@364 -- # (( v = 0 )) 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@365 -- # decimal 1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@353 -- # local d=1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@355 -- # echo 1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@365 -- # ver1[v]=1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@366 -- # decimal 2 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@353 -- # local d=2 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@355 -- # echo 2 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@366 -- # ver2[v]=2 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@368 -- # return 0 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:12:30.631 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:30.631 --rc genhtml_branch_coverage=1 00:12:30.631 --rc genhtml_function_coverage=1 00:12:30.631 --rc genhtml_legend=1 00:12:30.631 --rc geninfo_all_blocks=1 00:12:30.631 --rc geninfo_unexecuted_blocks=1 00:12:30.631 00:12:30.631 ' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:12:30.631 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:30.631 --rc genhtml_branch_coverage=1 00:12:30.631 --rc genhtml_function_coverage=1 00:12:30.631 --rc genhtml_legend=1 00:12:30.631 --rc geninfo_all_blocks=1 00:12:30.631 --rc geninfo_unexecuted_blocks=1 00:12:30.631 00:12:30.631 ' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:12:30.631 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:30.631 --rc genhtml_branch_coverage=1 00:12:30.631 --rc genhtml_function_coverage=1 00:12:30.631 --rc genhtml_legend=1 00:12:30.631 --rc geninfo_all_blocks=1 00:12:30.631 --rc geninfo_unexecuted_blocks=1 00:12:30.631 00:12:30.631 ' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:12:30.631 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:12:30.631 --rc genhtml_branch_coverage=1 00:12:30.631 --rc genhtml_function_coverage=1 00:12:30.631 --rc genhtml_legend=1 00:12:30.631 --rc geninfo_all_blocks=1 00:12:30.631 --rc geninfo_unexecuted_blocks=1 00:12:30.631 00:12:30.631 ' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@10 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:12:30.631 15:17:32 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@7 -- # uname -s 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@15 -- # shopt -s extglob 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- paths/export.sh@5 -- # export PATH 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@50 -- # : 0 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:12:30.631 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:12:30.632 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:12:30.632 15:17:32 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@54 -- # have_pci_nics=0 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@13 -- # digests=("sha256" "sha384" "sha512") 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@14 -- # dhgroups=("null" "ffdhe2048" "ffdhe3072" "ffdhe4096" "ffdhe6144" "ffdhe8192") 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@15 -- # subnqn=nqn.2024-03.io.spdk:cnode0 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@16 -- # hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@17 -- # hostsock=/var/tmp/host.sock 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@18 -- # keys=() 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@18 -- # ckeys=() 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@86 -- # nvmftestinit 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@292 -- # prepare_net_devs 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@254 -- # local -g is_hw=no 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@256 -- # remove_target_ns 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@22 -- # _remove_target_ns 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@125 -- # xtrace_disable 00:12:30.632 15:17:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@131 -- # pci_devs=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@131 -- # local -a pci_devs 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@132 -- # pci_net_devs=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@133 -- # pci_drivers=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@133 -- # 
local -A pci_drivers 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@135 -- # net_devs=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@135 -- # local -ga net_devs 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@136 -- # e810=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@136 -- # local -ga e810 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@137 -- # x722=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@137 -- # local -ga x722 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@138 -- # mlx=() 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@138 -- # local -ga mlx 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:12:38.760 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:12:38.760 15:17:39 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:12:38.760 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:12:38.760 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:12:38.761 Found net devices under 0000:18:00.0: mlx_0_0 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:12:38.761 
15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:12:38.761 Found net devices under 0000:18:00.1: mlx_0_1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@249 -- # get_rdma_if_list 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@75 -- # rdma_devs=() 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@89 -- # continue 2 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@89 -- # continue 2 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@258 -- # is_hw=yes 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:12:38.761 
15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@61 -- # uname 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@65 -- # modprobe ib_cm 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@66 -- # modprobe ib_core 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@67 -- # modprobe ib_umad 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@69 -- # modprobe iw_cm 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@27 -- # local -gA dev_map 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@28 -- # local -g _dev 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@44 -- # ips=() 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@58 -- # key_initiator=target1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:12:38.761 15:17:39 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@11 -- # local val=167772161 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:12:38.761 10.0.0.1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@11 -- # local val=167772162 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:12:38.761 10.0.0.2 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:12:38.761 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@38 -- # ping_ips 1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@107 -- # local dev=target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 
-- # dev=mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:12:38.762 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:12:38.762 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:12:38.762 00:12:38.762 --- 10.0.0.2 ping statistics --- 00:12:38.762 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:12:38.762 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@107 -- # local dev=target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:38.762 15:17:39 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:12:38.762 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:12:38.762 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.025 ms 00:12:38.762 00:12:38.762 --- 10.0.0.2 ping statistics --- 00:12:38.762 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:12:38.762 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@98 -- # (( pair++ )) 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@266 -- # return 0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@107 -- # local dev=target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:38.762 15:17:39 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # get_net_dev target1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@107 -- # local dev=target1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@107 -- # local dev=target0 00:12:38.762 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:12:38.763 15:17:39 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # get_net_dev target1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@107 -- # local dev=target1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@87 -- # nvmfappstart -L nvmf_auth 00:12:38.763 15:17:39 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@724 -- # xtrace_disable 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@324 -- # nvmfpid=1771121 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@325 -- # waitforlisten 1771121 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -L nvmf_auth 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@831 -- # '[' -z 1771121 ']' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:38.763 15:17:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@864 -- # return 0 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@730 -- # xtrace_disable 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@89 -- # hostpid=1771309 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@88 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_tgt -m 2 -r /var/tmp/host.sock -L nvme_auth 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@91 -- # trap 'dumplogs; cleanup' SIGINT SIGTERM EXIT 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@94 -- # gen_dhchap_key null 48 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=null 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=48 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
nvmf/common.sh@525 -- # xxd -p -c0 -l 24 /dev/urandom 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=34ba925fbc6597b659b753ecf3c5f9ba304357b959083532 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-null.XXX 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-null.NVj 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key 34ba925fbc6597b659b753ecf3c5f9ba304357b959083532 0 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 34ba925fbc6597b659b753ecf3c5f9ba304357b959083532 0 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=34ba925fbc6597b659b753ecf3c5f9ba304357b959083532 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=0 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-null.NVj 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-null.NVj 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@94 -- # keys[0]=/tmp/spdk.key-null.NVj 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@94 -- # gen_dhchap_key sha512 64 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=sha512 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=64 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # xxd -p -c0 -l 32 /dev/urandom 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=bad9cb2fcf9184debc7ba2b9d887230384b2292aa1bf168ca65a317917ee1551 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha512.XXX 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha512.Dgs 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key bad9cb2fcf9184debc7ba2b9d887230384b2292aa1bf168ca65a317917ee1551 3 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 bad9cb2fcf9184debc7ba2b9d887230384b2292aa1bf168ca65a317917ee1551 3 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # 
prefix=DHHC-1 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=bad9cb2fcf9184debc7ba2b9d887230384b2292aa1bf168ca65a317917ee1551 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=3 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha512.Dgs 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha512.Dgs 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@94 -- # ckeys[0]=/tmp/spdk.key-sha512.Dgs 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@95 -- # gen_dhchap_key sha256 32 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=sha256 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=32 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # xxd -p -c0 -l 16 /dev/urandom 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=48cf1c252fae9e1e1e17921b73bfe768 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha256.XXX 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha256.1MK 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key 48cf1c252fae9e1e1e17921b73bfe768 1 00:12:38.763 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 48cf1c252fae9e1e1e17921b73bfe768 1 00:12:38.764 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:38.764 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:12:38.764 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=48cf1c252fae9e1e1e17921b73bfe768 00:12:38.764 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=1 00:12:38.764 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha256.1MK 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha256.1MK 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@95 -- # keys[1]=/tmp/spdk.key-sha256.1MK 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@95 -- # gen_dhchap_key sha384 48 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- 
# digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=sha384 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=48 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # xxd -p -c0 -l 24 /dev/urandom 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=656ec8ae125de0cae7bc059cd3defc5b265860e1ad0de7ba 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha384.XXX 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha384.yGT 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key 656ec8ae125de0cae7bc059cd3defc5b265860e1ad0de7ba 2 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 656ec8ae125de0cae7bc059cd3defc5b265860e1ad0de7ba 2 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=656ec8ae125de0cae7bc059cd3defc5b265860e1ad0de7ba 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=2 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:39.023 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha384.yGT 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha384.yGT 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@95 -- # ckeys[1]=/tmp/spdk.key-sha384.yGT 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@96 -- # gen_dhchap_key sha384 48 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=sha384 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=48 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # xxd -p -c0 -l 24 /dev/urandom 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=73588eb1fecd9cca3c01fe0023add858735427a2aaa815a4 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha384.XXX 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha384.qrk 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key 
73588eb1fecd9cca3c01fe0023add858735427a2aaa815a4 2 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 73588eb1fecd9cca3c01fe0023add858735427a2aaa815a4 2 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=73588eb1fecd9cca3c01fe0023add858735427a2aaa815a4 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=2 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha384.qrk 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha384.qrk 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@96 -- # keys[2]=/tmp/spdk.key-sha384.qrk 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@96 -- # gen_dhchap_key sha256 32 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=sha256 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=32 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # xxd -p -c0 -l 16 /dev/urandom 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=33169a2dc67b0fe4a7a348887be60db3 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha256.XXX 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha256.GOT 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key 33169a2dc67b0fe4a7a348887be60db3 1 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 33169a2dc67b0fe4a7a348887be60db3 1 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=33169a2dc67b0fe4a7a348887be60db3 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=1 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha256.GOT 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha256.GOT 00:12:39.024 15:17:40 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@96 -- # ckeys[2]=/tmp/spdk.key-sha256.GOT 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@97 -- # gen_dhchap_key sha512 64 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@521 -- # local digest len file key 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@522 -- # local -A digests 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # digest=sha512 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@524 -- # len=64 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # xxd -p -c0 -l 32 /dev/urandom 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@525 -- # key=ef14658e8ec755d6929b888b611914b391facf081bd01c6978c9ad799cf620df 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha512.XXX 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha512.dEA 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@527 -- # format_dhchap_key ef14658e8ec755d6929b888b611914b391facf081bd01c6978c9ad799cf620df 3 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@517 -- # format_key DHHC-1 ef14658e8ec755d6929b888b611914b391facf081bd01c6978c9ad799cf620df 3 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@500 -- # local prefix key digest 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # key=ef14658e8ec755d6929b888b611914b391facf081bd01c6978c9ad799cf620df 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@502 -- # digest=3 00:12:39.024 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@503 -- # python - 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha512.dEA 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha512.dEA 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@97 -- # keys[3]=/tmp/spdk.key-sha512.dEA 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@97 -- # ckeys[3]= 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@99 -- # waitforlisten 1771121 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@831 -- # '[' -z 1771121 ']' 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
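[annotation] The gen_dhchap_key calls traced above read len/2 random bytes with xxd, pick a /tmp/spdk.key-<digest>.XXX file via mktemp, and pass the hex string plus a digest id (0 = null, 1 = sha256, 2 = sha384, 3 = sha512) to format_dhchap_key, which emits the DHHC-1 secret later seen in the nvme connect commands. Below is a minimal sketch of that last step, assuming the secret is base64(ASCII hex key || 4-byte CRC-32, little-endian) as in the usual DH-HMAC-CHAP secret representation; the CRC detail is an assumption, not something read from this log, and the helper name is illustrative.

format_dhchap_secret() {
    # $1 = hex key string (e.g. output of "xxd -p -c0 -l <len/2> /dev/urandom")
    # $2 = digest id as used above: 0 null, 1 sha256, 2 sha384, 3 sha512
    python3 - "$1" "$2" <<'PYEOF'
import base64, sys, zlib
key = sys.argv[1].encode()                    # the ASCII hex string itself is the secret material
crc = zlib.crc32(key).to_bytes(4, "little")   # assumed 4-byte CRC-32 suffix
print("DHHC-1:%02x:%s:" % (int(sys.argv[2]), base64.b64encode(key + crc).decode()))
PYEOF
}

# e.g. format_dhchap_secret 48cf1c252fae9e1e1e17921b73bfe768 1 > /tmp/spdk.key-sha256.1MK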
00:12:39.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:39.283 15:17:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@864 -- # return 0 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@100 -- # waitforlisten 1771309 /var/tmp/host.sock 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@831 -- # '[' -z 1771309 ']' 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/host.sock 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/host.sock...' 00:12:39.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/host.sock... 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:39.283 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:39.541 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:39.541 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@864 -- # return 0 00:12:39.541 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@101 -- # rpc_cmd 00:12:39.541 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:39.541 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@108 -- # for i in "${!keys[@]}" 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@109 -- # rpc_cmd keyring_file_add_key key0 /tmp/spdk.key-null.NVj 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@110 -- # hostrpc keyring_file_add_key key0 /tmp/spdk.key-null.NVj 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key key0 /tmp/spdk.key-null.NVj 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@111 -- # [[ -n /tmp/spdk.key-sha512.Dgs ]] 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@112 -- # rpc_cmd keyring_file_add_key ckey0 
/tmp/spdk.key-sha512.Dgs 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:39.800 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:39.801 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:39.801 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@113 -- # hostrpc keyring_file_add_key ckey0 /tmp/spdk.key-sha512.Dgs 00:12:39.801 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key ckey0 /tmp/spdk.key-sha512.Dgs 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@108 -- # for i in "${!keys[@]}" 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@109 -- # rpc_cmd keyring_file_add_key key1 /tmp/spdk.key-sha256.1MK 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@110 -- # hostrpc keyring_file_add_key key1 /tmp/spdk.key-sha256.1MK 00:12:40.059 15:17:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key key1 /tmp/spdk.key-sha256.1MK 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@111 -- # [[ -n /tmp/spdk.key-sha384.yGT ]] 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@112 -- # rpc_cmd keyring_file_add_key ckey1 /tmp/spdk.key-sha384.yGT 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@113 -- # hostrpc keyring_file_add_key ckey1 /tmp/spdk.key-sha384.yGT 00:12:40.318 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key ckey1 /tmp/spdk.key-sha384.yGT 00:12:40.578 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@108 -- # for i in "${!keys[@]}" 00:12:40.578 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@109 -- # rpc_cmd keyring_file_add_key key2 /tmp/spdk.key-sha384.qrk 00:12:40.578 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:40.578 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:40.578 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:40.578 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@110 -- # hostrpc keyring_file_add_key key2 /tmp/spdk.key-sha384.qrk 00:12:40.578 
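[annotation] Each generated key file is registered twice under the same key name: once on the target through rpc_cmd (default /var/tmp/spdk.sock) and once on the host-side application through hostrpc (-s /var/tmp/host.sock). A condensed sketch of that pairing, using the key names and file paths that appear in this run:

rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py

# same key material, same key name, registered on both sides of the connection
for pair in key0=/tmp/spdk.key-null.NVj ckey0=/tmp/spdk.key-sha512.Dgs \
            key1=/tmp/spdk.key-sha256.1MK ckey1=/tmp/spdk.key-sha384.yGT; do
    $rpc keyring_file_add_key "${pair%%=*}" "${pair#*=}"                       # target side
    $rpc -s /var/tmp/host.sock keyring_file_add_key "${pair%%=*}" "${pair#*=}" # host side
done
# ... and likewise for key2/ckey2 (/tmp/spdk.key-sha384.qrk, /tmp/spdk.key-sha256.GOT) and key3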
15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key key2 /tmp/spdk.key-sha384.qrk 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@111 -- # [[ -n /tmp/spdk.key-sha256.GOT ]] 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@112 -- # rpc_cmd keyring_file_add_key ckey2 /tmp/spdk.key-sha256.GOT 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@113 -- # hostrpc keyring_file_add_key ckey2 /tmp/spdk.key-sha256.GOT 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key ckey2 /tmp/spdk.key-sha256.GOT 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@108 -- # for i in "${!keys[@]}" 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@109 -- # rpc_cmd keyring_file_add_key key3 /tmp/spdk.key-sha512.dEA 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@110 -- # hostrpc keyring_file_add_key key3 /tmp/spdk.key-sha512.dEA 00:12:40.837 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock keyring_file_add_key key3 /tmp/spdk.key-sha512.dEA 00:12:41.097 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@111 -- # [[ -n '' ]] 00:12:41.097 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@118 -- # for digest in "${digests[@]}" 00:12:41.097 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:12:41.097 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:41.097 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:41.097 15:17:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 null 0 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@67 -- # dhgroup=null 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:41.357 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:41.616 00:12:41.616 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:41.616 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:41.616 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:41.875 { 00:12:41.875 "cntlid": 1, 00:12:41.875 "qid": 0, 00:12:41.875 "state": "enabled", 00:12:41.875 "thread": "nvmf_tgt_poll_group_000", 00:12:41.875 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:41.875 "listen_address": { 00:12:41.875 "trtype": "RDMA", 00:12:41.875 "adrfam": "IPv4", 00:12:41.875 "traddr": "10.0.0.2", 00:12:41.875 "trsvcid": "4420" 00:12:41.875 }, 00:12:41.875 "peer_address": { 00:12:41.875 "trtype": "RDMA", 00:12:41.875 "adrfam": "IPv4", 00:12:41.875 "traddr": "10.0.0.2", 00:12:41.875 "trsvcid": "57579" 00:12:41.875 }, 00:12:41.875 
"auth": { 00:12:41.875 "state": "completed", 00:12:41.875 "digest": "sha256", 00:12:41.875 "dhgroup": "null" 00:12:41.875 } 00:12:41.875 } 00:12:41.875 ]' 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:41.875 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:42.134 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:12:42.134 15:17:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:12:42.703 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:42.963 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:42.963 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 null 1 00:12:43.222 15:17:44 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:43.222 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:43.223 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:43.223 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:43.223 15:17:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:43.482 00:12:43.482 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:43.482 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:43.482 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:43.482 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:43.741 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:43.741 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:43.741 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:43.741 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:43.741 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:43.741 { 00:12:43.741 "cntlid": 3, 00:12:43.741 "qid": 0, 00:12:43.741 "state": "enabled", 00:12:43.741 "thread": "nvmf_tgt_poll_group_000", 00:12:43.741 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:43.741 "listen_address": { 00:12:43.741 "trtype": "RDMA", 00:12:43.741 
"adrfam": "IPv4", 00:12:43.741 "traddr": "10.0.0.2", 00:12:43.741 "trsvcid": "4420" 00:12:43.741 }, 00:12:43.741 "peer_address": { 00:12:43.741 "trtype": "RDMA", 00:12:43.741 "adrfam": "IPv4", 00:12:43.741 "traddr": "10.0.0.2", 00:12:43.741 "trsvcid": "57228" 00:12:43.742 }, 00:12:43.742 "auth": { 00:12:43.742 "state": "completed", 00:12:43.742 "digest": "sha256", 00:12:43.742 "dhgroup": "null" 00:12:43.742 } 00:12:43.742 } 00:12:43.742 ]' 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:43.742 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:44.001 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:12:44.001 15:17:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:12:44.569 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:44.829 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options 
--dhchap-digests sha256 --dhchap-dhgroups null 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 null 2 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:44.829 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:45.088 00:12:45.348 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:45.348 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:45.348 15:17:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:45.348 { 00:12:45.348 "cntlid": 5, 00:12:45.348 "qid": 0, 00:12:45.348 "state": "enabled", 00:12:45.348 "thread": 
"nvmf_tgt_poll_group_000", 00:12:45.348 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:45.348 "listen_address": { 00:12:45.348 "trtype": "RDMA", 00:12:45.348 "adrfam": "IPv4", 00:12:45.348 "traddr": "10.0.0.2", 00:12:45.348 "trsvcid": "4420" 00:12:45.348 }, 00:12:45.348 "peer_address": { 00:12:45.348 "trtype": "RDMA", 00:12:45.348 "adrfam": "IPv4", 00:12:45.348 "traddr": "10.0.0.2", 00:12:45.348 "trsvcid": "38117" 00:12:45.348 }, 00:12:45.348 "auth": { 00:12:45.348 "state": "completed", 00:12:45.348 "digest": "sha256", 00:12:45.348 "dhgroup": "null" 00:12:45.348 } 00:12:45.348 } 00:12:45.348 ]' 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:45.348 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:45.607 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:12:45.607 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:45.607 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:45.607 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:45.607 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:45.866 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:12:45.866 15:17:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:46.434 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:46.434 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:46.434 
15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups null 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 null 3 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:12:46.694 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:12:46.953 00:12:46.953 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:46.953 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:46.953 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:47.214 { 00:12:47.214 "cntlid": 7, 
00:12:47.214 "qid": 0, 00:12:47.214 "state": "enabled", 00:12:47.214 "thread": "nvmf_tgt_poll_group_000", 00:12:47.214 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:47.214 "listen_address": { 00:12:47.214 "trtype": "RDMA", 00:12:47.214 "adrfam": "IPv4", 00:12:47.214 "traddr": "10.0.0.2", 00:12:47.214 "trsvcid": "4420" 00:12:47.214 }, 00:12:47.214 "peer_address": { 00:12:47.214 "trtype": "RDMA", 00:12:47.214 "adrfam": "IPv4", 00:12:47.214 "traddr": "10.0.0.2", 00:12:47.214 "trsvcid": "49445" 00:12:47.214 }, 00:12:47.214 "auth": { 00:12:47.214 "state": "completed", 00:12:47.214 "digest": "sha256", 00:12:47.214 "dhgroup": "null" 00:12:47.214 } 00:12:47.214 } 00:12:47.214 ]' 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:12:47.214 15:17:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:47.214 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:47.214 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:47.214 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:47.473 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:12:47.473 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:12:48.043 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:48.302 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc 
bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:48.302 15:17:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe2048 0 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:48.562 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:48.821 00:12:48.822 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:48.822 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:48.822 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:49.081 15:17:50 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:49.081 { 00:12:49.081 "cntlid": 9, 00:12:49.081 "qid": 0, 00:12:49.081 "state": "enabled", 00:12:49.081 "thread": "nvmf_tgt_poll_group_000", 00:12:49.081 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:49.081 "listen_address": { 00:12:49.081 "trtype": "RDMA", 00:12:49.081 "adrfam": "IPv4", 00:12:49.081 "traddr": "10.0.0.2", 00:12:49.081 "trsvcid": "4420" 00:12:49.081 }, 00:12:49.081 "peer_address": { 00:12:49.081 "trtype": "RDMA", 00:12:49.081 "adrfam": "IPv4", 00:12:49.081 "traddr": "10.0.0.2", 00:12:49.081 "trsvcid": "47655" 00:12:49.081 }, 00:12:49.081 "auth": { 00:12:49.081 "state": "completed", 00:12:49.081 "digest": "sha256", 00:12:49.081 "dhgroup": "ffdhe2048" 00:12:49.081 } 00:12:49.081 } 00:12:49.081 ]' 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:49.081 15:17:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:49.341 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:12:49.341 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:12:49.909 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:50.167 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set 
+x 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:50.167 15:17:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe2048 1 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:50.167 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:50.426 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:50.426 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:50.426 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:50.426 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:50.426 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd 
nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:50.685 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:50.944 { 00:12:50.944 "cntlid": 11, 00:12:50.944 "qid": 0, 00:12:50.944 "state": "enabled", 00:12:50.944 "thread": "nvmf_tgt_poll_group_000", 00:12:50.944 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:50.944 "listen_address": { 00:12:50.944 "trtype": "RDMA", 00:12:50.944 "adrfam": "IPv4", 00:12:50.944 "traddr": "10.0.0.2", 00:12:50.944 "trsvcid": "4420" 00:12:50.944 }, 00:12:50.944 "peer_address": { 00:12:50.944 "trtype": "RDMA", 00:12:50.944 "adrfam": "IPv4", 00:12:50.944 "traddr": "10.0.0.2", 00:12:50.944 "trsvcid": "42210" 00:12:50.944 }, 00:12:50.944 "auth": { 00:12:50.944 "state": "completed", 00:12:50.944 "digest": "sha256", 00:12:50.944 "dhgroup": "ffdhe2048" 00:12:50.944 } 00:12:50.944 } 00:12:50.944 ]' 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:50.944 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:51.203 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:12:51.203 15:17:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:12:51.771 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:51.771 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:51.771 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 
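[annotation] From this point the trace keeps cycling through the same machinery: for every digest, every DH group (null first, then ffdhe2048 as seen above), and every key index, the host is pinned to a single digest/dhgroup pair with bdev_nvme_set_options and another connect_authenticate round is run. The loop structure is visible in the "for digest/for dhgroup/for keyid" lines of the trace; the sketch below only illustrates its shape, and the array contents are illustrative rather than the full matrix exercised by the suite (keys and connect_authenticate are the suite's own names):

rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py

for digest in sha256 sha384 sha512; do
    for dhgroup in null ffdhe2048 ffdhe3072 ffdhe4096 ffdhe6144 ffdhe8192; do
        for keyid in "${!keys[@]}"; do
            # pin the host to exactly one digest/dhgroup so each negotiation is deterministic
            $rpc -s /var/tmp/host.sock bdev_nvme_set_options \
                --dhchap-digests "$digest" --dhchap-dhgroups "$dhgroup"
            connect_authenticate "$digest" "$dhgroup" "$keyid"
        done
    done
done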
00:12:51.771 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:51.771 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe2048 2 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:52.029 15:17:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:52.287 00:12:52.288 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:52.288 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:52.288 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock 
bdev_nvme_get_controllers 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:52.546 { 00:12:52.546 "cntlid": 13, 00:12:52.546 "qid": 0, 00:12:52.546 "state": "enabled", 00:12:52.546 "thread": "nvmf_tgt_poll_group_000", 00:12:52.546 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:52.546 "listen_address": { 00:12:52.546 "trtype": "RDMA", 00:12:52.546 "adrfam": "IPv4", 00:12:52.546 "traddr": "10.0.0.2", 00:12:52.546 "trsvcid": "4420" 00:12:52.546 }, 00:12:52.546 "peer_address": { 00:12:52.546 "trtype": "RDMA", 00:12:52.546 "adrfam": "IPv4", 00:12:52.546 "traddr": "10.0.0.2", 00:12:52.546 "trsvcid": "52450" 00:12:52.546 }, 00:12:52.546 "auth": { 00:12:52.546 "state": "completed", 00:12:52.546 "digest": "sha256", 00:12:52.546 "dhgroup": "ffdhe2048" 00:12:52.546 } 00:12:52.546 } 00:12:52.546 ]' 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:12:52.546 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:52.805 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:52.805 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:52.805 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:52.805 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:12:52.805 15:17:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:53.741 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 
controller(s) 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:53.741 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe2048 3 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:12:54.000 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:12:54.260 00:12:54.260 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:54.260 15:17:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:54.260 15:17:55 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:54.260 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:54.260 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:54.260 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:54.260 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:54.260 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:54.519 { 00:12:54.519 "cntlid": 15, 00:12:54.519 "qid": 0, 00:12:54.519 "state": "enabled", 00:12:54.519 "thread": "nvmf_tgt_poll_group_000", 00:12:54.519 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:54.519 "listen_address": { 00:12:54.519 "trtype": "RDMA", 00:12:54.519 "adrfam": "IPv4", 00:12:54.519 "traddr": "10.0.0.2", 00:12:54.519 "trsvcid": "4420" 00:12:54.519 }, 00:12:54.519 "peer_address": { 00:12:54.519 "trtype": "RDMA", 00:12:54.519 "adrfam": "IPv4", 00:12:54.519 "traddr": "10.0.0.2", 00:12:54.519 "trsvcid": "48659" 00:12:54.519 }, 00:12:54.519 "auth": { 00:12:54.519 "state": "completed", 00:12:54.519 "digest": "sha256", 00:12:54.519 "dhgroup": "ffdhe2048" 00:12:54.519 } 00:12:54.519 } 00:12:54.519 ]' 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:54.519 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:54.778 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:12:54.778 15:17:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:12:55.344 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:55.601 
NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe3072 0 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:55.601 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:55.602 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:55.602 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:55.602 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:12:55.859 00:12:55.859 15:17:57 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:56.117 { 00:12:56.117 "cntlid": 17, 00:12:56.117 "qid": 0, 00:12:56.117 "state": "enabled", 00:12:56.117 "thread": "nvmf_tgt_poll_group_000", 00:12:56.117 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:56.117 "listen_address": { 00:12:56.117 "trtype": "RDMA", 00:12:56.117 "adrfam": "IPv4", 00:12:56.117 "traddr": "10.0.0.2", 00:12:56.117 "trsvcid": "4420" 00:12:56.117 }, 00:12:56.117 "peer_address": { 00:12:56.117 "trtype": "RDMA", 00:12:56.117 "adrfam": "IPv4", 00:12:56.117 "traddr": "10.0.0.2", 00:12:56.117 "trsvcid": "48734" 00:12:56.117 }, 00:12:56.117 "auth": { 00:12:56.117 "state": "completed", 00:12:56.117 "digest": "sha256", 00:12:56.117 "dhgroup": "ffdhe3072" 00:12:56.117 } 00:12:56.117 } 00:12:56.117 ]' 00:12:56.117 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:56.376 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:56.376 15:17:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:56.376 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:12:56.376 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:56.376 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:56.376 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:56.376 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:56.635 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:12:56.635 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:12:57.202 15:17:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:57.202 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:12:57.202 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe3072 1 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:57.461 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:57.461 15:17:59 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:12:57.719 00:12:57.719 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:57.719 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:57.719 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:58.058 { 00:12:58.058 "cntlid": 19, 00:12:58.058 "qid": 0, 00:12:58.058 "state": "enabled", 00:12:58.058 "thread": "nvmf_tgt_poll_group_000", 00:12:58.058 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:58.058 "listen_address": { 00:12:58.058 "trtype": "RDMA", 00:12:58.058 "adrfam": "IPv4", 00:12:58.058 "traddr": "10.0.0.2", 00:12:58.058 "trsvcid": "4420" 00:12:58.058 }, 00:12:58.058 "peer_address": { 00:12:58.058 "trtype": "RDMA", 00:12:58.058 "adrfam": "IPv4", 00:12:58.058 "traddr": "10.0.0.2", 00:12:58.058 "trsvcid": "45081" 00:12:58.058 }, 00:12:58.058 "auth": { 00:12:58.058 "state": "completed", 00:12:58.058 "digest": "sha256", 00:12:58.058 "dhgroup": "ffdhe3072" 00:12:58.058 } 00:12:58.058 } 00:12:58.058 ]' 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:58.058 15:17:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:12:58.360 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret 
DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:12:58.360 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:12:58.927 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:12:59.186 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:12:59.186 15:18:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe3072 2 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:59.186 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:59.445 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:59.445 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:59.445 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # 
hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:59.445 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:12:59.703 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:12:59.703 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:12:59.703 { 00:12:59.703 "cntlid": 21, 00:12:59.703 "qid": 0, 00:12:59.703 "state": "enabled", 00:12:59.703 "thread": "nvmf_tgt_poll_group_000", 00:12:59.703 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:12:59.703 "listen_address": { 00:12:59.703 "trtype": "RDMA", 00:12:59.703 "adrfam": "IPv4", 00:12:59.703 "traddr": "10.0.0.2", 00:12:59.703 "trsvcid": "4420" 00:12:59.703 }, 00:12:59.703 "peer_address": { 00:12:59.703 "trtype": "RDMA", 00:12:59.703 "adrfam": "IPv4", 00:12:59.703 "traddr": "10.0.0.2", 00:12:59.703 "trsvcid": "41377" 00:12:59.703 }, 00:12:59.703 "auth": { 00:12:59.703 "state": "completed", 00:12:59.703 "digest": "sha256", 00:12:59.703 "dhgroup": "ffdhe3072" 00:12:59.703 } 00:12:59.703 } 00:12:59.703 ]' 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:12:59.962 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:00.220 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:00.220 15:18:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:00.787 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:01.046 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe3072 3 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:01.046 15:18:02 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:01.046 15:18:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:01.304 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:01.563 { 00:13:01.563 "cntlid": 23, 00:13:01.563 "qid": 0, 00:13:01.563 "state": "enabled", 00:13:01.563 "thread": "nvmf_tgt_poll_group_000", 00:13:01.563 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:01.563 "listen_address": { 00:13:01.563 "trtype": "RDMA", 00:13:01.563 "adrfam": "IPv4", 00:13:01.563 "traddr": "10.0.0.2", 00:13:01.563 "trsvcid": "4420" 00:13:01.563 }, 00:13:01.563 "peer_address": { 00:13:01.563 "trtype": "RDMA", 00:13:01.563 "adrfam": "IPv4", 00:13:01.563 "traddr": "10.0.0.2", 00:13:01.563 "trsvcid": "37561" 00:13:01.563 }, 00:13:01.563 "auth": { 00:13:01.563 "state": "completed", 00:13:01.563 "digest": "sha256", 00:13:01.563 "dhgroup": "ffdhe3072" 00:13:01.563 } 00:13:01.563 } 00:13:01.563 ]' 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:01.563 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:01.822 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:13:01.822 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:01.822 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:01.823 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # 
hostrpc bdev_nvme_detach_controller nvme0 00:13:01.823 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:02.081 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:02.081 15:18:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:02.649 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:02.649 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe4096 0 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # 
set +x 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:02.908 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:03.168 00:13:03.168 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:03.168 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:03.168 15:18:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:03.426 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:03.426 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:03.426 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:03.426 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:03.426 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:03.426 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:03.426 { 00:13:03.426 "cntlid": 25, 00:13:03.426 "qid": 0, 00:13:03.426 "state": "enabled", 00:13:03.426 "thread": "nvmf_tgt_poll_group_000", 00:13:03.426 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:03.426 "listen_address": { 00:13:03.427 "trtype": "RDMA", 00:13:03.427 "adrfam": "IPv4", 00:13:03.427 "traddr": "10.0.0.2", 00:13:03.427 "trsvcid": "4420" 00:13:03.427 }, 00:13:03.427 "peer_address": { 00:13:03.427 "trtype": "RDMA", 00:13:03.427 "adrfam": "IPv4", 00:13:03.427 "traddr": "10.0.0.2", 00:13:03.427 "trsvcid": "34768" 00:13:03.427 }, 00:13:03.427 "auth": { 00:13:03.427 "state": "completed", 00:13:03.427 "digest": "sha256", 00:13:03.427 "dhgroup": "ffdhe4096" 00:13:03.427 } 00:13:03.427 } 00:13:03.427 ]' 00:13:03.427 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 
00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:03.686 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:03.945 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:03.945 15:18:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:04.514 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:04.514 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe4096 1 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:04.773 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:05.031 00:13:05.031 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:05.031 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:05.031 15:18:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:05.291 { 00:13:05.291 "cntlid": 27, 00:13:05.291 "qid": 0, 00:13:05.291 "state": "enabled", 00:13:05.291 "thread": "nvmf_tgt_poll_group_000", 00:13:05.291 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:05.291 "listen_address": { 00:13:05.291 "trtype": "RDMA", 00:13:05.291 "adrfam": "IPv4", 00:13:05.291 "traddr": "10.0.0.2", 00:13:05.291 "trsvcid": "4420" 00:13:05.291 }, 00:13:05.291 "peer_address": { 00:13:05.291 "trtype": "RDMA", 00:13:05.291 "adrfam": "IPv4", 00:13:05.291 "traddr": "10.0.0.2", 00:13:05.291 "trsvcid": "35258" 00:13:05.291 }, 00:13:05.291 "auth": { 00:13:05.291 "state": "completed", 00:13:05.291 "digest": "sha256", 00:13:05.291 "dhgroup": "ffdhe4096" 00:13:05.291 } 00:13:05.291 } 00:13:05.291 ]' 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:05.291 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:05.291 15:18:07 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:05.550 15:18:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:06.488 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:06.488 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe4096 2 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:06.748 15:18:08 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:06.748 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:07.007 00:13:07.007 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:07.007 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:07.007 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:07.266 { 00:13:07.266 "cntlid": 29, 00:13:07.266 "qid": 0, 00:13:07.266 "state": "enabled", 00:13:07.266 "thread": "nvmf_tgt_poll_group_000", 00:13:07.266 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:07.266 "listen_address": { 00:13:07.266 "trtype": "RDMA", 00:13:07.266 "adrfam": "IPv4", 00:13:07.266 "traddr": "10.0.0.2", 00:13:07.266 "trsvcid": "4420" 00:13:07.266 }, 00:13:07.266 "peer_address": { 00:13:07.266 "trtype": "RDMA", 00:13:07.266 "adrfam": "IPv4", 00:13:07.266 "traddr": "10.0.0.2", 00:13:07.266 "trsvcid": "46793" 00:13:07.266 }, 00:13:07.266 "auth": { 00:13:07.266 "state": "completed", 00:13:07.266 "digest": "sha256", 00:13:07.266 "dhgroup": "ffdhe4096" 00:13:07.266 } 00:13:07.266 } 00:13:07.266 ]' 
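Condensed, each connect_authenticate pass in the trace above (and in the ffdhe6144/ffdhe8192 passes that follow) runs the same host/target RPC sequence. The sketch below only restates commands that are visible in the xtrace output; key0..key3/ckey0..ckey3 are key names registered earlier in auth.sh (not shown here), the DHHC-1 secrets are the ones printed in the trace, and the target-side rpc_cmd calls are assumed to go to the target's own default RPC socket, which the trace does not show because xtrace is disabled around them.

# Condensed sketch of one connect_authenticate iteration, as exercised above.
rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e
subnqn=nqn.2024-03.io.spdk:cnode0

# Host side: restrict bdev_nvme to one digest/dhgroup combination for this pass.
$rpc -s /var/tmp/host.sock bdev_nvme_set_options \
    --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096

# Target side: authorize the host NQN with the key pair under test
# (uses the target's default RPC socket; not shown in the trace).
$rpc nvmf_subsystem_add_host $subnqn $hostnqn \
    --dhchap-key key2 --dhchap-ctrlr-key ckey2

# Host side: attach over RDMA/IPv4 with the same key pair, then check that the
# controller came up and that the qpair completed DH-HMAC-CHAP with the
# expected digest and dhgroup.
$rpc -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 \
    -a 10.0.0.2 -s 4420 -q $hostnqn -n $subnqn -b nvme0 \
    --dhchap-key key2 --dhchap-ctrlr-key ckey2
$rpc -s /var/tmp/host.sock bdev_nvme_get_controllers | jq -r '.[].name'    # nvme0
$rpc nvmf_subsystem_get_qpairs $subnqn | jq -r '.[0].auth.digest'          # sha256
$rpc nvmf_subsystem_get_qpairs $subnqn | jq -r '.[0].auth.dhgroup'         # ffdhe4096
$rpc nvmf_subsystem_get_qpairs $subnqn | jq -r '.[0].auth.state'           # completed
$rpc -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0

# Kernel-initiator check with the matching DHHC-1 secrets (values as printed
# in the trace), then tear down before the next digest/dhgroup/key combination.
nvme connect -t rdma -a 10.0.0.2 -n $subnqn -i 1 -q $hostnqn \
    --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 \
    --dhchap-secret "DHHC-1:..." --dhchap-ctrl-secret "DHHC-1:..."
nvme disconnect -n $subnqn
$rpc nvmf_subsystem_remove_host $subnqn $hostnqn

The remaining output in this section repeats this sequence for each key (0-3) with the ffdhe6144 and ffdhe8192 DH groups, still under the sha256 digest.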
00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:07.266 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:07.267 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:07.267 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:07.267 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:07.267 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:07.267 15:18:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:07.526 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:07.526 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:08.094 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:08.353 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:08.353 15:18:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe4096 3 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:08.353 
15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:08.353 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:08.921 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:08.921 { 00:13:08.921 "cntlid": 31, 00:13:08.921 "qid": 0, 00:13:08.921 "state": "enabled", 00:13:08.921 "thread": "nvmf_tgt_poll_group_000", 00:13:08.921 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:08.921 "listen_address": { 00:13:08.921 "trtype": "RDMA", 00:13:08.921 "adrfam": "IPv4", 00:13:08.921 "traddr": "10.0.0.2", 00:13:08.921 "trsvcid": "4420" 00:13:08.921 }, 00:13:08.921 "peer_address": { 00:13:08.921 "trtype": "RDMA", 00:13:08.921 "adrfam": "IPv4", 00:13:08.921 "traddr": "10.0.0.2", 00:13:08.921 "trsvcid": "37979" 00:13:08.921 }, 00:13:08.921 "auth": { 00:13:08.921 "state": 
"completed", 00:13:08.921 "digest": "sha256", 00:13:08.921 "dhgroup": "ffdhe4096" 00:13:08.921 } 00:13:08.921 } 00:13:08.921 ]' 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:08.921 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:09.180 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:09.180 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:09.180 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:09.180 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:09.180 15:18:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:09.439 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:09.440 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:10.008 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:10.008 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:10.268 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe6144 0 00:13:10.268 15:18:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key 
ckey qpairs 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:10.268 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:10.527 00:13:10.527 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:10.527 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:10.527 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:10.786 { 00:13:10.786 "cntlid": 33, 00:13:10.786 "qid": 0, 00:13:10.786 "state": "enabled", 00:13:10.786 "thread": "nvmf_tgt_poll_group_000", 00:13:10.786 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:10.786 "listen_address": { 00:13:10.786 "trtype": "RDMA", 00:13:10.786 "adrfam": "IPv4", 00:13:10.786 "traddr": "10.0.0.2", 00:13:10.786 "trsvcid": "4420" 00:13:10.786 
}, 00:13:10.786 "peer_address": { 00:13:10.786 "trtype": "RDMA", 00:13:10.786 "adrfam": "IPv4", 00:13:10.786 "traddr": "10.0.0.2", 00:13:10.786 "trsvcid": "37463" 00:13:10.786 }, 00:13:10.786 "auth": { 00:13:10.786 "state": "completed", 00:13:10.786 "digest": "sha256", 00:13:10.786 "dhgroup": "ffdhe6144" 00:13:10.786 } 00:13:10.786 } 00:13:10.786 ]' 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:10.786 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:11.045 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:11.045 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:11.045 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:11.045 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:11.045 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:11.046 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:11.304 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:11.304 15:18:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:11.873 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:11.873 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options 
--dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe6144 1 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:12.133 15:18:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:12.701 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:12.701 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:12.701 { 00:13:12.701 "cntlid": 35, 00:13:12.701 "qid": 0, 00:13:12.701 "state": "enabled", 00:13:12.701 
"thread": "nvmf_tgt_poll_group_000", 00:13:12.701 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:12.701 "listen_address": { 00:13:12.701 "trtype": "RDMA", 00:13:12.701 "adrfam": "IPv4", 00:13:12.701 "traddr": "10.0.0.2", 00:13:12.701 "trsvcid": "4420" 00:13:12.701 }, 00:13:12.701 "peer_address": { 00:13:12.701 "trtype": "RDMA", 00:13:12.701 "adrfam": "IPv4", 00:13:12.701 "traddr": "10.0.0.2", 00:13:12.701 "trsvcid": "42859" 00:13:12.701 }, 00:13:12.701 "auth": { 00:13:12.701 "state": "completed", 00:13:12.701 "digest": "sha256", 00:13:12.701 "dhgroup": "ffdhe6144" 00:13:12.702 } 00:13:12.702 } 00:13:12.702 ]' 00:13:12.702 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:12.961 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:13.220 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:13.220 15:18:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:13.787 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 
--dhchap-dhgroups ffdhe6144 00:13:13.787 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe6144 2 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:14.046 15:18:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:14.614 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 
== 0 ]] 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:14.614 { 00:13:14.614 "cntlid": 37, 00:13:14.614 "qid": 0, 00:13:14.614 "state": "enabled", 00:13:14.614 "thread": "nvmf_tgt_poll_group_000", 00:13:14.614 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:14.614 "listen_address": { 00:13:14.614 "trtype": "RDMA", 00:13:14.614 "adrfam": "IPv4", 00:13:14.614 "traddr": "10.0.0.2", 00:13:14.614 "trsvcid": "4420" 00:13:14.614 }, 00:13:14.614 "peer_address": { 00:13:14.614 "trtype": "RDMA", 00:13:14.614 "adrfam": "IPv4", 00:13:14.614 "traddr": "10.0.0.2", 00:13:14.614 "trsvcid": "53551" 00:13:14.614 }, 00:13:14.614 "auth": { 00:13:14.614 "state": "completed", 00:13:14.614 "digest": "sha256", 00:13:14.614 "dhgroup": "ffdhe6144" 00:13:14.614 } 00:13:14.614 } 00:13:14.614 ]' 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:14.614 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:14.873 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:14.873 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:14.873 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:14.873 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:14.873 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:14.873 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:15.132 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:15.132 15:18:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:15.699 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:15.699 15:18:17 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:15.699 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe6144 3 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:15.958 15:18:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:16.220 00:13:16.220 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:16.220 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:16.220 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:16.481 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:16.481 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:16.481 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:16.482 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@10 -- # set +x 00:13:16.482 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:16.482 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:16.482 { 00:13:16.482 "cntlid": 39, 00:13:16.482 "qid": 0, 00:13:16.482 "state": "enabled", 00:13:16.482 "thread": "nvmf_tgt_poll_group_000", 00:13:16.482 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:16.482 "listen_address": { 00:13:16.482 "trtype": "RDMA", 00:13:16.482 "adrfam": "IPv4", 00:13:16.482 "traddr": "10.0.0.2", 00:13:16.482 "trsvcid": "4420" 00:13:16.482 }, 00:13:16.482 "peer_address": { 00:13:16.482 "trtype": "RDMA", 00:13:16.482 "adrfam": "IPv4", 00:13:16.482 "traddr": "10.0.0.2", 00:13:16.482 "trsvcid": "47268" 00:13:16.482 }, 00:13:16.482 "auth": { 00:13:16.482 "state": "completed", 00:13:16.482 "digest": "sha256", 00:13:16.482 "dhgroup": "ffdhe6144" 00:13:16.482 } 00:13:16.482 } 00:13:16.482 ]' 00:13:16.482 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:16.482 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:16.482 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:16.740 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:16.740 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:16.740 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:16.741 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:16.741 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:16.999 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:16.999 15:18:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:17.567 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:17.567 15:18:19 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:17.567 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe8192 0 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:17.826 15:18:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:18.395 00:13:18.395 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:18.395 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:18.395 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd 
nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:18.654 { 00:13:18.654 "cntlid": 41, 00:13:18.654 "qid": 0, 00:13:18.654 "state": "enabled", 00:13:18.654 "thread": "nvmf_tgt_poll_group_000", 00:13:18.654 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:18.654 "listen_address": { 00:13:18.654 "trtype": "RDMA", 00:13:18.654 "adrfam": "IPv4", 00:13:18.654 "traddr": "10.0.0.2", 00:13:18.654 "trsvcid": "4420" 00:13:18.654 }, 00:13:18.654 "peer_address": { 00:13:18.654 "trtype": "RDMA", 00:13:18.654 "adrfam": "IPv4", 00:13:18.654 "traddr": "10.0.0.2", 00:13:18.654 "trsvcid": "59053" 00:13:18.654 }, 00:13:18.654 "auth": { 00:13:18.654 "state": "completed", 00:13:18.654 "digest": "sha256", 00:13:18.654 "dhgroup": "ffdhe8192" 00:13:18.654 } 00:13:18.654 } 00:13:18.654 ]' 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:18.654 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:18.913 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:18.913 15:18:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:19.480 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:19.738 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host 
nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe8192 1 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:19.738 15:18:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:20.306 00:13:20.306 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:20.306 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:20.306 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:20.566 { 00:13:20.566 "cntlid": 43, 00:13:20.566 "qid": 0, 00:13:20.566 "state": "enabled", 00:13:20.566 "thread": "nvmf_tgt_poll_group_000", 00:13:20.566 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:20.566 "listen_address": { 00:13:20.566 "trtype": "RDMA", 00:13:20.566 "adrfam": "IPv4", 00:13:20.566 "traddr": "10.0.0.2", 00:13:20.566 "trsvcid": "4420" 00:13:20.566 }, 00:13:20.566 "peer_address": { 00:13:20.566 "trtype": "RDMA", 00:13:20.566 "adrfam": "IPv4", 00:13:20.566 "traddr": "10.0.0.2", 00:13:20.566 "trsvcid": "33308" 00:13:20.566 }, 00:13:20.566 "auth": { 00:13:20.566 "state": "completed", 00:13:20.566 "digest": "sha256", 00:13:20.566 "dhgroup": "ffdhe8192" 00:13:20.566 } 00:13:20.566 } 00:13:20.566 ]' 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:13:20.566 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:20.825 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:20.825 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:20.825 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:20.825 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:20.825 15:18:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n 
nqn.2024-03.io.spdk:cnode0 00:13:21.762 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:21.762 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe8192 2 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:22.021 15:18:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:22.280 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc 
bdev_nvme_get_controllers 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:22.539 { 00:13:22.539 "cntlid": 45, 00:13:22.539 "qid": 0, 00:13:22.539 "state": "enabled", 00:13:22.539 "thread": "nvmf_tgt_poll_group_000", 00:13:22.539 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:22.539 "listen_address": { 00:13:22.539 "trtype": "RDMA", 00:13:22.539 "adrfam": "IPv4", 00:13:22.539 "traddr": "10.0.0.2", 00:13:22.539 "trsvcid": "4420" 00:13:22.539 }, 00:13:22.539 "peer_address": { 00:13:22.539 "trtype": "RDMA", 00:13:22.539 "adrfam": "IPv4", 00:13:22.539 "traddr": "10.0.0.2", 00:13:22.539 "trsvcid": "58950" 00:13:22.539 }, 00:13:22.539 "auth": { 00:13:22.539 "state": "completed", 00:13:22.539 "digest": "sha256", 00:13:22.539 "dhgroup": "ffdhe8192" 00:13:22.539 } 00:13:22.539 } 00:13:22.539 ]' 00:13:22.539 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:22.798 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:23.057 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:23.057 15:18:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret 
DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:23.625 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:23.625 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha256 ffdhe8192 3 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha256 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:23.885 15:18:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n 
nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:24.453 00:13:24.453 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:24.453 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:24.453 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:24.712 { 00:13:24.712 "cntlid": 47, 00:13:24.712 "qid": 0, 00:13:24.712 "state": "enabled", 00:13:24.712 "thread": "nvmf_tgt_poll_group_000", 00:13:24.712 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:24.712 "listen_address": { 00:13:24.712 "trtype": "RDMA", 00:13:24.712 "adrfam": "IPv4", 00:13:24.712 "traddr": "10.0.0.2", 00:13:24.712 "trsvcid": "4420" 00:13:24.712 }, 00:13:24.712 "peer_address": { 00:13:24.712 "trtype": "RDMA", 00:13:24.712 "adrfam": "IPv4", 00:13:24.712 "traddr": "10.0.0.2", 00:13:24.712 "trsvcid": "57423" 00:13:24.712 }, 00:13:24.712 "auth": { 00:13:24.712 "state": "completed", 00:13:24.712 "digest": "sha256", 00:13:24.712 "dhgroup": "ffdhe8192" 00:13:24.712 } 00:13:24.712 } 00:13:24.712 ]' 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha256 == \s\h\a\2\5\6 ]] 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:24.712 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:24.971 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:24.971 15:18:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 
00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:25.907 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@118 -- # for digest in "${digests[@]}" 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 null 0 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:25.907 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:25.908 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:25.908 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:25.908 15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:25.908 
15:18:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:26.166 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:26.429 { 00:13:26.429 "cntlid": 49, 00:13:26.429 "qid": 0, 00:13:26.429 "state": "enabled", 00:13:26.429 "thread": "nvmf_tgt_poll_group_000", 00:13:26.429 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:26.429 "listen_address": { 00:13:26.429 "trtype": "RDMA", 00:13:26.429 "adrfam": "IPv4", 00:13:26.429 "traddr": "10.0.0.2", 00:13:26.429 "trsvcid": "4420" 00:13:26.429 }, 00:13:26.429 "peer_address": { 00:13:26.429 "trtype": "RDMA", 00:13:26.429 "adrfam": "IPv4", 00:13:26.429 "traddr": "10.0.0.2", 00:13:26.429 "trsvcid": "57910" 00:13:26.429 }, 00:13:26.429 "auth": { 00:13:26.429 "state": "completed", 00:13:26.429 "digest": "sha384", 00:13:26.429 "dhgroup": "null" 00:13:26.429 } 00:13:26.429 } 00:13:26.429 ]' 00:13:26.429 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret 
DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:26.755 15:18:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:27.410 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:27.669 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:27.669 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 null 1 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:27.928 15:18:29 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:27.928 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:28.188 00:13:28.188 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:28.188 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:28.188 15:18:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:28.447 { 00:13:28.447 "cntlid": 51, 00:13:28.447 "qid": 0, 00:13:28.447 "state": "enabled", 00:13:28.447 "thread": "nvmf_tgt_poll_group_000", 00:13:28.447 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:28.447 "listen_address": { 00:13:28.447 "trtype": "RDMA", 00:13:28.447 "adrfam": "IPv4", 00:13:28.447 "traddr": "10.0.0.2", 00:13:28.447 "trsvcid": "4420" 00:13:28.447 }, 00:13:28.447 "peer_address": { 00:13:28.447 "trtype": "RDMA", 00:13:28.447 "adrfam": "IPv4", 00:13:28.447 "traddr": "10.0.0.2", 00:13:28.447 "trsvcid": "35901" 00:13:28.447 }, 00:13:28.447 "auth": { 00:13:28.447 "state": "completed", 00:13:28.447 "digest": "sha384", 00:13:28.447 "dhgroup": "null" 00:13:28.447 } 00:13:28.447 } 00:13:28.447 ]' 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:28.447 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:28.447 15:18:30 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:28.706 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:28.706 15:18:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:29.275 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:29.533 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:29.533 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:29.533 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 null 2 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:29.534 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:29.793 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:29.793 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:29.793 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:29.793 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:30.052 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:30.052 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:30.052 { 00:13:30.052 "cntlid": 53, 00:13:30.052 "qid": 0, 00:13:30.052 "state": "enabled", 00:13:30.052 "thread": "nvmf_tgt_poll_group_000", 00:13:30.052 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:30.052 "listen_address": { 00:13:30.052 "trtype": "RDMA", 00:13:30.052 "adrfam": "IPv4", 00:13:30.052 "traddr": "10.0.0.2", 00:13:30.052 "trsvcid": "4420" 00:13:30.052 }, 00:13:30.052 "peer_address": { 00:13:30.052 "trtype": "RDMA", 00:13:30.052 "adrfam": "IPv4", 00:13:30.052 "traddr": "10.0.0.2", 00:13:30.052 "trsvcid": "56090" 00:13:30.052 }, 00:13:30.052 "auth": { 00:13:30.052 "state": "completed", 00:13:30.052 "digest": "sha384", 00:13:30.052 "dhgroup": "null" 00:13:30.052 } 00:13:30.052 } 00:13:30.052 ]' 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ 
completed == \c\o\m\p\l\e\t\e\d ]] 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:30.311 15:18:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:30.570 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:30.570 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:31.138 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:31.138 15:18:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups null 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 null 3 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:31.396 15:18:33 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:31.396 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:31.655 00:13:31.655 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:31.655 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:31.655 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:31.914 { 00:13:31.914 "cntlid": 55, 00:13:31.914 "qid": 0, 00:13:31.914 "state": "enabled", 00:13:31.914 "thread": "nvmf_tgt_poll_group_000", 00:13:31.914 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:31.914 "listen_address": { 00:13:31.914 "trtype": "RDMA", 00:13:31.914 "adrfam": "IPv4", 00:13:31.914 "traddr": "10.0.0.2", 00:13:31.914 "trsvcid": "4420" 00:13:31.914 }, 00:13:31.914 "peer_address": { 00:13:31.914 "trtype": "RDMA", 00:13:31.914 "adrfam": "IPv4", 00:13:31.914 "traddr": "10.0.0.2", 00:13:31.914 "trsvcid": "45520" 00:13:31.914 }, 00:13:31.914 "auth": { 00:13:31.914 "state": "completed", 00:13:31.914 "digest": "sha384", 00:13:31.914 "dhgroup": "null" 00:13:31.914 } 00:13:31.914 } 00:13:31.914 ]' 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:13:31.914 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:32.172 
15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:32.172 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:32.172 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:32.172 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:32.172 15:18:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:32.739 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:32.997 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:32.997 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe2048 0 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:33.254 
15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:33.254 15:18:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:33.512 00:13:33.512 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:33.512 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:33.512 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:33.770 { 00:13:33.770 "cntlid": 57, 00:13:33.770 "qid": 0, 00:13:33.770 "state": "enabled", 00:13:33.770 "thread": "nvmf_tgt_poll_group_000", 00:13:33.770 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:33.770 "listen_address": { 00:13:33.770 "trtype": "RDMA", 00:13:33.770 "adrfam": "IPv4", 00:13:33.770 "traddr": "10.0.0.2", 00:13:33.770 "trsvcid": "4420" 00:13:33.770 }, 00:13:33.770 "peer_address": { 00:13:33.770 "trtype": "RDMA", 00:13:33.770 "adrfam": "IPv4", 00:13:33.770 "traddr": "10.0.0.2", 00:13:33.770 "trsvcid": "38775" 00:13:33.770 }, 00:13:33.770 "auth": { 00:13:33.770 "state": "completed", 00:13:33.770 "digest": "sha384", 00:13:33.770 "dhgroup": "ffdhe2048" 00:13:33.770 } 00:13:33.770 } 00:13:33.770 ]' 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:33.770 15:18:35 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:33.770 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:34.028 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:34.028 15:18:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:34.595 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:34.855 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe2048 1 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # 
ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:34.855 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:34.856 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:34.856 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:35.114 00:13:35.374 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:35.374 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:35.374 15:18:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:35.374 { 00:13:35.374 "cntlid": 59, 00:13:35.374 "qid": 0, 00:13:35.374 "state": "enabled", 00:13:35.374 "thread": "nvmf_tgt_poll_group_000", 00:13:35.374 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:35.374 "listen_address": { 00:13:35.374 "trtype": "RDMA", 00:13:35.374 "adrfam": "IPv4", 00:13:35.374 "traddr": "10.0.0.2", 00:13:35.374 "trsvcid": "4420" 00:13:35.374 }, 00:13:35.374 "peer_address": { 00:13:35.374 "trtype": "RDMA", 00:13:35.374 "adrfam": "IPv4", 00:13:35.374 "traddr": "10.0.0.2", 00:13:35.374 "trsvcid": "40107" 00:13:35.374 }, 00:13:35.374 "auth": { 00:13:35.374 "state": "completed", 00:13:35.374 "digest": "sha384", 00:13:35.374 "dhgroup": "ffdhe2048" 00:13:35.374 } 00:13:35.374 } 00:13:35.374 ]' 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:35.374 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:35.633 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:13:35.633 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:35.633 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:35.633 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:35.633 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:35.892 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:35.892 15:18:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:36.458 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:36.458 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:36.716 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe2048 2 00:13:36.716 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@67 -- # dhgroup=ffdhe2048 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:36.717 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:36.975 00:13:36.975 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:36.975 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:36.975 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:37.235 { 00:13:37.235 "cntlid": 61, 00:13:37.235 "qid": 0, 00:13:37.235 "state": "enabled", 00:13:37.235 "thread": "nvmf_tgt_poll_group_000", 00:13:37.235 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:37.235 "listen_address": { 00:13:37.235 "trtype": "RDMA", 00:13:37.235 "adrfam": "IPv4", 00:13:37.235 "traddr": "10.0.0.2", 00:13:37.235 "trsvcid": "4420" 00:13:37.235 }, 00:13:37.235 "peer_address": { 00:13:37.235 "trtype": "RDMA", 00:13:37.235 "adrfam": "IPv4", 00:13:37.235 "traddr": "10.0.0.2", 00:13:37.235 "trsvcid": "36591" 00:13:37.235 }, 
00:13:37.235 "auth": { 00:13:37.235 "state": "completed", 00:13:37.235 "digest": "sha384", 00:13:37.235 "dhgroup": "ffdhe2048" 00:13:37.235 } 00:13:37.235 } 00:13:37.235 ]' 00:13:37.235 15:18:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:37.235 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:37.235 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:37.235 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:13:37.235 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:37.494 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:37.494 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:37.494 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:37.494 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:37.494 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:38.432 15:18:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:38.432 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:38.432 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe2048 3 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:38.690 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:38.947 00:13:38.947 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:38.947 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:38.947 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:39.206 { 00:13:39.206 "cntlid": 63, 00:13:39.206 "qid": 0, 00:13:39.206 "state": "enabled", 00:13:39.206 "thread": "nvmf_tgt_poll_group_000", 00:13:39.206 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:39.206 "listen_address": { 00:13:39.206 "trtype": "RDMA", 00:13:39.206 "adrfam": "IPv4", 00:13:39.206 "traddr": "10.0.0.2", 00:13:39.206 "trsvcid": "4420" 00:13:39.206 }, 00:13:39.206 "peer_address": { 00:13:39.206 
"trtype": "RDMA", 00:13:39.206 "adrfam": "IPv4", 00:13:39.206 "traddr": "10.0.0.2", 00:13:39.206 "trsvcid": "36952" 00:13:39.206 }, 00:13:39.206 "auth": { 00:13:39.206 "state": "completed", 00:13:39.206 "digest": "sha384", 00:13:39.206 "dhgroup": "ffdhe2048" 00:13:39.206 } 00:13:39.206 } 00:13:39.206 ]' 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:39.206 15:18:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:39.465 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:39.465 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:40.033 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:40.293 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:40.293 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:40.294 15:18:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@123 -- # connect_authenticate sha384 ffdhe3072 0 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:40.294 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:40.553 00:13:40.553 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:40.812 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:40.812 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:40.812 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:40.812 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:40.813 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:40.813 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:40.813 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:40.813 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:40.813 { 00:13:40.813 "cntlid": 65, 00:13:40.813 "qid": 0, 00:13:40.813 "state": "enabled", 00:13:40.813 "thread": "nvmf_tgt_poll_group_000", 00:13:40.813 "hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:40.813 "listen_address": { 00:13:40.813 "trtype": "RDMA", 00:13:40.813 "adrfam": "IPv4", 00:13:40.813 "traddr": "10.0.0.2", 00:13:40.813 "trsvcid": "4420" 00:13:40.813 }, 00:13:40.813 "peer_address": { 00:13:40.813 "trtype": "RDMA", 00:13:40.813 "adrfam": "IPv4", 00:13:40.813 "traddr": "10.0.0.2", 00:13:40.813 "trsvcid": "37433" 00:13:40.813 }, 00:13:40.813 "auth": { 00:13:40.813 "state": "completed", 00:13:40.813 "digest": "sha384", 00:13:40.813 "dhgroup": "ffdhe3072" 00:13:40.813 } 00:13:40.813 } 00:13:40.813 ]' 00:13:40.813 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:41.072 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:41.331 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:41.331 15:18:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:41.900 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options 
--dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:41.900 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe3072 1 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:42.159 15:18:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:42.418 00:13:42.418 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:42.418 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:42.418 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:42.678 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:42.678 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:42.678 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:42.678 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:42.678 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:42.678 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:42.678 { 00:13:42.678 "cntlid": 67, 00:13:42.678 "qid": 0, 00:13:42.679 "state": "enabled", 00:13:42.679 "thread": "nvmf_tgt_poll_group_000", 00:13:42.679 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:42.679 "listen_address": { 00:13:42.679 "trtype": "RDMA", 00:13:42.679 "adrfam": "IPv4", 00:13:42.679 "traddr": "10.0.0.2", 00:13:42.679 "trsvcid": "4420" 00:13:42.679 }, 00:13:42.679 "peer_address": { 00:13:42.679 "trtype": "RDMA", 00:13:42.679 "adrfam": "IPv4", 00:13:42.679 "traddr": "10.0.0.2", 00:13:42.679 "trsvcid": "42510" 00:13:42.679 }, 00:13:42.679 "auth": { 00:13:42.679 "state": "completed", 00:13:42.679 "digest": "sha384", 00:13:42.679 "dhgroup": "ffdhe3072" 00:13:42.679 } 00:13:42.679 } 00:13:42.679 ]' 00:13:42.679 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:42.679 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:42.679 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:42.679 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:13:42.679 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:42.937 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:42.937 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:42.937 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:42.937 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:43.197 15:18:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:43.764 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:43.764 
15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:43.764 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe3072 2 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:44.022 15:18:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:44.280 00:13:44.280 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:44.280 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:44.280 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:44.538 { 00:13:44.538 "cntlid": 69, 00:13:44.538 "qid": 0, 00:13:44.538 "state": "enabled", 00:13:44.538 "thread": "nvmf_tgt_poll_group_000", 00:13:44.538 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:44.538 "listen_address": { 00:13:44.538 "trtype": "RDMA", 00:13:44.538 "adrfam": "IPv4", 00:13:44.538 "traddr": "10.0.0.2", 00:13:44.538 "trsvcid": "4420" 00:13:44.538 }, 00:13:44.538 "peer_address": { 00:13:44.538 "trtype": "RDMA", 00:13:44.538 "adrfam": "IPv4", 00:13:44.538 "traddr": "10.0.0.2", 00:13:44.538 "trsvcid": "47201" 00:13:44.538 }, 00:13:44.538 "auth": { 00:13:44.538 "state": "completed", 00:13:44.538 "digest": "sha384", 00:13:44.538 "dhgroup": "ffdhe3072" 00:13:44.538 } 00:13:44.538 } 00:13:44.538 ]' 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:44.538 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:44.539 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:44.539 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:44.798 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:44.798 15:18:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:45.365 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:45.625 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 
00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:45.625 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe3072 3 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:45.882 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:46.141 00:13:46.141 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:46.141 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:46.141 15:18:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:46.400 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:46.400 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # 
rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:46.400 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:46.400 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:46.400 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:46.400 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:46.400 { 00:13:46.400 "cntlid": 71, 00:13:46.400 "qid": 0, 00:13:46.401 "state": "enabled", 00:13:46.401 "thread": "nvmf_tgt_poll_group_000", 00:13:46.401 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:46.401 "listen_address": { 00:13:46.401 "trtype": "RDMA", 00:13:46.401 "adrfam": "IPv4", 00:13:46.401 "traddr": "10.0.0.2", 00:13:46.401 "trsvcid": "4420" 00:13:46.401 }, 00:13:46.401 "peer_address": { 00:13:46.401 "trtype": "RDMA", 00:13:46.401 "adrfam": "IPv4", 00:13:46.401 "traddr": "10.0.0.2", 00:13:46.401 "trsvcid": "54999" 00:13:46.401 }, 00:13:46.401 "auth": { 00:13:46.401 "state": "completed", 00:13:46.401 "digest": "sha384", 00:13:46.401 "dhgroup": "ffdhe3072" 00:13:46.401 } 00:13:46.401 } 00:13:46.401 ]' 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:46.401 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:46.660 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:46.660 15:18:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:47.226 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:47.485 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 
00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:47.485 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe4096 0 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:47.744 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:48.003 00:13:48.003 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:48.003 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:48.003 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r 
'.[].name' 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:48.265 { 00:13:48.265 "cntlid": 73, 00:13:48.265 "qid": 0, 00:13:48.265 "state": "enabled", 00:13:48.265 "thread": "nvmf_tgt_poll_group_000", 00:13:48.265 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:48.265 "listen_address": { 00:13:48.265 "trtype": "RDMA", 00:13:48.265 "adrfam": "IPv4", 00:13:48.265 "traddr": "10.0.0.2", 00:13:48.265 "trsvcid": "4420" 00:13:48.265 }, 00:13:48.265 "peer_address": { 00:13:48.265 "trtype": "RDMA", 00:13:48.265 "adrfam": "IPv4", 00:13:48.265 "traddr": "10.0.0.2", 00:13:48.265 "trsvcid": "48617" 00:13:48.265 }, 00:13:48.265 "auth": { 00:13:48.265 "state": "completed", 00:13:48.265 "digest": "sha384", 00:13:48.265 "dhgroup": "ffdhe4096" 00:13:48.265 } 00:13:48.265 } 00:13:48.265 ]' 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:48.265 15:18:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:48.265 15:18:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:48.265 15:18:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:48.265 15:18:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:48.524 15:18:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:48.524 15:18:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:49.091 15:18:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n 
nqn.2024-03.io.spdk:cnode0 00:13:49.350 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:49.350 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe4096 1 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:49.607 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:49.865 00:13:49.865 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc 
bdev_nvme_get_controllers 00:13:49.865 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:49.865 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:50.153 { 00:13:50.153 "cntlid": 75, 00:13:50.153 "qid": 0, 00:13:50.153 "state": "enabled", 00:13:50.153 "thread": "nvmf_tgt_poll_group_000", 00:13:50.153 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:50.153 "listen_address": { 00:13:50.153 "trtype": "RDMA", 00:13:50.153 "adrfam": "IPv4", 00:13:50.153 "traddr": "10.0.0.2", 00:13:50.153 "trsvcid": "4420" 00:13:50.153 }, 00:13:50.153 "peer_address": { 00:13:50.153 "trtype": "RDMA", 00:13:50.153 "adrfam": "IPv4", 00:13:50.153 "traddr": "10.0.0.2", 00:13:50.153 "trsvcid": "36284" 00:13:50.153 }, 00:13:50.153 "auth": { 00:13:50.153 "state": "completed", 00:13:50.153 "digest": "sha384", 00:13:50.153 "dhgroup": "ffdhe4096" 00:13:50.153 } 00:13:50.153 } 00:13:50.153 ]' 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:50.153 15:18:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:50.411 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:50.411 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret 
DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:50.978 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:51.236 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:51.236 15:18:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe4096 2 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:51.494 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 
-q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:51.753 00:13:51.753 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:51.753 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:51.753 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:52.012 { 00:13:52.012 "cntlid": 77, 00:13:52.012 "qid": 0, 00:13:52.012 "state": "enabled", 00:13:52.012 "thread": "nvmf_tgt_poll_group_000", 00:13:52.012 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:52.012 "listen_address": { 00:13:52.012 "trtype": "RDMA", 00:13:52.012 "adrfam": "IPv4", 00:13:52.012 "traddr": "10.0.0.2", 00:13:52.012 "trsvcid": "4420" 00:13:52.012 }, 00:13:52.012 "peer_address": { 00:13:52.012 "trtype": "RDMA", 00:13:52.012 "adrfam": "IPv4", 00:13:52.012 "traddr": "10.0.0.2", 00:13:52.012 "trsvcid": "46783" 00:13:52.012 }, 00:13:52.012 "auth": { 00:13:52.012 "state": "completed", 00:13:52.012 "digest": "sha384", 00:13:52.012 "dhgroup": "ffdhe4096" 00:13:52.012 } 00:13:52.012 } 00:13:52.012 ]' 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:52.012 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:52.271 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:52.271 15:18:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:52.840 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:53.100 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe4096 3 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:53.100 15:18:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:13:53.359 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:53.618 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:53.877 { 00:13:53.877 "cntlid": 79, 00:13:53.877 "qid": 0, 00:13:53.877 "state": "enabled", 00:13:53.877 "thread": "nvmf_tgt_poll_group_000", 00:13:53.877 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:53.877 "listen_address": { 00:13:53.877 "trtype": "RDMA", 00:13:53.877 "adrfam": "IPv4", 00:13:53.877 "traddr": "10.0.0.2", 00:13:53.877 "trsvcid": "4420" 00:13:53.877 }, 00:13:53.877 "peer_address": { 00:13:53.877 "trtype": "RDMA", 00:13:53.877 "adrfam": "IPv4", 00:13:53.877 "traddr": "10.0.0.2", 00:13:53.877 "trsvcid": "51947" 00:13:53.877 }, 00:13:53.877 "auth": { 00:13:53.877 "state": "completed", 00:13:53.877 "digest": "sha384", 00:13:53.877 "dhgroup": "ffdhe4096" 00:13:53.877 } 00:13:53.877 } 00:13:53.877 ]' 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:53.877 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:54.143 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 
00:13:54.143 15:18:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:54.714 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:13:54.714 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:13:55.094 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe6144 0 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:55.095 15:18:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:13:55.353 00:13:55.353 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:55.353 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:55.353 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:55.611 { 00:13:55.611 "cntlid": 81, 00:13:55.611 "qid": 0, 00:13:55.611 "state": "enabled", 00:13:55.611 "thread": "nvmf_tgt_poll_group_000", 00:13:55.611 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:55.611 "listen_address": { 00:13:55.611 "trtype": "RDMA", 00:13:55.611 "adrfam": "IPv4", 00:13:55.611 "traddr": "10.0.0.2", 00:13:55.611 "trsvcid": "4420" 00:13:55.611 }, 00:13:55.611 "peer_address": { 00:13:55.611 "trtype": "RDMA", 00:13:55.611 "adrfam": "IPv4", 00:13:55.611 "traddr": "10.0.0.2", 00:13:55.611 "trsvcid": "53004" 00:13:55.611 }, 00:13:55.611 "auth": { 00:13:55.611 "state": "completed", 00:13:55.611 "digest": "sha384", 00:13:55.611 "dhgroup": "ffdhe6144" 00:13:55.611 } 00:13:55.611 } 00:13:55.611 ]' 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:55.611 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:55.870 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:55.870 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:55.870 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock 
bdev_nvme_detach_controller nvme0 00:13:55.870 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:55.870 15:18:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:56.806 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe6144 1 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 
00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:56.806 15:18:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:13:57.373 00:13:57.373 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:57.373 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:57.373 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:57.632 { 00:13:57.632 "cntlid": 83, 00:13:57.632 "qid": 0, 00:13:57.632 "state": "enabled", 00:13:57.632 "thread": "nvmf_tgt_poll_group_000", 00:13:57.632 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:57.632 "listen_address": { 00:13:57.632 "trtype": "RDMA", 00:13:57.632 "adrfam": "IPv4", 00:13:57.632 "traddr": "10.0.0.2", 00:13:57.632 "trsvcid": "4420" 00:13:57.632 }, 00:13:57.632 "peer_address": { 00:13:57.632 "trtype": "RDMA", 00:13:57.632 "adrfam": "IPv4", 00:13:57.632 "traddr": "10.0.0.2", 00:13:57.632 "trsvcid": "44510" 00:13:57.632 }, 00:13:57.632 "auth": { 00:13:57.632 "state": "completed", 00:13:57.632 "digest": "sha384", 00:13:57.632 "dhgroup": "ffdhe6144" 00:13:57.632 } 00:13:57.632 } 00:13:57.632 ]' 00:13:57.632 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d 
]] 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:57.633 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:57.891 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:57.891 15:18:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:13:58.458 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:13:58.717 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe6144 2 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:13:58.717 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:13:58.718 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:58.718 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:58.718 
15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:58.718 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:58.718 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:58.718 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:58.718 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:13:59.284 00:13:59.284 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:13:59.284 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:13:59.284 15:19:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:13:59.284 { 00:13:59.284 "cntlid": 85, 00:13:59.284 "qid": 0, 00:13:59.284 "state": "enabled", 00:13:59.284 "thread": "nvmf_tgt_poll_group_000", 00:13:59.284 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:13:59.284 "listen_address": { 00:13:59.284 "trtype": "RDMA", 00:13:59.284 "adrfam": "IPv4", 00:13:59.284 "traddr": "10.0.0.2", 00:13:59.284 "trsvcid": "4420" 00:13:59.284 }, 00:13:59.284 "peer_address": { 00:13:59.284 "trtype": "RDMA", 00:13:59.284 "adrfam": "IPv4", 00:13:59.284 "traddr": "10.0.0.2", 00:13:59.284 "trsvcid": "52293" 00:13:59.284 }, 00:13:59.284 "auth": { 00:13:59.284 "state": "completed", 00:13:59.284 "digest": "sha384", 00:13:59.284 "dhgroup": "ffdhe6144" 00:13:59.284 } 00:13:59.284 } 00:13:59.284 ]' 00:13:59.284 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:13:59.544 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:13:59.544 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:13:59.544 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:13:59.544 15:19:01 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:13:59.544 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:13:59.544 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:13:59.544 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:13:59.802 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:13:59.803 15:19:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:00.370 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:00.630 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe6144 3 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:00.630 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:01.198 00:14:01.198 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:01.198 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:01.198 15:19:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:01.198 { 00:14:01.198 "cntlid": 87, 00:14:01.198 "qid": 0, 00:14:01.198 "state": "enabled", 00:14:01.198 "thread": "nvmf_tgt_poll_group_000", 00:14:01.198 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:01.198 "listen_address": { 00:14:01.198 "trtype": "RDMA", 00:14:01.198 "adrfam": "IPv4", 00:14:01.198 "traddr": "10.0.0.2", 00:14:01.198 "trsvcid": "4420" 00:14:01.198 }, 00:14:01.198 "peer_address": { 00:14:01.198 "trtype": "RDMA", 00:14:01.198 "adrfam": "IPv4", 00:14:01.198 "traddr": "10.0.0.2", 00:14:01.198 "trsvcid": "55346" 00:14:01.198 }, 00:14:01.198 "auth": { 00:14:01.198 "state": "completed", 00:14:01.198 "digest": "sha384", 00:14:01.198 "dhgroup": "ffdhe6144" 00:14:01.198 } 00:14:01.198 } 00:14:01.198 ]' 00:14:01.198 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:01.458 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:14:01.458 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:01.458 
15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:14:01.458 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:01.458 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:01.458 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:01.458 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:01.717 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:01.717 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:02.285 15:19:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:02.285 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:02.285 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:02.285 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:02.285 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe8192 0 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:02.544 15:19:04 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:02.544 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:03.111 00:14:03.111 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:03.111 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:03.111 15:19:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:03.370 { 00:14:03.370 "cntlid": 89, 00:14:03.370 "qid": 0, 00:14:03.370 "state": "enabled", 00:14:03.370 "thread": "nvmf_tgt_poll_group_000", 00:14:03.370 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:03.370 "listen_address": { 00:14:03.370 "trtype": "RDMA", 00:14:03.370 "adrfam": "IPv4", 00:14:03.370 "traddr": "10.0.0.2", 00:14:03.370 "trsvcid": "4420" 00:14:03.370 }, 00:14:03.370 "peer_address": { 00:14:03.370 "trtype": "RDMA", 00:14:03.370 "adrfam": "IPv4", 00:14:03.370 "traddr": "10.0.0.2", 00:14:03.370 "trsvcid": "47742" 00:14:03.370 }, 00:14:03.370 "auth": { 00:14:03.370 "state": "completed", 00:14:03.370 "digest": "sha384", 00:14:03.370 "dhgroup": "ffdhe8192" 00:14:03.370 } 00:14:03.370 } 00:14:03.370 ]' 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:03.370 15:19:05 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:03.370 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:03.629 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:03.629 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:03.629 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:03.629 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:03.629 15:19:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:04.567 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe8192 1 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target 
-- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:04.567 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:05.134 00:14:05.134 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:05.134 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:05.134 15:19:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:05.392 { 00:14:05.392 "cntlid": 91, 00:14:05.392 "qid": 0, 00:14:05.392 "state": "enabled", 00:14:05.392 "thread": "nvmf_tgt_poll_group_000", 00:14:05.392 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:05.392 "listen_address": { 00:14:05.392 "trtype": "RDMA", 00:14:05.392 "adrfam": "IPv4", 00:14:05.392 "traddr": "10.0.0.2", 00:14:05.392 "trsvcid": "4420" 00:14:05.392 }, 00:14:05.392 "peer_address": { 00:14:05.392 "trtype": "RDMA", 00:14:05.392 "adrfam": "IPv4", 00:14:05.392 "traddr": "10.0.0.2", 00:14:05.392 "trsvcid": "44249" 00:14:05.392 }, 
00:14:05.392 "auth": { 00:14:05.392 "state": "completed", 00:14:05.392 "digest": "sha384", 00:14:05.392 "dhgroup": "ffdhe8192" 00:14:05.392 } 00:14:05.392 } 00:14:05.392 ]' 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:05.392 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:05.651 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:05.651 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:05.651 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:05.651 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:05.651 15:19:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:06.586 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe8192 2 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:06.586 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:06.845 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:06.845 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:06.845 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:06.845 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:07.103 00:14:07.103 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:07.103 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:07.103 15:19:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:07.361 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:07.361 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:07.361 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:07.361 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:07.361 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:07.361 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:07.361 { 00:14:07.361 "cntlid": 93, 00:14:07.361 "qid": 0, 00:14:07.361 "state": "enabled", 00:14:07.361 "thread": "nvmf_tgt_poll_group_000", 00:14:07.361 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:07.361 "listen_address": { 00:14:07.361 "trtype": "RDMA", 00:14:07.361 "adrfam": "IPv4", 00:14:07.361 "traddr": 
"10.0.0.2", 00:14:07.361 "trsvcid": "4420" 00:14:07.361 }, 00:14:07.361 "peer_address": { 00:14:07.361 "trtype": "RDMA", 00:14:07.361 "adrfam": "IPv4", 00:14:07.361 "traddr": "10.0.0.2", 00:14:07.361 "trsvcid": "35099" 00:14:07.361 }, 00:14:07.361 "auth": { 00:14:07.361 "state": "completed", 00:14:07.361 "digest": "sha384", 00:14:07.361 "dhgroup": "ffdhe8192" 00:14:07.361 } 00:14:07.361 } 00:14:07.361 ]' 00:14:07.362 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:07.620 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:07.878 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:07.878 15:19:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:08.446 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:08.446 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:08.446 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:08.446 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:08.446 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:08.446 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:08.447 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:08.447 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:14:08.447 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha384 
--dhchap-dhgroups ffdhe8192 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha384 ffdhe8192 3 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha384 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:08.705 15:19:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:09.273 00:14:09.273 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:09.273 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:09.273 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:09.532 { 00:14:09.532 "cntlid": 95, 00:14:09.532 "qid": 0, 00:14:09.532 "state": "enabled", 00:14:09.532 "thread": "nvmf_tgt_poll_group_000", 00:14:09.532 "hostnqn": 
"nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:09.532 "listen_address": { 00:14:09.532 "trtype": "RDMA", 00:14:09.532 "adrfam": "IPv4", 00:14:09.532 "traddr": "10.0.0.2", 00:14:09.532 "trsvcid": "4420" 00:14:09.532 }, 00:14:09.532 "peer_address": { 00:14:09.532 "trtype": "RDMA", 00:14:09.532 "adrfam": "IPv4", 00:14:09.532 "traddr": "10.0.0.2", 00:14:09.532 "trsvcid": "59499" 00:14:09.532 }, 00:14:09.532 "auth": { 00:14:09.532 "state": "completed", 00:14:09.532 "digest": "sha384", 00:14:09.532 "dhgroup": "ffdhe8192" 00:14:09.532 } 00:14:09.532 } 00:14:09.532 ]' 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha384 == \s\h\a\3\8\4 ]] 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:09.532 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:09.791 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:09.791 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:09.791 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:09.791 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:09.791 15:19:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:10.727 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:10.727 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:10.727 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:10.727 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:10.727 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@118 -- # for digest in "${digests[@]}" 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 null 0 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:10.728 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:10.986 00:14:11.245 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:11.245 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:11.245 15:19:12 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:11.245 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:11.245 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:11.245 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:11.245 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:11.245 15:19:13 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:11.245 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:11.245 { 00:14:11.245 "cntlid": 97, 00:14:11.245 "qid": 0, 00:14:11.245 "state": "enabled", 00:14:11.245 "thread": "nvmf_tgt_poll_group_000", 00:14:11.245 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:11.245 "listen_address": { 00:14:11.245 "trtype": "RDMA", 00:14:11.245 "adrfam": "IPv4", 00:14:11.245 "traddr": "10.0.0.2", 00:14:11.245 "trsvcid": "4420" 00:14:11.245 }, 00:14:11.245 "peer_address": { 00:14:11.245 "trtype": "RDMA", 00:14:11.245 "adrfam": "IPv4", 00:14:11.245 "traddr": "10.0.0.2", 00:14:11.245 "trsvcid": "50031" 00:14:11.245 }, 00:14:11.245 "auth": { 00:14:11.245 "state": "completed", 00:14:11.245 "digest": "sha512", 00:14:11.245 "dhgroup": "null" 00:14:11.245 } 00:14:11.245 } 00:14:11.245 ]' 00:14:11.245 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:11.504 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:11.763 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:11.763 15:19:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:12.331 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:12.331 
15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:12.331 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 null 1 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:12.591 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:12.850 00:14:12.850 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:12.850 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:12.850 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs 
nqn.2024-03.io.spdk:cnode0 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:13.109 { 00:14:13.109 "cntlid": 99, 00:14:13.109 "qid": 0, 00:14:13.109 "state": "enabled", 00:14:13.109 "thread": "nvmf_tgt_poll_group_000", 00:14:13.109 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:13.109 "listen_address": { 00:14:13.109 "trtype": "RDMA", 00:14:13.109 "adrfam": "IPv4", 00:14:13.109 "traddr": "10.0.0.2", 00:14:13.109 "trsvcid": "4420" 00:14:13.109 }, 00:14:13.109 "peer_address": { 00:14:13.109 "trtype": "RDMA", 00:14:13.109 "adrfam": "IPv4", 00:14:13.109 "traddr": "10.0.0.2", 00:14:13.109 "trsvcid": "34727" 00:14:13.109 }, 00:14:13.109 "auth": { 00:14:13.109 "state": "completed", 00:14:13.109 "digest": "sha512", 00:14:13.109 "dhgroup": "null" 00:14:13.109 } 00:14:13.109 } 00:14:13.109 ]' 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:13.109 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:13.368 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:14:13.368 15:19:14 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:13.368 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:13.368 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:13.368 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:13.625 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:13.625 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:14.190 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:14.190 15:19:15 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:14.190 15:19:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 null 2 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:14.448 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:14.706 00:14:14.706 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:14.707 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:14.707 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:14.965 15:19:16 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:14.965 { 00:14:14.965 "cntlid": 101, 00:14:14.965 "qid": 0, 00:14:14.965 "state": "enabled", 00:14:14.965 "thread": "nvmf_tgt_poll_group_000", 00:14:14.965 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:14.965 "listen_address": { 00:14:14.965 "trtype": "RDMA", 00:14:14.965 "adrfam": "IPv4", 00:14:14.965 "traddr": "10.0.0.2", 00:14:14.965 "trsvcid": "4420" 00:14:14.965 }, 00:14:14.965 "peer_address": { 00:14:14.965 "trtype": "RDMA", 00:14:14.965 "adrfam": "IPv4", 00:14:14.965 "traddr": "10.0.0.2", 00:14:14.965 "trsvcid": "38498" 00:14:14.965 }, 00:14:14.965 "auth": { 00:14:14.965 "state": "completed", 00:14:14.965 "digest": "sha512", 00:14:14.965 "dhgroup": "null" 00:14:14.965 } 00:14:14.965 } 00:14:14.965 ]' 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:14.965 15:19:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:15.224 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:15.224 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:15.790 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:16.049 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:16.049 15:19:17 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:16.049 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:16.049 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:16.049 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:16.049 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:16.049 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:16.049 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups null 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 null 3 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=null 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:16.309 15:19:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:16.568 00:14:16.568 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:16.568 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:16.568 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:16.828 { 00:14:16.828 "cntlid": 103, 00:14:16.828 "qid": 0, 00:14:16.828 "state": "enabled", 00:14:16.828 "thread": "nvmf_tgt_poll_group_000", 00:14:16.828 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:16.828 "listen_address": { 00:14:16.828 "trtype": "RDMA", 00:14:16.828 "adrfam": "IPv4", 00:14:16.828 "traddr": "10.0.0.2", 00:14:16.828 "trsvcid": "4420" 00:14:16.828 }, 00:14:16.828 "peer_address": { 00:14:16.828 "trtype": "RDMA", 00:14:16.828 "adrfam": "IPv4", 00:14:16.828 "traddr": "10.0.0.2", 00:14:16.828 "trsvcid": "50349" 00:14:16.828 }, 00:14:16.828 "auth": { 00:14:16.828 "state": "completed", 00:14:16.828 "digest": "sha512", 00:14:16.828 "dhgroup": "null" 00:14:16.828 } 00:14:16.828 } 00:14:16.828 ]' 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ null == \n\u\l\l ]] 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:16.828 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:17.087 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:17.087 15:19:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:17.655 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:17.914 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:17.914 15:19:19 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:17.914 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:17.914 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:17.914 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:17.914 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:17.915 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:17.915 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:17.915 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe2048 0 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:18.174 15:19:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:18.433 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r 
'.[].name' 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:18.433 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:18.692 { 00:14:18.692 "cntlid": 105, 00:14:18.692 "qid": 0, 00:14:18.692 "state": "enabled", 00:14:18.692 "thread": "nvmf_tgt_poll_group_000", 00:14:18.692 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:18.692 "listen_address": { 00:14:18.692 "trtype": "RDMA", 00:14:18.692 "adrfam": "IPv4", 00:14:18.692 "traddr": "10.0.0.2", 00:14:18.692 "trsvcid": "4420" 00:14:18.692 }, 00:14:18.692 "peer_address": { 00:14:18.692 "trtype": "RDMA", 00:14:18.692 "adrfam": "IPv4", 00:14:18.692 "traddr": "10.0.0.2", 00:14:18.692 "trsvcid": "33751" 00:14:18.692 }, 00:14:18.692 "auth": { 00:14:18.692 "state": "completed", 00:14:18.692 "digest": "sha512", 00:14:18.692 "dhgroup": "ffdhe2048" 00:14:18.692 } 00:14:18.692 } 00:14:18.692 ]' 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:18.692 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:18.951 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:18.951 15:19:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret 
DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:19.519 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:19.519 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:19.519 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:19.519 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:19.519 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe2048 1 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:19.778 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock 
bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:20.037 00:14:20.037 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:20.037 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:20.037 15:19:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:20.297 { 00:14:20.297 "cntlid": 107, 00:14:20.297 "qid": 0, 00:14:20.297 "state": "enabled", 00:14:20.297 "thread": "nvmf_tgt_poll_group_000", 00:14:20.297 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:20.297 "listen_address": { 00:14:20.297 "trtype": "RDMA", 00:14:20.297 "adrfam": "IPv4", 00:14:20.297 "traddr": "10.0.0.2", 00:14:20.297 "trsvcid": "4420" 00:14:20.297 }, 00:14:20.297 "peer_address": { 00:14:20.297 "trtype": "RDMA", 00:14:20.297 "adrfam": "IPv4", 00:14:20.297 "traddr": "10.0.0.2", 00:14:20.297 "trsvcid": "54445" 00:14:20.297 }, 00:14:20.297 "auth": { 00:14:20.297 "state": "completed", 00:14:20.297 "digest": "sha512", 00:14:20.297 "dhgroup": "ffdhe2048" 00:14:20.297 } 00:14:20.297 } 00:14:20.297 ]' 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:14:20.297 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:20.556 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:20.556 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:20.556 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:20.815 15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:20.815 
15:19:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:21.382 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:21.382 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:21.640 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe2048 2 00:14:21.640 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:21.640 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b 
nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:21.641 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:21.899 00:14:21.899 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:21.899 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:21.899 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:22.158 { 00:14:22.158 "cntlid": 109, 00:14:22.158 "qid": 0, 00:14:22.158 "state": "enabled", 00:14:22.158 "thread": "nvmf_tgt_poll_group_000", 00:14:22.158 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:22.158 "listen_address": { 00:14:22.158 "trtype": "RDMA", 00:14:22.158 "adrfam": "IPv4", 00:14:22.158 "traddr": "10.0.0.2", 00:14:22.158 "trsvcid": "4420" 00:14:22.158 }, 00:14:22.158 "peer_address": { 00:14:22.158 "trtype": "RDMA", 00:14:22.158 "adrfam": "IPv4", 00:14:22.158 "traddr": "10.0.0.2", 00:14:22.158 "trsvcid": "52113" 00:14:22.158 }, 00:14:22.158 "auth": { 00:14:22.158 "state": "completed", 00:14:22.158 "digest": "sha512", 00:14:22.158 "dhgroup": "ffdhe2048" 00:14:22.158 } 00:14:22.158 } 00:14:22.158 ]' 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:22.158 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:14:22.159 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:22.159 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:22.159 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:22.159 15:19:23 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:22.416 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:22.416 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:22.983 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:23.245 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:23.246 15:19:24 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe2048 3 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe2048 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # 
hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:23.669 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:23.669 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:23.929 { 00:14:23.929 "cntlid": 111, 00:14:23.929 "qid": 0, 00:14:23.929 "state": "enabled", 00:14:23.929 "thread": "nvmf_tgt_poll_group_000", 00:14:23.929 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:23.929 "listen_address": { 00:14:23.929 "trtype": "RDMA", 00:14:23.929 "adrfam": "IPv4", 00:14:23.929 "traddr": "10.0.0.2", 00:14:23.929 "trsvcid": "4420" 00:14:23.929 }, 00:14:23.929 "peer_address": { 00:14:23.929 "trtype": "RDMA", 00:14:23.929 "adrfam": "IPv4", 00:14:23.929 "traddr": "10.0.0.2", 00:14:23.929 "trsvcid": "42455" 00:14:23.929 }, 00:14:23.929 "auth": { 00:14:23.929 "state": "completed", 00:14:23.929 "digest": "sha512", 00:14:23.929 "dhgroup": "ffdhe2048" 00:14:23.929 } 00:14:23.929 } 00:14:23.929 ]' 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe2048 == \f\f\d\h\e\2\0\4\8 ]] 00:14:23.929 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:24.188 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:24.188 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:24.188 15:19:25 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:24.188 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:24.188 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:25.124 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:25.124 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:25.125 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe3072 0 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:25.383 15:19:26 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:25.642 00:14:25.642 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:25.642 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:25.642 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:25.642 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:25.901 { 00:14:25.901 "cntlid": 113, 00:14:25.901 "qid": 0, 00:14:25.901 "state": "enabled", 00:14:25.901 "thread": "nvmf_tgt_poll_group_000", 00:14:25.901 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:25.901 "listen_address": { 00:14:25.901 "trtype": "RDMA", 00:14:25.901 "adrfam": "IPv4", 00:14:25.901 "traddr": "10.0.0.2", 00:14:25.901 "trsvcid": "4420" 00:14:25.901 }, 00:14:25.901 "peer_address": { 00:14:25.901 "trtype": "RDMA", 00:14:25.901 "adrfam": "IPv4", 00:14:25.901 "traddr": "10.0.0.2", 00:14:25.901 "trsvcid": "36977" 00:14:25.901 }, 00:14:25.901 "auth": { 00:14:25.901 "state": "completed", 00:14:25.901 "digest": "sha512", 00:14:25.901 "dhgroup": "ffdhe3072" 00:14:25.901 } 00:14:25.901 } 00:14:25.901 ]' 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target 
-- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:25.901 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:26.159 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:26.159 15:19:27 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:26.727 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:26.986 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:26.986 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe3072 1 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # 
xtrace_disable 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:27.258 15:19:28 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:27.258 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:27.517 { 00:14:27.517 "cntlid": 115, 00:14:27.517 "qid": 0, 00:14:27.517 "state": "enabled", 00:14:27.517 "thread": "nvmf_tgt_poll_group_000", 00:14:27.517 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:27.517 "listen_address": { 00:14:27.517 "trtype": "RDMA", 00:14:27.517 "adrfam": "IPv4", 00:14:27.517 "traddr": "10.0.0.2", 00:14:27.517 "trsvcid": "4420" 00:14:27.517 }, 00:14:27.517 "peer_address": { 00:14:27.517 "trtype": "RDMA", 00:14:27.517 "adrfam": "IPv4", 00:14:27.517 "traddr": "10.0.0.2", 00:14:27.517 "trsvcid": "34081" 00:14:27.517 }, 00:14:27.517 "auth": { 00:14:27.517 "state": "completed", 00:14:27.517 "digest": "sha512", 00:14:27.517 "dhgroup": "ffdhe3072" 00:14:27.517 } 00:14:27.517 } 00:14:27.517 ]' 00:14:27.517 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 
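The three jq probes that recur throughout this trace are how connect_authenticate confirms that the DH-HMAC-CHAP negotiation actually happened: it pulls the qpair list from the target and compares the reported digest, DH group and auth state against the values it just configured. A minimal bash sketch of that check, using the jq paths and the sha512/ffdhe3072 iteration visible above (rpc_cmd is the suite's target-side RPC helper; its expansion is hidden here by the surrounding xtrace_disable calls, so treat the exact invocation as an assumption):

qpairs=$(rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0)   # JSON array as printed in the log
[[ $(jq -r '.[0].auth.digest'  <<< "$qpairs") == sha512    ]]            # negotiated hash function
[[ $(jq -r '.[0].auth.dhgroup' <<< "$qpairs") == ffdhe3072 ]]            # negotiated DH group
[[ $(jq -r '.[0].auth.state'   <<< "$qpairs") == completed ]]            # handshake finished successfully

Any mismatch makes the corresponding [[ ]] test return non-zero, which is what fails the test case for that digest/dhgroup/key combination.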
00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:27.775 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:28.033 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:28.033 15:19:29 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:28.661 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:28.661 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe3072 2 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:28.919 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:28.920 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:28.920 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:28.920 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:28.920 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:28.920 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:28.920 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:29.178 00:14:29.178 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:29.178 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:29.178 15:19:30 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:29.435 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:29.436 { 00:14:29.436 "cntlid": 117, 00:14:29.436 "qid": 0, 00:14:29.436 "state": "enabled", 00:14:29.436 "thread": "nvmf_tgt_poll_group_000", 00:14:29.436 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:29.436 "listen_address": { 00:14:29.436 "trtype": "RDMA", 00:14:29.436 "adrfam": "IPv4", 00:14:29.436 "traddr": "10.0.0.2", 00:14:29.436 "trsvcid": "4420" 00:14:29.436 }, 00:14:29.436 "peer_address": { 00:14:29.436 "trtype": "RDMA", 00:14:29.436 "adrfam": "IPv4", 00:14:29.436 "traddr": "10.0.0.2", 00:14:29.436 "trsvcid": "34038" 00:14:29.436 }, 00:14:29.436 "auth": { 00:14:29.436 "state": "completed", 00:14:29.436 "digest": "sha512", 00:14:29.436 "dhgroup": "ffdhe3072" 00:14:29.436 } 00:14:29.436 } 00:14:29.436 ]' 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:29.436 15:19:31 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:29.436 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:29.693 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:29.693 15:19:31 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:30.260 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:30.519 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:30.519 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe3072 3 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe3072 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:30.777 15:19:32 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:30.777 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:31.035 00:14:31.035 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:31.035 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:31.035 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:31.293 { 00:14:31.293 "cntlid": 119, 00:14:31.293 "qid": 0, 00:14:31.293 "state": "enabled", 00:14:31.293 "thread": "nvmf_tgt_poll_group_000", 00:14:31.293 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:31.293 "listen_address": { 00:14:31.293 "trtype": "RDMA", 00:14:31.293 "adrfam": "IPv4", 00:14:31.293 "traddr": "10.0.0.2", 00:14:31.293 "trsvcid": "4420" 00:14:31.293 }, 00:14:31.293 "peer_address": { 00:14:31.293 "trtype": "RDMA", 00:14:31.293 "adrfam": "IPv4", 00:14:31.293 "traddr": "10.0.0.2", 00:14:31.293 "trsvcid": "56044" 00:14:31.293 }, 00:14:31.293 "auth": { 00:14:31.293 "state": "completed", 00:14:31.293 "digest": "sha512", 00:14:31.293 "dhgroup": "ffdhe3072" 00:14:31.293 } 00:14:31.293 } 00:14:31.293 ]' 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r 
'.[0].auth.digest' 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:31.293 15:19:32 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:31.293 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe3072 == \f\f\d\h\e\3\0\7\2 ]] 00:14:31.293 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:31.293 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:31.293 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:31.293 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:31.551 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:31.551 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:32.118 15:19:33 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:32.377 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:32.377 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe4096 0 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 
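Every digest/dhgroup/key iteration in this log follows the same shape. Sketched roughly in bash, with the subsystem NQN, host NQN and RDMA address taken verbatim from this run (hostrpc expands to rpc.py -s /var/tmp/host.sock, as the expanded lines above show; rpc_cmd is the target-side helper whose socket is not visible in this excerpt; key0/ckey0 are key names the suite loaded earlier, so their contents are not shown):

hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e
subnqn=nqn.2024-03.io.spdk:cnode0
# restrict the host bdev layer to the digest and DH group under test
hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096
# allow this host on the subsystem with the keys for this iteration
rpc_cmd nvmf_subsystem_add_host "$subnqn" "$hostnqn" --dhchap-key key0 --dhchap-ctrlr-key ckey0
# attach a controller over RDMA, authenticating with the same keys
hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 \
    -q "$hostnqn" -n "$subnqn" -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0
# verify the controller exists and the qpair reports a completed handshake
hostrpc bdev_nvme_get_controllers | jq -r '.[].name'
rpc_cmd nvmf_subsystem_get_qpairs "$subnqn"
# tear down before the next iteration
hostrpc bdev_nvme_detach_controller nvme0

The nvme connect/disconnect and nvmf_subsystem_remove_host lines that follow each such block in the trace repeat the same handshake from the kernel initiator using the raw DHHC-1 secrets, then clean up before the next key is tried.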
00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:32.636 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:32.895 00:14:32.895 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:32.895 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:32.895 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:33.153 { 00:14:33.153 "cntlid": 121, 00:14:33.153 "qid": 0, 00:14:33.153 "state": "enabled", 00:14:33.153 "thread": "nvmf_tgt_poll_group_000", 00:14:33.153 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:33.153 "listen_address": { 00:14:33.153 "trtype": "RDMA", 00:14:33.153 "adrfam": "IPv4", 00:14:33.153 "traddr": "10.0.0.2", 00:14:33.153 "trsvcid": "4420" 00:14:33.153 }, 00:14:33.153 "peer_address": { 00:14:33.153 "trtype": "RDMA", 00:14:33.153 "adrfam": "IPv4", 00:14:33.153 "traddr": "10.0.0.2", 00:14:33.153 "trsvcid": "48461" 00:14:33.153 }, 00:14:33.153 "auth": { 00:14:33.153 "state": 
"completed", 00:14:33.153 "digest": "sha512", 00:14:33.153 "dhgroup": "ffdhe4096" 00:14:33.153 } 00:14:33.153 } 00:14:33.153 ]' 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:33.153 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:33.154 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:33.154 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:14:33.154 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:33.154 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:33.154 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:33.154 15:19:34 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:33.412 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:33.412 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:33.979 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:34.237 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:34.238 15:19:35 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe4096 1 00:14:34.496 15:19:36 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:34.496 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:34.754 00:14:34.754 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:34.754 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:34.754 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:35.012 { 00:14:35.012 "cntlid": 123, 00:14:35.012 "qid": 0, 00:14:35.012 "state": "enabled", 00:14:35.012 "thread": "nvmf_tgt_poll_group_000", 00:14:35.012 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:35.012 "listen_address": { 00:14:35.012 "trtype": "RDMA", 00:14:35.012 
"adrfam": "IPv4", 00:14:35.012 "traddr": "10.0.0.2", 00:14:35.012 "trsvcid": "4420" 00:14:35.012 }, 00:14:35.012 "peer_address": { 00:14:35.012 "trtype": "RDMA", 00:14:35.012 "adrfam": "IPv4", 00:14:35.012 "traddr": "10.0.0.2", 00:14:35.012 "trsvcid": "34631" 00:14:35.012 }, 00:14:35.012 "auth": { 00:14:35.012 "state": "completed", 00:14:35.012 "digest": "sha512", 00:14:35.012 "dhgroup": "ffdhe4096" 00:14:35.012 } 00:14:35.012 } 00:14:35.012 ]' 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:35.012 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:35.271 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:35.271 15:19:36 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:35.836 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:36.093 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:36.093 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock 
bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe4096 2 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:36.351 15:19:37 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:36.610 00:14:36.610 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:36.610 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:36.610 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:36.868 { 00:14:36.868 "cntlid": 125, 00:14:36.868 "qid": 0, 00:14:36.868 "state": 
"enabled", 00:14:36.868 "thread": "nvmf_tgt_poll_group_000", 00:14:36.868 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:36.868 "listen_address": { 00:14:36.868 "trtype": "RDMA", 00:14:36.868 "adrfam": "IPv4", 00:14:36.868 "traddr": "10.0.0.2", 00:14:36.868 "trsvcid": "4420" 00:14:36.868 }, 00:14:36.868 "peer_address": { 00:14:36.868 "trtype": "RDMA", 00:14:36.868 "adrfam": "IPv4", 00:14:36.868 "traddr": "10.0.0.2", 00:14:36.868 "trsvcid": "49793" 00:14:36.868 }, 00:14:36.868 "auth": { 00:14:36.868 "state": "completed", 00:14:36.868 "digest": "sha512", 00:14:36.868 "dhgroup": "ffdhe4096" 00:14:36.868 } 00:14:36.868 } 00:14:36.868 ]' 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:36.868 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:37.126 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:37.126 15:19:38 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:37.692 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:37.950 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options 
--dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe4096 3 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:37.951 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe4096 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:38.208 15:19:39 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:38.466 00:14:38.466 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:38.466 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:38.466 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:38.466 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:38.466 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target 
-- target/auth.sh@74 -- # qpairs='[ 00:14:38.725 { 00:14:38.725 "cntlid": 127, 00:14:38.725 "qid": 0, 00:14:38.725 "state": "enabled", 00:14:38.725 "thread": "nvmf_tgt_poll_group_000", 00:14:38.725 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:38.725 "listen_address": { 00:14:38.725 "trtype": "RDMA", 00:14:38.725 "adrfam": "IPv4", 00:14:38.725 "traddr": "10.0.0.2", 00:14:38.725 "trsvcid": "4420" 00:14:38.725 }, 00:14:38.725 "peer_address": { 00:14:38.725 "trtype": "RDMA", 00:14:38.725 "adrfam": "IPv4", 00:14:38.725 "traddr": "10.0.0.2", 00:14:38.725 "trsvcid": "52125" 00:14:38.725 }, 00:14:38.725 "auth": { 00:14:38.725 "state": "completed", 00:14:38.725 "digest": "sha512", 00:14:38.725 "dhgroup": "ffdhe4096" 00:14:38.725 } 00:14:38.725 } 00:14:38.725 ]' 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe4096 == \f\f\d\h\e\4\0\9\6 ]] 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:38.725 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:38.982 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:38.982 15:19:40 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:39.548 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:39.548 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 
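The iterations that follow repeat the same configure/connect/verify/teardown cycle for every digest, DH group and key index. As a rough sketch of one pass (rpc.py path shortened; <hostnqn> stands for the nqn.2014-08.org.nvmexpress:uuid:00e1c02b-... host NQN used throughout this run; target-side calls go to the default RPC socket and host-side ones to /var/tmp/host.sock, mirroring the rpc_cmd/hostrpc split above — this is illustrative, not the verbatim target/auth.sh):

    # Host side: restrict the allowed DH-HMAC-CHAP digest and DH group for this pass
    rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144
    # Target side: register the host NQN on the subsystem with the key pair under test
    rpc.py nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 <hostnqn> --dhchap-key key0 --dhchap-ctrlr-key ckey0
    # Host side: authenticated attach, then check that the qpair negotiated what was configured
    rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 \
        -q <hostnqn> -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0
    rpc.py nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 | jq -r '.[0].auth.state'    # expect "completed"
    rpc.py nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 | jq -r '.[0].auth.digest'   # expect sha512
    rpc.py nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 | jq -r '.[0].auth.dhgroup'  # expect ffdhe6144
    # Tear down before the next combination
    rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0

After the SPDK-initiator check, the same secrets are exercised from the kernel initiator with nvme connect -t rdma ... --dhchap-secret ... --dhchap-ctrl-secret ..., followed by nvme disconnect and nvmf_subsystem_remove_host, which is the disconnect/remove pattern repeated through the rest of this log.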
00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe6144 0 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:39.806 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:40.374 00:14:40.374 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:40.374 15:19:41 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@10 -- # set +x 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:40.374 { 00:14:40.374 "cntlid": 129, 00:14:40.374 "qid": 0, 00:14:40.374 "state": "enabled", 00:14:40.374 "thread": "nvmf_tgt_poll_group_000", 00:14:40.374 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:40.374 "listen_address": { 00:14:40.374 "trtype": "RDMA", 00:14:40.374 "adrfam": "IPv4", 00:14:40.374 "traddr": "10.0.0.2", 00:14:40.374 "trsvcid": "4420" 00:14:40.374 }, 00:14:40.374 "peer_address": { 00:14:40.374 "trtype": "RDMA", 00:14:40.374 "adrfam": "IPv4", 00:14:40.374 "traddr": "10.0.0.2", 00:14:40.374 "trsvcid": "43200" 00:14:40.374 }, 00:14:40.374 "auth": { 00:14:40.374 "state": "completed", 00:14:40.374 "digest": "sha512", 00:14:40.374 "dhgroup": "ffdhe6144" 00:14:40.374 } 00:14:40.374 } 00:14:40.374 ]' 00:14:40.374 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:40.632 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:40.890 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:40.890 15:19:42 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:41.457 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:41.457 15:19:43 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:41.457 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe6144 1 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:41.716 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:42.283 00:14:42.283 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:42.283 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:42.283 15:19:43 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:42.283 
15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:42.283 { 00:14:42.283 "cntlid": 131, 00:14:42.283 "qid": 0, 00:14:42.283 "state": "enabled", 00:14:42.283 "thread": "nvmf_tgt_poll_group_000", 00:14:42.283 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:42.283 "listen_address": { 00:14:42.283 "trtype": "RDMA", 00:14:42.283 "adrfam": "IPv4", 00:14:42.283 "traddr": "10.0.0.2", 00:14:42.283 "trsvcid": "4420" 00:14:42.283 }, 00:14:42.283 "peer_address": { 00:14:42.283 "trtype": "RDMA", 00:14:42.283 "adrfam": "IPv4", 00:14:42.283 "traddr": "10.0.0.2", 00:14:42.283 "trsvcid": "51746" 00:14:42.283 }, 00:14:42.283 "auth": { 00:14:42.283 "state": "completed", 00:14:42.283 "digest": "sha512", 00:14:42.283 "dhgroup": "ffdhe6144" 00:14:42.283 } 00:14:42.283 } 00:14:42.283 ]' 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:42.283 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:42.542 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:14:42.542 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:42.542 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:42.542 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:42.542 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:42.801 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:42.801 15:19:44 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:43.367 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host 
nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:43.367 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe6144 2 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:43.626 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:43.884 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:44.141 { 00:14:44.141 "cntlid": 133, 00:14:44.141 "qid": 0, 00:14:44.141 "state": "enabled", 00:14:44.141 "thread": "nvmf_tgt_poll_group_000", 00:14:44.141 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:44.141 "listen_address": { 00:14:44.141 "trtype": "RDMA", 00:14:44.141 "adrfam": "IPv4", 00:14:44.141 "traddr": "10.0.0.2", 00:14:44.141 "trsvcid": "4420" 00:14:44.141 }, 00:14:44.141 "peer_address": { 00:14:44.141 "trtype": "RDMA", 00:14:44.141 "adrfam": "IPv4", 00:14:44.141 "traddr": "10.0.0.2", 00:14:44.141 "trsvcid": "60515" 00:14:44.141 }, 00:14:44.141 "auth": { 00:14:44.141 "state": "completed", 00:14:44.141 "digest": "sha512", 00:14:44.141 "dhgroup": "ffdhe6144" 00:14:44.141 } 00:14:44.141 } 00:14:44.141 ]' 00:14:44.141 15:19:45 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:44.399 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:44.658 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:44.658 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:45.226 15:19:46 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n 
nqn.2024-03.io.spdk:cnode0 00:14:45.226 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:45.226 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe6144 3 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe6144 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:45.484 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:46.052 00:14:46.052 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:46.052 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@73 -- # jq -r '.[].name' 00:14:46.052 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:46.052 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:46.053 { 00:14:46.053 "cntlid": 135, 00:14:46.053 "qid": 0, 00:14:46.053 "state": "enabled", 00:14:46.053 "thread": "nvmf_tgt_poll_group_000", 00:14:46.053 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:46.053 "listen_address": { 00:14:46.053 "trtype": "RDMA", 00:14:46.053 "adrfam": "IPv4", 00:14:46.053 "traddr": "10.0.0.2", 00:14:46.053 "trsvcid": "4420" 00:14:46.053 }, 00:14:46.053 "peer_address": { 00:14:46.053 "trtype": "RDMA", 00:14:46.053 "adrfam": "IPv4", 00:14:46.053 "traddr": "10.0.0.2", 00:14:46.053 "trsvcid": "50260" 00:14:46.053 }, 00:14:46.053 "auth": { 00:14:46.053 "state": "completed", 00:14:46.053 "digest": "sha512", 00:14:46.053 "dhgroup": "ffdhe6144" 00:14:46.053 } 00:14:46.053 } 00:14:46.053 ]' 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:46.053 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:46.312 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe6144 == \f\f\d\h\e\6\1\4\4 ]] 00:14:46.312 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:46.312 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:46.312 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:46.312 15:19:47 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:46.571 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:46.571 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme 
disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:47.140 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@119 -- # for dhgroup in "${dhgroups[@]}" 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:47.140 15:19:48 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe8192 0 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:47.399 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 
--dhchap-ctrlr-key ckey0 00:14:47.967 00:14:47.967 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:47.967 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:47.967 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:48.226 { 00:14:48.226 "cntlid": 137, 00:14:48.226 "qid": 0, 00:14:48.226 "state": "enabled", 00:14:48.226 "thread": "nvmf_tgt_poll_group_000", 00:14:48.226 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:48.226 "listen_address": { 00:14:48.226 "trtype": "RDMA", 00:14:48.226 "adrfam": "IPv4", 00:14:48.226 "traddr": "10.0.0.2", 00:14:48.226 "trsvcid": "4420" 00:14:48.226 }, 00:14:48.226 "peer_address": { 00:14:48.226 "trtype": "RDMA", 00:14:48.226 "adrfam": "IPv4", 00:14:48.226 "traddr": "10.0.0.2", 00:14:48.226 "trsvcid": "45223" 00:14:48.226 }, 00:14:48.226 "auth": { 00:14:48.226 "state": "completed", 00:14:48.226 "digest": "sha512", 00:14:48.226 "dhgroup": "ffdhe8192" 00:14:48.226 } 00:14:48.226 } 00:14:48.226 ]' 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:48.226 15:19:49 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:48.226 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:48.226 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:48.226 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:48.485 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:48.486 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:49.053 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:49.312 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:49.312 15:19:50 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe8192 1 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key1 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:49.571 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:49.571 15:19:51 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:50.140 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:50.140 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:50.140 { 00:14:50.140 "cntlid": 139, 00:14:50.140 "qid": 0, 00:14:50.140 "state": "enabled", 00:14:50.140 "thread": "nvmf_tgt_poll_group_000", 00:14:50.141 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:50.141 "listen_address": { 00:14:50.141 "trtype": "RDMA", 00:14:50.141 "adrfam": "IPv4", 00:14:50.141 "traddr": "10.0.0.2", 00:14:50.141 "trsvcid": "4420" 00:14:50.141 }, 00:14:50.141 "peer_address": { 00:14:50.141 "trtype": "RDMA", 00:14:50.141 "adrfam": "IPv4", 00:14:50.141 "traddr": "10.0.0.2", 00:14:50.141 "trsvcid": "35993" 00:14:50.141 }, 00:14:50.141 "auth": { 00:14:50.141 "state": "completed", 00:14:50.141 "digest": "sha512", 00:14:50.141 "dhgroup": "ffdhe8192" 00:14:50.141 } 00:14:50.141 } 00:14:50.141 ]' 00:14:50.141 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:50.141 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:50.141 15:19:51 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:50.400 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:50.400 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:50.400 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:50.400 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:50.400 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:50.659 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret 
DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:50.659 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: --dhchap-ctrl-secret DHHC-1:02:NjU2ZWM4YWUxMjVkZTBjYWU3YmMwNTljZDNkZWZjNWIyNjU4NjBlMWFkMGRlN2JhwLLAMQ==: 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:51.227 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:51.227 15:19:52 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe8192 2 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key2 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # 
hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:51.486 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:14:52.156 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:52.156 { 00:14:52.156 "cntlid": 141, 00:14:52.156 "qid": 0, 00:14:52.156 "state": "enabled", 00:14:52.156 "thread": "nvmf_tgt_poll_group_000", 00:14:52.156 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:52.156 "listen_address": { 00:14:52.156 "trtype": "RDMA", 00:14:52.156 "adrfam": "IPv4", 00:14:52.156 "traddr": "10.0.0.2", 00:14:52.156 "trsvcid": "4420" 00:14:52.156 }, 00:14:52.156 "peer_address": { 00:14:52.156 "trtype": "RDMA", 00:14:52.156 "adrfam": "IPv4", 00:14:52.156 "traddr": "10.0.0.2", 00:14:52.156 "trsvcid": "48469" 00:14:52.156 }, 00:14:52.156 "auth": { 00:14:52.156 "state": "completed", 00:14:52.156 "digest": "sha512", 00:14:52.156 "dhgroup": "ffdhe8192" 00:14:52.156 } 00:14:52.156 } 00:14:52.156 ]' 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:52.156 15:19:53 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:52.447 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:01:MzMxNjlhMmRjNjdiMGZlNGE3YTM0ODg4N2JlNjBkYjOe9vCq: 00:14:53.385 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:53.385 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:53.385 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:53.385 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:53.385 15:19:54 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@120 -- # for keyid in "${!keys[@]}" 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@121 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@123 -- # connect_authenticate sha512 ffdhe8192 3 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:53.385 15:19:55 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:53.385 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:14:53.953 00:14:53.953 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:53.953 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:53.953 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:54.216 { 00:14:54.216 "cntlid": 143, 00:14:54.216 "qid": 0, 00:14:54.216 "state": "enabled", 00:14:54.216 "thread": "nvmf_tgt_poll_group_000", 00:14:54.216 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:54.216 "listen_address": { 00:14:54.216 "trtype": "RDMA", 00:14:54.216 "adrfam": "IPv4", 00:14:54.216 "traddr": "10.0.0.2", 00:14:54.216 "trsvcid": "4420" 00:14:54.216 }, 00:14:54.216 "peer_address": { 00:14:54.216 "trtype": "RDMA", 00:14:54.216 "adrfam": "IPv4", 00:14:54.216 "traddr": "10.0.0.2", 00:14:54.216 "trsvcid": "37210" 00:14:54.216 }, 00:14:54.216 "auth": { 00:14:54.216 "state": "completed", 00:14:54.216 "digest": "sha512", 00:14:54.216 "dhgroup": "ffdhe8192" 00:14:54.216 } 00:14:54.216 } 00:14:54.216 ]' 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:14:54.216 15:19:55 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:54.216 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:54.216 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:54.477 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:54.477 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # 
hostrpc bdev_nvme_detach_controller nvme0 00:14:54.477 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:54.477 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:54.477 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:14:55.415 15:19:56 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:55.415 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@129 -- # IFS=, 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@130 -- # printf %s sha256,sha384,sha512 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@129 -- # IFS=, 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@130 -- # printf %s null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@129 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256,sha384,sha512 --dhchap-dhgroups null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256,sha384,sha512 --dhchap-dhgroups null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@141 -- # connect_authenticate sha512 ffdhe8192 0 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key0 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host 
nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:55.415 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:55.675 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:55.675 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:55.675 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:14:55.933 00:14:55.933 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:14:55.933 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:14:55.933 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:14:56.192 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:14:56.192 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:14:56.192 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:56.192 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:56.192 15:19:57 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:56.192 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:14:56.192 { 00:14:56.192 "cntlid": 145, 00:14:56.192 "qid": 0, 00:14:56.192 "state": "enabled", 00:14:56.192 "thread": "nvmf_tgt_poll_group_000", 00:14:56.192 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:56.192 "listen_address": { 00:14:56.192 "trtype": "RDMA", 00:14:56.192 "adrfam": "IPv4", 00:14:56.192 "traddr": "10.0.0.2", 00:14:56.192 "trsvcid": "4420" 00:14:56.192 }, 00:14:56.192 "peer_address": { 00:14:56.192 "trtype": "RDMA", 00:14:56.192 "adrfam": "IPv4", 00:14:56.192 "traddr": "10.0.0.2", 00:14:56.192 "trsvcid": "48005" 00:14:56.192 }, 00:14:56.192 "auth": { 00:14:56.192 "state": "completed", 00:14:56.192 "digest": "sha512", 00:14:56.192 "dhgroup": "ffdhe8192" 00:14:56.192 } 00:14:56.192 } 00:14:56.192 ]' 00:14:56.192 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 
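The qpair dump above, together with the jq checks around it, is how the test confirms that the session actually completed DH-HMAC-CHAP with the expected parameters rather than connecting unauthenticated. Assuming the same subsystem NQN and the default target RPC socket, the verification reduces to:

  # target side: dump qpairs for the subsystem and inspect the auth block
  scripts/rpc.py nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 > qpairs.json
  jq -r '.[0].auth.state'   qpairs.json   # expect "completed"
  jq -r '.[0].auth.digest'  qpairs.json   # expect "sha512"
  jq -r '.[0].auth.dhgroup' qpairs.json   # expect "ffdhe8192"
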
00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:14:56.452 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:14:56.711 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # nvme_connect --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:56.711 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:00:MzRiYTkyNWZiYzY1OTdiNjU5Yjc1M2VjZjNjNWY5YmEzMDQzNTdiOTU5MDgzNTMyL0jSKA==: --dhchap-ctrl-secret DHHC-1:03:YmFkOWNiMmZjZjkxODRkZWJjN2JhMmI5ZDg4NzIzMDM4NGIyMjkyYWExYmYxNjhjYTY1YTMxNzkxN2VlMTU1MX9fLvE=: 00:14:57.279 15:19:58 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:14:57.279 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:14:57.279 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:57.279 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:57.279 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:57.279 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@144 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@145 -- # NOT bdev_connect -b nvme0 --dhchap-key key2 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key2 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@638 -- # local arg=bdev_connect 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key2 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 00:14:57.280 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 00:14:57.848 request: 00:14:57.848 { 00:14:57.848 "name": "nvme0", 00:14:57.848 "trtype": "rdma", 00:14:57.848 "traddr": "10.0.0.2", 00:14:57.848 "adrfam": "ipv4", 00:14:57.848 "trsvcid": "4420", 00:14:57.848 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:14:57.848 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:57.848 "prchk_reftag": false, 00:14:57.848 "prchk_guard": false, 00:14:57.848 "hdgst": false, 00:14:57.848 "ddgst": false, 00:14:57.848 "dhchap_key": "key2", 00:14:57.848 "allow_unrecognized_csi": false, 00:14:57.848 "method": "bdev_nvme_attach_controller", 00:14:57.848 "req_id": 1 00:14:57.848 } 00:14:57.848 Got JSON-RPC error response 00:14:57.848 response: 00:14:57.848 { 00:14:57.848 "code": -5, 00:14:57.848 "message": "Input/output error" 00:14:57.848 } 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@146 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@149 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:57.848 15:19:59 
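The request/response pair logged just above is an expected failure: the host offers --dhchap-key key2 while the subsystem's host entry only carries key1, so bdev_nvme_attach_controller comes back with JSON-RPC code -5 (Input/output error) and the harness's NOT wrapper treats the non-zero exit as a pass. A minimal stand-alone version of that assertion, using the same sockets and NQNs as this run and a plain if-check instead of the NOT helper, might look like:

  # this attach is supposed to fail: the subsystem host entry does not hold key2
  if scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 \
        -a 10.0.0.2 -s 4420 \
        -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e \
        -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2; then
      echo "unexpected success: attach with unknown key2 should have been rejected" >&2
      exit 1
  fi
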
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@150 -- # NOT bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=bdev_connect 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:14:57.848 15:19:59 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:14:58.417 request: 00:14:58.417 { 00:14:58.417 "name": "nvme0", 00:14:58.417 "trtype": "rdma", 00:14:58.417 "traddr": "10.0.0.2", 00:14:58.417 "adrfam": "ipv4", 00:14:58.417 "trsvcid": "4420", 00:14:58.417 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:14:58.417 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:58.417 "prchk_reftag": false, 00:14:58.417 "prchk_guard": false, 00:14:58.417 "hdgst": false, 00:14:58.417 "ddgst": false, 00:14:58.417 "dhchap_key": "key1", 00:14:58.417 "dhchap_ctrlr_key": "ckey2", 00:14:58.417 "allow_unrecognized_csi": false, 00:14:58.417 "method": "bdev_nvme_attach_controller", 00:14:58.417 "req_id": 1 00:14:58.417 } 00:14:58.417 Got JSON-RPC error response 00:14:58.417 response: 00:14:58.417 { 00:14:58.417 "code": -5, 00:14:58.417 "message": "Input/output error" 00:14:58.417 } 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@151 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@154 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@155 -- # NOT bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=bdev_connect 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:58.417 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:14:58.985 request: 00:14:58.985 { 00:14:58.985 "name": "nvme0", 00:14:58.985 "trtype": "rdma", 00:14:58.985 "traddr": "10.0.0.2", 00:14:58.985 "adrfam": "ipv4", 00:14:58.985 "trsvcid": "4420", 00:14:58.985 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:14:58.985 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:14:58.985 "prchk_reftag": false, 00:14:58.985 "prchk_guard": false, 00:14:58.985 "hdgst": false, 00:14:58.985 "ddgst": false, 00:14:58.985 "dhchap_key": "key1", 00:14:58.985 "dhchap_ctrlr_key": "ckey1", 00:14:58.985 "allow_unrecognized_csi": false, 00:14:58.985 "method": "bdev_nvme_attach_controller", 00:14:58.985 "req_id": 1 
00:14:58.985 } 00:14:58.985 Got JSON-RPC error response 00:14:58.985 response: 00:14:58.985 { 00:14:58.985 "code": -5, 00:14:58.985 "message": "Input/output error" 00:14:58.985 } 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@156 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@159 -- # killprocess 1771121 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@950 -- # '[' -z 1771121 ']' 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@954 -- # kill -0 1771121 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@955 -- # uname 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1771121 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1771121' 00:14:58.985 killing process with pid 1771121 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@969 -- # kill 1771121 00:14:58.985 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@974 -- # wait 1771121 00:14:59.245 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@160 -- # nvmfappstart --wait-for-rpc -L nvmf_auth 00:14:59.245 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:14:59.245 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@724 -- # xtrace_disable 00:14:59.245 15:20:00 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@324 -- # nvmfpid=1791302 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF --wait-for-rpc -L nvmf_auth 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@325 -- # waitforlisten 
1791302 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@831 -- # '[' -z 1791302 ']' 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:59.245 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@864 -- # return 0 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@730 -- # xtrace_disable 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@161 -- # trap 'dumplogs; cleanup' SIGINT SIGTERM EXIT 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@163 -- # waitforlisten 1791302 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@831 -- # '[' -z 1791302 ']' 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:15:00.182 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
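At this point the test kills the first target process (pid 1771121) and relaunches nvmf_tgt for the keyring-based scenarios, with the nvmf_auth log flag enabled and initialization held back until told to proceed over RPC. Stripped of the harness helpers (waitforlisten and friends), the relaunch is roughly the sketch below; the polling loop with spdk_get_version is one possible way to wait, not what the harness literally runs:

  # start the target with auth tracing; --wait-for-rpc defers subsystem
  # initialization until it is completed over RPC
  ./build/bin/nvmf_tgt -i 0 -e 0xFFFF --wait-for-rpc -L nvmf_auth &
  # wait until the app answers on its default RPC socket before configuring it
  until scripts/rpc.py spdk_get_version >/dev/null 2>&1; do sleep 1; done
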
00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:00.182 15:20:01 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.442 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:00.442 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@864 -- # return 0 00:15:00.442 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@164 -- # rpc_cmd 00:15:00.442 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.442 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.442 null0 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@174 -- # for i in "${!keys[@]}" 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@175 -- # rpc_cmd keyring_file_add_key key0 /tmp/spdk.key-null.NVj 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # [[ -n /tmp/spdk.key-sha512.Dgs ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # rpc_cmd keyring_file_add_key ckey0 /tmp/spdk.key-sha512.Dgs 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@174 -- # for i in "${!keys[@]}" 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@175 -- # rpc_cmd keyring_file_add_key key1 /tmp/spdk.key-sha256.1MK 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # [[ -n /tmp/spdk.key-sha384.yGT ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # rpc_cmd keyring_file_add_key ckey1 /tmp/spdk.key-sha384.yGT 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@174 -- # for i in "${!keys[@]}" 
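The loop running here loads the DH-HMAC-CHAP secrets from files into the target's keyring instead of passing them inline, so that later RPCs such as nvmf_subsystem_add_host can refer to them by name (key0/ckey0, key1/ckey1, ...). Per key pair the step is just keyring_file_add_key; for the first pair in this run, with the temporary file names the test generated:

  # register a host key and its controller (bidirectional) counterpart by name
  scripts/rpc.py keyring_file_add_key key0  /tmp/spdk.key-null.NVj
  scripts/rpc.py keyring_file_add_key ckey0 /tmp/spdk.key-sha512.Dgs
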
00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@175 -- # rpc_cmd keyring_file_add_key key2 /tmp/spdk.key-sha384.qrk 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # [[ -n /tmp/spdk.key-sha256.GOT ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # rpc_cmd keyring_file_add_key ckey2 /tmp/spdk.key-sha256.GOT 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@174 -- # for i in "${!keys[@]}" 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@175 -- # rpc_cmd keyring_file_add_key key3 /tmp/spdk.key-sha512.dEA 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@176 -- # [[ -n '' ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@179 -- # connect_authenticate sha512 ffdhe8192 3 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@65 -- # local digest dhgroup key ckey qpairs 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # digest=sha512 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # dhgroup=ffdhe8192 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@67 -- # key=key3 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@68 -- # ckey=(${ckeys[$3]:+--dhchap-ctrlr-key "ckey$3"}) 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@70 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@71 -- # bdev_connect -b nvme0 --dhchap-key key3 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n 
nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:15:00.701 15:20:02 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:15:01.638 nvme0n1 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # hostrpc bdev_nvme_get_controllers 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # jq -r '.[].name' 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@73 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # rpc_cmd nvmf_subsystem_get_qpairs nqn.2024-03.io.spdk:cnode0 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@74 -- # qpairs='[ 00:15:01.638 { 00:15:01.638 "cntlid": 1, 00:15:01.638 "qid": 0, 00:15:01.638 "state": "enabled", 00:15:01.638 "thread": "nvmf_tgt_poll_group_000", 00:15:01.638 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:15:01.638 "listen_address": { 00:15:01.638 "trtype": "RDMA", 00:15:01.638 "adrfam": "IPv4", 00:15:01.638 "traddr": "10.0.0.2", 00:15:01.638 "trsvcid": "4420" 00:15:01.638 }, 00:15:01.638 "peer_address": { 00:15:01.638 "trtype": "RDMA", 00:15:01.638 "adrfam": "IPv4", 00:15:01.638 "traddr": "10.0.0.2", 00:15:01.638 "trsvcid": "60218" 00:15:01.638 }, 00:15:01.638 "auth": { 00:15:01.638 "state": "completed", 00:15:01.638 "digest": "sha512", 00:15:01.638 "dhgroup": "ffdhe8192" 00:15:01.638 } 00:15:01.638 } 00:15:01.638 ]' 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # jq -r '.[0].auth.digest' 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@75 -- # [[ sha512 == \s\h\a\5\1\2 ]] 00:15:01.638 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # jq -r '.[0].auth.dhgroup' 00:15:01.897 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@76 -- # [[ ffdhe8192 == \f\f\d\h\e\8\1\9\2 ]] 00:15:01.897 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # jq -r '.[0].auth.state' 00:15:01.897 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@77 -- # [[ completed == \c\o\m\p\l\e\t\e\d ]] 00:15:01.897 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@78 -- # hostrpc bdev_nvme_detach_controller nvme0 00:15:01.897 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:15:02.156 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@80 -- # 
nvme_connect --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:15:02.156 15:20:03 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@82 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:15:02.723 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@83 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@182 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key3 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@183 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256 00:15:02.723 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@184 -- # NOT bdev_connect -b nvme0 --dhchap-key key3 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key3 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=bdev_connect 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key3 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q 
nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:15:02.982 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:15:03.241 request: 00:15:03.241 { 00:15:03.241 "name": "nvme0", 00:15:03.241 "trtype": "rdma", 00:15:03.241 "traddr": "10.0.0.2", 00:15:03.241 "adrfam": "ipv4", 00:15:03.241 "trsvcid": "4420", 00:15:03.241 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:15:03.241 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:15:03.241 "prchk_reftag": false, 00:15:03.241 "prchk_guard": false, 00:15:03.241 "hdgst": false, 00:15:03.241 "ddgst": false, 00:15:03.241 "dhchap_key": "key3", 00:15:03.241 "allow_unrecognized_csi": false, 00:15:03.241 "method": "bdev_nvme_attach_controller", 00:15:03.241 "req_id": 1 00:15:03.241 } 00:15:03.241 Got JSON-RPC error response 00:15:03.241 response: 00:15:03.241 { 00:15:03.241 "code": -5, 00:15:03.241 "message": "Input/output error" 00:15:03.241 } 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@187 -- # IFS=, 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@188 -- # printf %s sha256,sha384,sha512 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@187 -- # hostrpc bdev_nvme_set_options --dhchap-dhgroups ffdhe2048 --dhchap-digests sha256,sha384,sha512 00:15:03.241 15:20:04 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-dhgroups ffdhe2048 --dhchap-digests sha256,sha384,sha512 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@193 -- # NOT bdev_connect -b nvme0 --dhchap-key key3 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key3 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=bdev_connect 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key3 00:15:03.500 15:20:05 
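The failure above is the parameter-mismatch case: the host was narrowed to sha256 only via bdev_nvme_set_options, which is incompatible with the sha512/ffdhe8192 setup key3 is used with elsewhere in this run, so the attach is expected to return -5; the step that follows repeats the same idea with the dhgroups limited to ffdhe2048. Reduced to rpc.py calls under the same assumptions as the earlier sketches, the pattern is:

  # provoke a negotiation failure by narrowing the host's allowed digests
  scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256
  # this attach is expected to fail with JSON-RPC error -5
  scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 \
      -a 10.0.0.2 -s 4420 \
      -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e \
      -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 && exit 1
  # restore the full digest/dhgroup lists before the next scenario
  scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options \
      --dhchap-digests sha256,sha384,sha512 \
      --dhchap-dhgroups null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192
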
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:15:03.500 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key3 00:15:03.759 request: 00:15:03.759 { 00:15:03.759 "name": "nvme0", 00:15:03.759 "trtype": "rdma", 00:15:03.759 "traddr": "10.0.0.2", 00:15:03.759 "adrfam": "ipv4", 00:15:03.759 "trsvcid": "4420", 00:15:03.759 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:15:03.759 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:15:03.759 "prchk_reftag": false, 00:15:03.759 "prchk_guard": false, 00:15:03.759 "hdgst": false, 00:15:03.759 "ddgst": false, 00:15:03.759 "dhchap_key": "key3", 00:15:03.759 "allow_unrecognized_csi": false, 00:15:03.759 "method": "bdev_nvme_attach_controller", 00:15:03.759 "req_id": 1 00:15:03.759 } 00:15:03.759 Got JSON-RPC error response 00:15:03.759 response: 00:15:03.759 { 00:15:03.759 "code": -5, 00:15:03.759 "message": "Input/output error" 00:15:03.759 } 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@197 -- # IFS=, 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@198 -- # printf %s sha256,sha384,sha512 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@197 -- # IFS=, 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@198 -- # printf %s null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@197 -- # hostrpc bdev_nvme_set_options --dhchap-digests sha256,sha384,sha512 --dhchap-dhgroups null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:15:03.759 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_options --dhchap-digests sha256,sha384,sha512 --dhchap-dhgroups null,ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@208 -- # rpc_cmd nvmf_subsystem_remove_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:04.018 15:20:05 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@209 -- # rpc_cmd nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@210 -- # NOT bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=bdev_connect 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:04.018 15:20:05 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:04.277 request: 00:15:04.277 { 00:15:04.277 "name": "nvme0", 00:15:04.277 "trtype": "rdma", 00:15:04.277 "traddr": "10.0.0.2", 00:15:04.277 "adrfam": "ipv4", 00:15:04.277 "trsvcid": "4420", 00:15:04.277 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:15:04.277 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:15:04.277 "prchk_reftag": false, 00:15:04.277 "prchk_guard": false, 00:15:04.277 "hdgst": false, 00:15:04.277 "ddgst": false, 00:15:04.277 "dhchap_key": "key0", 00:15:04.277 "dhchap_ctrlr_key": "key1", 00:15:04.277 "allow_unrecognized_csi": false, 00:15:04.277 "method": "bdev_nvme_attach_controller", 00:15:04.277 "req_id": 1 00:15:04.277 } 00:15:04.277 Got JSON-RPC error response 00:15:04.277 response: 00:15:04.277 { 00:15:04.277 "code": -5, 00:15:04.277 "message": "Input/output error" 00:15:04.277 } 00:15:04.277 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:15:04.277 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:04.277 15:20:06 
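This -5 is the mirror image of the earlier failures: the host NQN has just been re-added to the subsystem without any DH-HMAC-CHAP key, and the host still offers key0/key1 when attaching, which the target rejects. In other words, credentials have to match on both ends; offering keys the subsystem entry has no record of fails just like offering the wrong ones. Sketched with the same conventions as above:

  # target side: re-add the host entry with no DH-HMAC-CHAP key at all
  scripts/rpc.py nvmf_subsystem_add_host nqn.2024-03.io.spdk:cnode0 \
      nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e
  # host side: offering key0/key1 anyway must be rejected (JSON-RPC -5)
  ! scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 \
      -a 10.0.0.2 -s 4420 \
      -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e \
      -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1
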
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:04.277 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:04.277 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@213 -- # bdev_connect -b nvme0 --dhchap-key key0 00:15:04.277 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 00:15:04.277 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 00:15:04.539 nvme0n1 00:15:04.539 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@214 -- # hostrpc bdev_nvme_get_controllers 00:15:04.539 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@214 -- # jq -r '.[].name' 00:15:04.539 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:04.798 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@214 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:04.798 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@215 -- # hostrpc bdev_nvme_detach_controller nvme0 00:15:04.798 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@218 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@219 -- # bdev_connect -b nvme0 --dhchap-key key1 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 00:15:05.057 15:20:06 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 00:15:05.993 nvme0n1 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@220 -- # hostrpc bdev_nvme_get_controllers 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@220 -- # jq -r '.[].name' 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@220 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@222 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@223 -- # hostrpc bdev_nvme_get_controllers 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@223 -- # jq -r '.[].name' 00:15:05.993 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:06.252 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@223 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:06.252 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@225 -- # nvme_connect --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:15:06.252 15:20:07 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@36 -- # nvme connect -t rdma -a 10.0.0.2 -n nqn.2024-03.io.spdk:cnode0 -i 1 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid 00e1c02b-5999-e811-99d6-a4bf01488b4e -l 0 --dhchap-secret DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: --dhchap-ctrl-secret DHHC-1:03:ZWYxNDY1OGU4ZWM3NTVkNjkyOWI4ODhiNjExOTE0YjM5MWZhY2YwODFiZDAxYzY5NzhjOWFkNzk5Y2Y2MjBkZhiOLv0=: 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@226 -- # nvme_get_ctrlr 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@41 -- # local dev 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@43 -- # for dev in /sys/devices/virtual/nvme-fabrics/ctl/nvme* 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@44 -- # [[ nqn.2024-03.io.spdk:cnode0 == \n\q\n\.\2\0\2\4\-\0\3\.\i\o\.\s\p\d\k\:\c\n\o\d\e\0 ]] 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@44 -- # echo nvme0 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@44 -- # break 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@226 -- # nctrlr=nvme0 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@227 -- # hostrpc bdev_nvme_detach_controller nvme0 00:15:06.818 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock 
bdev_nvme_detach_controller nvme0 00:15:07.076 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@228 -- # NOT bdev_connect -b nvme0 --dhchap-key key1 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg bdev_connect -b nvme0 --dhchap-key key1 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=bdev_connect 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t bdev_connect 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # bdev_connect -b nvme0 --dhchap-key key1 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 00:15:07.077 15:20:08 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key1 00:15:07.643 request: 00:15:07.643 { 00:15:07.643 "name": "nvme0", 00:15:07.643 "trtype": "rdma", 00:15:07.643 "traddr": "10.0.0.2", 00:15:07.643 "adrfam": "ipv4", 00:15:07.643 "trsvcid": "4420", 00:15:07.643 "subnqn": "nqn.2024-03.io.spdk:cnode0", 00:15:07.643 "hostnqn": "nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e", 00:15:07.643 "prchk_reftag": false, 00:15:07.643 "prchk_guard": false, 00:15:07.643 "hdgst": false, 00:15:07.643 "ddgst": false, 00:15:07.643 "dhchap_key": "key1", 00:15:07.643 "allow_unrecognized_csi": false, 00:15:07.643 "method": "bdev_nvme_attach_controller", 00:15:07.643 "req_id": 1 00:15:07.643 } 00:15:07.643 Got JSON-RPC error response 00:15:07.643 response: 00:15:07.643 { 00:15:07.643 "code": -5, 00:15:07.643 "message": "Input/output error" 00:15:07.643 } 00:15:07.643 15:20:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:15:07.643 15:20:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:07.643 15:20:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:07.643 15:20:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:07.643 15:20:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@229 -- # bdev_connect -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:07.643 15:20:09 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:07.643 15:20:09 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:08.209 nvme0n1 00:15:08.209 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@230 -- # hostrpc bdev_nvme_get_controllers 00:15:08.210 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@230 -- # jq -r '.[].name' 00:15:08.210 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:08.468 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@230 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:08.468 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@231 -- # hostrpc bdev_nvme_detach_controller nvme0 00:15:08.468 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:15:08.726 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@233 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:08.726 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:08.727 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:08.727 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:08.727 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@234 -- # bdev_connect -b nvme0 00:15:08.727 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 00:15:08.727 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 00:15:08.985 nvme0n1 00:15:08.985 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@235 -- # hostrpc bdev_nvme_get_controllers 00:15:08.985 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@235 -- # jq -r '.[].name' 00:15:08.985 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:09.244 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@235 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:09.244 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@236 -- # hostrpc bdev_nvme_detach_controller nvme0 00:15:09.244 15:20:10 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_detach_controller nvme0 00:15:09.503 
15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@239 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key key3 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@240 -- # nvme_set_keys nvme0 DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: '' 2s 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@49 -- # local ctl key ckey dev timeout 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # ctl=nvme0 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # key=DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # ckey= 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # timeout=2s 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@52 -- # dev=/sys/devices/virtual/nvme-fabrics/ctl/nvme0 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@54 -- # [[ -z DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: ]] 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@54 -- # echo DHHC-1:01:NDhjZjFjMjUyZmFlOWUxZTFlMTc5MjFiNzNiZmU3Nji7Dby4: 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@55 -- # [[ -z '' ]] 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@56 -- # [[ -z 2s ]] 00:15:09.503 15:20:11 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@56 -- # sleep 2s 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@241 -- # waitforblk nvme0n1 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1235 -- # local i=0 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1236 -- # grep -q -w nvme0n1 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1242 -- # grep -q -w nvme0n1 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1246 -- # return 0 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@243 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key1 --dhchap-ctrlr-key key2 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@244 -- # nvme_set_keys nvme0 '' DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: 2s 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@49 -- # local ctl key ckey dev timeout 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # ctl=nvme0 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # key= 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # ckey=DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: 00:15:11.404 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@51 -- # timeout=2s 00:15:11.405 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@52 -- # dev=/sys/devices/virtual/nvme-fabrics/ctl/nvme0 00:15:11.405 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@54 -- # [[ -z '' ]] 00:15:11.405 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@55 -- # [[ -z DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: ]] 00:15:11.405 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@55 -- # echo DHHC-1:02:NzM1ODhlYjFmZWNkOWNjYTNjMDFmZTAwMjNhZGQ4NTg3MzU0MjdhMmFhYTgxNWE0KOX/fQ==: 00:15:11.405 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@56 -- # [[ -z 2s ]] 00:15:11.405 15:20:13 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@56 -- # sleep 2s 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@245 -- # waitforblk nvme0n1 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1235 -- # local i=0 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1236 -- # grep -q -w nvme0n1 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1242 -- # grep -q -w nvme0n1 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1246 -- # return 0 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@246 -- # nvme disconnect -n nqn.2024-03.io.spdk:cnode0 00:15:13.937 NQN:nqn.2024-03.io.spdk:cnode0 disconnected 1 controller(s) 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@249 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@250 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 --ctrlr-loss-timeout-sec 1 
--reconnect-delay-sec 1 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 --ctrlr-loss-timeout-sec 1 --reconnect-delay-sec 1 00:15:13.937 15:20:15 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 --ctrlr-loss-timeout-sec 1 --reconnect-delay-sec 1 00:15:14.510 nvme0n1 00:15:14.510 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@252 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:14.510 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:14.510 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:14.510 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:14.510 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@253 -- # hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:14.510 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_keys nvme0 --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@254 -- # hostrpc bdev_nvme_get_controllers 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@254 -- # jq -r '.[].name' 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@254 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@256 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@257 -- # hostrpc bdev_nvme_set_keys nvme0 00:15:15.078 15:20:16 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_keys nvme0 00:15:15.337 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@258 -- # hostrpc bdev_nvme_get_controllers 00:15:15.337 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@258 -- # jq -r '.[].name' 00:15:15.337 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@258 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@260 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@261 -- # NOT hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key1 --dhchap-ctrlr-key key3 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # valid_exec_arg hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key1 --dhchap-ctrlr-key key3 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=hostrpc 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t hostrpc 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key1 --dhchap-ctrlr-key key3 00:15:15.596 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_keys nvme0 --dhchap-key key1 --dhchap-ctrlr-key key3 00:15:15.854 request: 00:15:15.854 { 00:15:15.854 "name": "nvme0", 00:15:15.854 "dhchap_key": "key1", 00:15:15.854 "dhchap_ctrlr_key": "key3", 00:15:15.854 "method": "bdev_nvme_set_keys", 00:15:15.854 "req_id": 1 00:15:15.854 } 00:15:15.854 Got JSON-RPC error response 00:15:15.854 response: 00:15:15.854 { 00:15:15.854 "code": -13, 00:15:15.854 "message": "Permission denied" 00:15:15.854 } 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@262 -- # hostrpc bdev_nvme_get_controllers 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@262 -- # jq length 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- 
target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@262 -- # (( 1 != 0 )) 00:15:16.113 15:20:17 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@263 -- # sleep 1s 00:15:17.490 15:20:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@262 -- # hostrpc bdev_nvme_get_controllers 00:15:17.490 15:20:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@262 -- # jq length 00:15:17.490 15:20:18 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@262 -- # (( 0 != 0 )) 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@267 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key0 --dhchap-ctrlr-key key1 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@268 -- # bdev_connect -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 --ctrlr-loss-timeout-sec 1 --reconnect-delay-sec 1 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@60 -- # hostrpc bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 --ctrlr-loss-timeout-sec 1 --reconnect-delay-sec 1 00:15:17.490 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_attach_controller -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e -n nqn.2024-03.io.spdk:cnode0 -b nvme0 --dhchap-key key0 --dhchap-ctrlr-key key1 --ctrlr-loss-timeout-sec 1 --reconnect-delay-sec 1 00:15:18.057 nvme0n1 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@270 -- # rpc_cmd nvmf_subsystem_set_keys nqn.2024-03.io.spdk:cnode0 nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --dhchap-key key2 --dhchap-ctrlr-key key3 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@271 -- # NOT hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key2 --dhchap-ctrlr-key key0 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@650 -- # local es=0 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@652 -- # 
valid_exec_arg hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key2 --dhchap-ctrlr-key key0 00:15:18.057 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@638 -- # local arg=hostrpc 00:15:18.316 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:18.316 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # type -t hostrpc 00:15:18.316 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:18.316 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # hostrpc bdev_nvme_set_keys nvme0 --dhchap-key key2 --dhchap-ctrlr-key key0 00:15:18.316 15:20:19 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_set_keys nvme0 --dhchap-key key2 --dhchap-ctrlr-key key0 00:15:18.574 request: 00:15:18.574 { 00:15:18.574 "name": "nvme0", 00:15:18.574 "dhchap_key": "key2", 00:15:18.574 "dhchap_ctrlr_key": "key0", 00:15:18.574 "method": "bdev_nvme_set_keys", 00:15:18.574 "req_id": 1 00:15:18.574 } 00:15:18.574 Got JSON-RPC error response 00:15:18.574 response: 00:15:18.574 { 00:15:18.574 "code": -13, 00:15:18.574 "message": "Permission denied" 00:15:18.574 } 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@653 -- # es=1 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@272 -- # hostrpc bdev_nvme_get_controllers 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@272 -- # jq length 00:15:18.575 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:18.833 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@272 -- # (( 1 != 0 )) 00:15:18.833 15:20:20 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@273 -- # sleep 1s 00:15:19.768 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@272 -- # hostrpc bdev_nvme_get_controllers 00:15:19.768 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@272 -- # jq length 00:15:19.768 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/host.sock bdev_nvme_get_controllers 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@272 -- # (( 0 != 0 )) 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@276 -- # trap - SIGINT SIGTERM EXIT 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@277 -- # cleanup 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@21 -- # killprocess 1771309 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@950 -- # '[' -z 1771309 ']' 00:15:20.027 15:20:21 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@954 -- # kill -0 1771309 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@955 -- # uname 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1771309 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1771309' 00:15:20.027 killing process with pid 1771309 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@969 -- # kill 1771309 00:15:20.027 15:20:21 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@974 -- # wait 1771309 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@22 -- # nvmftestfini 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@331 -- # nvmfcleanup 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@99 -- # sync 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@102 -- # set +e 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@103 -- # for i in {1..20} 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:15:20.596 rmmod nvme_rdma 00:15:20.596 rmmod nvme_fabrics 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@106 -- # set -e 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@107 -- # return 0 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@332 -- # '[' -n 1791302 ']' 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@333 -- # killprocess 1791302 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@950 -- # '[' -z 1791302 ']' 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@954 -- # kill -0 1791302 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@955 -- # uname 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1791302 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:20.596 15:20:22 
nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1791302' 00:15:20.596 killing process with pid 1791302 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@969 -- # kill 1791302 00:15:20.596 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@974 -- # wait 1791302 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@338 -- # nvmf_fini 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@264 -- # local dev 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@267 -- # remove_target_ns 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@22 -- # _remove_target_ns 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@268 -- # delete_main_bridge 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@130 -- # return 0 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@283 -- # 
reset_setup_interfaces 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@41 -- # _dev=0 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@41 -- # dev_map=() 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/setup.sh@284 -- # iptr 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@538 -- # iptables-save 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- nvmf/common.sh@538 -- # iptables-restore 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- target/auth.sh@23 -- # rm -f /tmp/spdk.key-null.NVj /tmp/spdk.key-sha256.1MK /tmp/spdk.key-sha384.qrk /tmp/spdk.key-sha512.dEA /tmp/spdk.key-sha512.Dgs /tmp/spdk.key-sha384.yGT /tmp/spdk.key-sha256.GOT '' /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/nvme-auth.log /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/nvmf-auth.log 00:15:20.855 00:15:20.855 real 2m50.470s 00:15:20.855 user 6m30.936s 00:15:20.855 sys 0m25.859s 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:20.855 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_auth_target -- common/autotest_common.sh@10 -- # set +x 00:15:20.855 ************************************ 00:15:20.855 END TEST nvmf_auth_target 00:15:20.855 ************************************ 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@39 -- # '[' rdma = tcp ']' 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@47 -- # '[' 0 -eq 1 ']' 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@53 -- # [[ phy == phy ]] 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@54 -- # '[' rdma = tcp ']' 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@60 -- # [[ rdma == \r\d\m\a ]] 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@63 -- # run_test nvmf_srq_overwhelm /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/srq_overwhelm.sh --transport=rdma 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:21.114 15:20:22 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:15:21.114 ************************************ 00:15:21.114 START TEST nvmf_srq_overwhelm 00:15:21.114 ************************************ 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/srq_overwhelm.sh --transport=rdma 00:15:21.115 * Looking for test storage... 
00:15:21.115 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1681 -- # lcov --version 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@333 -- # local ver1 ver1_l 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@334 -- # local ver2 ver2_l 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@336 -- # IFS=.-: 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@336 -- # read -ra ver1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@337 -- # IFS=.-: 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@337 -- # read -ra ver2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@338 -- # local 'op=<' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@340 -- # ver1_l=2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@341 -- # ver2_l=1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@344 -- # case "$op" in 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@345 -- # : 1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@364 -- # (( v = 0 )) 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@365 -- # decimal 1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@353 -- # local d=1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@355 -- # echo 1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@365 -- # ver1[v]=1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@366 -- # decimal 2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@353 -- # local d=2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@355 -- # echo 2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@366 -- # ver2[v]=2 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@368 -- # return 0 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:15:21.115 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:21.115 --rc genhtml_branch_coverage=1 00:15:21.115 --rc genhtml_function_coverage=1 00:15:21.115 --rc genhtml_legend=1 00:15:21.115 --rc geninfo_all_blocks=1 00:15:21.115 --rc geninfo_unexecuted_blocks=1 00:15:21.115 00:15:21.115 ' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:15:21.115 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:21.115 --rc genhtml_branch_coverage=1 00:15:21.115 --rc genhtml_function_coverage=1 00:15:21.115 --rc genhtml_legend=1 00:15:21.115 --rc geninfo_all_blocks=1 00:15:21.115 --rc geninfo_unexecuted_blocks=1 00:15:21.115 00:15:21.115 ' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:15:21.115 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:21.115 --rc genhtml_branch_coverage=1 00:15:21.115 --rc genhtml_function_coverage=1 00:15:21.115 --rc genhtml_legend=1 00:15:21.115 --rc geninfo_all_blocks=1 00:15:21.115 --rc geninfo_unexecuted_blocks=1 00:15:21.115 00:15:21.115 ' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:15:21.115 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:21.115 --rc genhtml_branch_coverage=1 00:15:21.115 --rc genhtml_function_coverage=1 00:15:21.115 --rc genhtml_legend=1 00:15:21.115 --rc geninfo_all_blocks=1 00:15:21.115 --rc geninfo_unexecuted_blocks=1 00:15:21.115 00:15:21.115 ' 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@9 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@7 -- # uname -s 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:15:21.115 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@15 -- # shopt -s extglob 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- paths/export.sh@5 -- # export PATH 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@50 -- # : 0 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:15:21.375 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression 
expected 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@54 -- # have_pci_nics=0 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@11 -- # MALLOC_BDEV_SIZE=64 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@13 -- # NVME_CONNECT='nvme connect -i 16' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@15 -- # nvmftestinit 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@292 -- # prepare_net_devs 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@254 -- # local -g is_hw=no 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@256 -- # remove_target_ns 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@22 -- # _remove_target_ns 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@125 -- # xtrace_disable 00:15:21.375 15:20:22 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@131 -- # pci_devs=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@131 -- # local -a pci_devs 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@132 -- # pci_net_devs=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@133 -- # pci_drivers=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@133 -- # local -A pci_drivers 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@135 -- # net_devs=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@135 -- # local -ga net_devs 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@136 -- # e810=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@136 -- # local 
-ga e810 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@137 -- # x722=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@137 -- # local -ga x722 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@138 -- # mlx=() 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@138 -- # local -ga mlx 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:15:28.064 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- 
nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:15:28.064 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:15:28.064 Found net devices under 0000:18:00.0: mlx_0_0 00:15:28.064 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:15:28.065 Found net devices under 0000:18:00.1: mlx_0_1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:15:28.065 15:20:29 
nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@249 -- # get_rdma_if_list 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@75 -- # rdma_devs=() 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@89 -- # continue 2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@89 -- # continue 2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@258 -- # is_hw=yes 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:15:28.065 15:20:29 
nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@61 -- # uname 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@65 -- # modprobe ib_cm 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@66 -- # modprobe ib_core 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@67 -- # modprobe ib_umad 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@69 -- # modprobe iw_cm 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@27 -- # local -gA dev_map 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@28 -- # local -g _dev 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@44 -- # ips=() 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@58 -- # key_initiator=target1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 
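At this point the harness has loaded the IB/RDMA kernel modules and, because this is a phy run, paired the two physical mlx5 ports directly: mlx_0_0 is used as the initiator-side device and mlx_0_1 as the target-side device, with addresses drawn from the 0x0a000001 (10.0.0.1) pool. A minimal sketch of the same bring-up done by hand, assuming the mlx_0_0/mlx_0_1 netdev names seen in this run (the set_ip/set_up steps that follow in the log perform the equivalent):

    # Load the RDMA/IB stack (the same modules the harness probes above)
    for m in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm; do
        modprobe "$m"
    done

    # Assign the 10.0.0.x pool addresses and bring both ports up
    ip addr add 10.0.0.1/24 dev mlx_0_0 && ip link set mlx_0_0 up
    ip addr add 10.0.0.2/24 dev mlx_0_1 && ip link set mlx_0_1 up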
00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@11 -- # local val=167772161 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:15:28.065 10.0.0.1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@11 -- # local val=167772162 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:15:28.065 10.0.0.2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@215 
-- # [[ -n '' ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@38 -- # ping_ips 1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # get_net_dev target0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@107 -- # local dev=target0 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:15:28.065 15:20:29 
nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:15:28.065 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:15:28.065 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:15:28.065 00:15:28.065 --- 10.0.0.2 ping statistics --- 00:15:28.065 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:15:28.065 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:15:28.065 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # get_net_dev target0 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@107 -- # local dev=target0 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:15:28.066 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:15:28.326 15:20:29 
nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:15:28.326 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:15:28.326 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:15:28.326 00:15:28.326 --- 10.0.0.2 ping statistics --- 00:15:28.326 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:15:28.326 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@98 -- # (( pair++ )) 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@266 -- # return 0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # get_net_dev target0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@107 -- # local dev=target0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@173 -- # [[ -n 
10.0.0.2 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # get_net_dev target1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@107 -- # local dev=target1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # get_net_dev target0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@107 -- # local dev=target0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:15:28.326 15:20:29 
nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # get_net_dev target1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@107 -- # local dev=target1 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:15:28.326 15:20:29 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:15:28.326 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:15:28.327 
15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@17 -- # nvmfappstart -m 0xF 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@724 -- # xtrace_disable 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@324 -- # nvmfpid=1797268 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@325 -- # waitforlisten 1797268 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@831 -- # '[' -z 1797268 ']' 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:15:28.327 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:28.327 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:28.327 [2024-09-27 15:20:30.100544] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:15:28.327 [2024-09-27 15:20:30.100609] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:28.585 [2024-09-27 15:20:30.184948] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:15:28.585 [2024-09-27 15:20:30.276108] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:15:28.585 [2024-09-27 15:20:30.276156] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:15:28.585 [2024-09-27 15:20:30.276166] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:15:28.585 [2024-09-27 15:20:30.276175] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:15:28.585 [2024-09-27 15:20:30.276182] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
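nvmfappstart above launches the target with shared-memory id 0 (-i 0), the full tracepoint group mask (-e 0xFFFF) and a 4-core mask (-m 0xF), then blocks in waitforlisten until the default RPC socket answers. A stand-alone sketch of that start-and-wait step, assuming an SPDK checkout in $SPDK_DIR and the default /var/tmp/spdk.sock socket:

    "$SPDK_DIR"/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
    nvmfpid=$!

    # Poll the RPC socket until the target answers; waitforlisten does the same
    # with a bounded retry loop (rpc_get_methods is a cheap query)
    until "$SPDK_DIR"/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
        sleep 0.5
    done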
00:15:28.585 [2024-09-27 15:20:30.276254] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:15:28.585 [2024-09-27 15:20:30.276385] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:15:28.585 [2024-09-27 15:20:30.276880] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:15:28.585 [2024-09-27 15:20:30.276880] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:15:29.151 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:29.151 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@864 -- # return 0 00:15:29.151 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:15:29.151 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@730 -- # xtrace_disable 00:15:29.151 15:20:30 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@20 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -s 1024 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 [2024-09-27 15:20:31.034238] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1cc24a0/0x1cc6990) succeed. 00:15:29.410 [2024-09-27 15:20:31.044849] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1cc3ae0/0x1d08030) succeed. 
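With the target up and the RDMA transport created (-t rdma --num-shared-buffers 1024 -u 8192 -s 1024, the last option bounding the shared receive queue depth this test is designed to overwhelm), the loop that follows builds six small subsystems and connects to each from the host side: create the subsystem, back it with a 64 MiB malloc bdev, add the namespace, add an RDMA listener on 10.0.0.2:4420, then nvme connect. A condensed sketch of that loop using scripts/rpc.py directly, reusing the hostnqn/hostid values from this run:

    rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 -s 1024
    for i in $(seq 0 5); do
        rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode$i -a -s SPDK0000000000000$i
        rpc.py bdev_malloc_create 64 512 -b Malloc$i          # 64 MiB bdev, 512 B blocks
        rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode$i Malloc$i
        rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode$i -t rdma -a 10.0.0.2 -s 4420
        nvme connect -i 15 -t rdma -a 10.0.0.2 -s 4420 -n nqn.2016-06.io.spdk:cnode$i \
            --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e \
            --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e
    done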
00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # seq 0 5 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # for i in $(seq 0 5) 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a -s SPDK00000000000000 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 Malloc0 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 Malloc0 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:29.410 [2024-09-27 15:20:31.142974] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:29.410 15:20:31 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@27 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode0 -a 10.0.0.2 -s 4420 00:15:30.345 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@28 -- # waitforblk nvme0n1 00:15:30.345 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1235 -- # local i=0 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # grep -q -w nvme0n1 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # lsblk -l -o 
NAME 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # grep -q -w nvme0n1 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1246 -- # return 0 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # for i in $(seq 0 5) 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc1 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:30.346 Malloc1 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:30.346 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:30.603 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:30.603 15:20:32 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@27 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode1 -a 10.0.0.2 -s 4420 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@28 -- # waitforblk nvme1n1 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1235 -- # local i=0 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # grep -q -w nvme1n1 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # grep -q 
-w nvme1n1 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1246 -- # return 0 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # for i in $(seq 0 5) 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode2 -a -s SPDK00000000000002 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc2 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:31.538 Malloc2 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode2 Malloc2 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode2 -t rdma -a 10.0.0.2 -s 4420 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:31.538 15:20:33 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@27 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode2 -a 10.0.0.2 -s 4420 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@28 -- # waitforblk nvme2n1 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1235 -- # local i=0 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # grep -q -w nvme2n1 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # grep -q -w nvme2n1 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1246 -- # 
return 0 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # for i in $(seq 0 5) 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode3 -a -s SPDK00000000000003 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc3 00:15:32.474 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:32.475 Malloc3 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode3 Malloc3 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode3 -t rdma -a 10.0.0.2 -s 4420 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:32.475 15:20:34 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@27 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode3 -a 10.0.0.2 -s 4420 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@28 -- # waitforblk nvme3n1 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1235 -- # local i=0 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # grep -q -w nvme3n1 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # grep -q -w nvme3n1 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1246 -- # return 0 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # for i in 
$(seq 0 5) 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@23 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode4 -a -s SPDK00000000000004 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc4 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:33.853 Malloc4 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode4 Malloc4 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode4 -t rdma -a 10.0.0.2 -s 4420 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:33.853 15:20:35 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@27 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode4 -a 10.0.0.2 -s 4420 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@28 -- # waitforblk nvme4n1 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1235 -- # local i=0 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # grep -q -w nvme4n1 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # grep -q -w nvme4n1 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1246 -- # return 0 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@22 -- # for i in $(seq 0 5) 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@23 -- # rpc_cmd 
nvmf_create_subsystem nqn.2016-06.io.spdk:cnode5 -a -s SPDK00000000000005 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@24 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc5 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:34.789 Malloc5 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@25 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode5 Malloc5 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@26 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode5 -t rdma -a 10.0.0.2 -s 4420 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:34.789 15:20:36 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@27 -- # nvme connect -i 15 --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -t rdma -n nqn.2016-06.io.spdk:cnode5 -a 10.0.0.2 -s 4420 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@28 -- # waitforblk nvme5n1 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1235 -- # local i=0 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # lsblk -l -o NAME 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1236 -- # grep -q -w nvme5n1 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # lsblk -l -o NAME 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1242 -- # grep -q -w nvme5n1 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1246 -- # return 0 00:15:35.726 15:20:37 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/fio-wrapper -p nvmf -i 1048576 -d 128 -t read -r 10 -n 13 00:15:35.726 [global] 00:15:35.726 thread=1 00:15:35.726 invalidate=1 00:15:35.726 rw=read 00:15:35.726 time_based=1 00:15:35.726 runtime=10 
00:15:35.726 ioengine=libaio 00:15:35.726 direct=1 00:15:35.726 bs=1048576 00:15:35.726 iodepth=128 00:15:35.726 norandommap=1 00:15:35.726 numjobs=13 00:15:35.726 00:15:35.726 [job0] 00:15:35.726 filename=/dev/nvme0n1 00:15:35.726 [job1] 00:15:35.726 filename=/dev/nvme1n1 00:15:35.726 [job2] 00:15:35.726 filename=/dev/nvme2n1 00:15:35.726 [job3] 00:15:35.726 filename=/dev/nvme3n1 00:15:35.726 [job4] 00:15:35.726 filename=/dev/nvme4n1 00:15:35.726 [job5] 00:15:35.726 filename=/dev/nvme5n1 00:15:35.984 Could not set queue depth (nvme0n1) 00:15:35.984 Could not set queue depth (nvme1n1) 00:15:35.984 Could not set queue depth (nvme2n1) 00:15:35.984 Could not set queue depth (nvme3n1) 00:15:35.984 Could not set queue depth (nvme4n1) 00:15:35.984 Could not set queue depth (nvme5n1) 00:15:35.984 job0: (g=0): rw=read, bs=(R) 1024KiB-1024KiB, (W) 1024KiB-1024KiB, (T) 1024KiB-1024KiB, ioengine=libaio, iodepth=128 00:15:35.984 ... 00:15:35.984 job1: (g=0): rw=read, bs=(R) 1024KiB-1024KiB, (W) 1024KiB-1024KiB, (T) 1024KiB-1024KiB, ioengine=libaio, iodepth=128 00:15:35.984 ... 00:15:35.984 job2: (g=0): rw=read, bs=(R) 1024KiB-1024KiB, (W) 1024KiB-1024KiB, (T) 1024KiB-1024KiB, ioengine=libaio, iodepth=128 00:15:35.984 ... 00:15:35.984 job3: (g=0): rw=read, bs=(R) 1024KiB-1024KiB, (W) 1024KiB-1024KiB, (T) 1024KiB-1024KiB, ioengine=libaio, iodepth=128 00:15:35.984 ... 00:15:35.984 job4: (g=0): rw=read, bs=(R) 1024KiB-1024KiB, (W) 1024KiB-1024KiB, (T) 1024KiB-1024KiB, ioengine=libaio, iodepth=128 00:15:35.984 ... 00:15:35.984 job5: (g=0): rw=read, bs=(R) 1024KiB-1024KiB, (W) 1024KiB-1024KiB, (T) 1024KiB-1024KiB, ioengine=libaio, iodepth=128 00:15:35.984 ... 00:15:35.984 fio-3.35 00:15:35.984 Starting 78 threads 00:15:50.863 00:15:50.863 job0: (groupid=0, jobs=1): err= 0: pid=1798422: Fri Sep 27 15:20:51 2024 00:15:50.863 read: IOPS=63, BW=63.2MiB/s (66.3MB/s)(811MiB/12823msec) 00:15:50.863 slat (usec): min=46, max=6377.9k, avg=13173.51, stdev=233536.92 00:15:50.863 clat (msec): min=242, max=12676, avg=1738.56, stdev=3204.52 00:15:50.863 lat (msec): min=244, max=12677, avg=1751.74, stdev=3220.93 00:15:50.863 clat percentiles (msec): 00:15:50.863 | 1.00th=[ 245], 5.00th=[ 247], 10.00th=[ 249], 20.00th=[ 264], 00:15:50.863 | 30.00th=[ 268], 40.00th=[ 275], 50.00th=[ 347], 60.00th=[ 363], 00:15:50.863 | 70.00th=[ 397], 80.00th=[ 405], 90.00th=[ 8792], 95.00th=[ 8926], 00:15:50.863 | 99.00th=[12684], 99.50th=[12684], 99.90th=[12684], 99.95th=[12684], 00:15:50.863 | 99.99th=[12684] 00:15:50.863 bw ( KiB/s): min= 2048, max=456704, per=9.20%, avg=233472.00, stdev=169695.06, samples=6 00:15:50.863 iops : min= 2, max= 446, avg=228.00, stdev=165.72, samples=6 00:15:50.863 lat (msec) : 250=10.48%, 500=71.15%, >=2000=18.37% 00:15:50.863 cpu : usr=0.03%, sys=1.22%, ctx=681, majf=0, minf=32769 00:15:50.863 IO depths : 1=0.1%, 2=0.2%, 4=0.5%, 8=1.0%, 16=2.0%, 32=3.9%, >=64=92.2% 00:15:50.863 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.863 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.863 issued rwts: total=811,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.863 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.863 job0: (groupid=0, jobs=1): err= 0: pid=1798423: Fri Sep 27 15:20:51 2024 00:15:50.863 read: IOPS=0, BW=320KiB/s (328kB/s)(4096KiB/12797msec) 00:15:50.863 slat (usec): min=1036, max=10550k, avg=2668338.11, stdev=5254525.94 00:15:50.863 clat (msec): min=2122, max=12795, avg=10066.72, stdev=5296.29 00:15:50.863 lat (msec): 
min=12672, max=12796, avg=12735.06, stdev=69.91 00:15:50.863 clat percentiles (msec): 00:15:50.863 | 1.00th=[ 2123], 5.00th=[ 2123], 10.00th=[ 2123], 20.00th=[ 2123], 00:15:50.863 | 30.00th=[12684], 40.00th=[12684], 50.00th=[12684], 60.00th=[12684], 00:15:50.863 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.863 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.863 | 99.99th=[12818] 00:15:50.863 lat (msec) : >=2000=100.00% 00:15:50.863 cpu : usr=0.01%, sys=0.02%, ctx=8, majf=0, minf=1025 00:15:50.863 IO depths : 1=25.0%, 2=50.0%, 4=25.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:15:50.863 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.863 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.863 issued rwts: total=4,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.863 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.863 job0: (groupid=0, jobs=1): err= 0: pid=1798424: Fri Sep 27 15:20:51 2024 00:15:50.863 read: IOPS=1, BW=1434KiB/s (1469kB/s)(18.0MiB/12851msec) 00:15:50.863 slat (usec): min=1045, max=2165.5k, avg=596086.29, stdev=956830.76 00:15:50.863 clat (msec): min=2120, max=12828, avg=10293.69, stdev=3442.47 00:15:50.863 lat (msec): min=4220, max=12850, avg=10889.77, stdev=2815.96 00:15:50.863 clat percentiles (msec): 00:15:50.863 | 1.00th=[ 2123], 5.00th=[ 2123], 10.00th=[ 4212], 20.00th=[ 6409], 00:15:50.863 | 30.00th=[ 8557], 40.00th=[10805], 50.00th=[12684], 60.00th=[12684], 00:15:50.863 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.863 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.863 | 99.99th=[12818] 00:15:50.863 lat (msec) : >=2000=100.00% 00:15:50.863 cpu : usr=0.00%, sys=0.14%, ctx=44, majf=0, minf=4609 00:15:50.863 IO depths : 1=5.6%, 2=11.1%, 4=22.2%, 8=44.4%, 16=16.7%, 32=0.0%, >=64=0.0% 00:15:50.863 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.863 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.863 issued rwts: total=18,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.863 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.863 job0: (groupid=0, jobs=1): err= 0: pid=1798425: Fri Sep 27 15:20:51 2024 00:15:50.863 read: IOPS=3, BW=3178KiB/s (3254kB/s)(40.0MiB/12890msec) 00:15:50.863 slat (usec): min=718, max=2147.3k, avg=269141.95, stdev=695401.63 00:15:50.863 clat (msec): min=2124, max=12889, avg=11339.91, stdev=2964.93 00:15:50.863 lat (msec): min=4220, max=12889, avg=11609.05, stdev=2569.12 00:15:50.863 clat percentiles (msec): 00:15:50.863 | 1.00th=[ 2123], 5.00th=[ 4212], 10.00th=[ 6409], 20.00th=[ 8557], 00:15:50.863 | 30.00th=[12684], 40.00th=[12818], 50.00th=[12818], 60.00th=[12818], 00:15:50.863 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12953], 00:15:50.863 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.863 | 99.99th=[12953] 00:15:50.863 lat (msec) : >=2000=100.00% 00:15:50.863 cpu : usr=0.00%, sys=0.32%, ctx=63, majf=0, minf=10241 00:15:50.863 IO depths : 1=2.5%, 2=5.0%, 4=10.0%, 8=20.0%, 16=40.0%, 32=22.5%, >=64=0.0% 00:15:50.863 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.863 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.863 issued rwts: total=40,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.863 latency : target=0, window=0, percentile=100.00%, depth=128 
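[editor's note] For readability, the per-subsystem setup sequence that the srq_overwhelm trace above repeats for cnode0 through cnode5 can be summarized with the following bash sketch. It is a reconstruction from the xtrace output only: rpc_cmd is assumed to wrap SPDK's scripts/rpc.py, the host NQN, target address 10.0.0.2 and port 4420 are the values visible in the log, and the polling loop is an approximation of the test framework's waitforblk helper (the real helper bounds its iterations). Names such as $rpc and $hostnqn are illustrative, not taken from the script.

```bash
#!/usr/bin/env bash
# Sketch of the loop traced above in target/srq_overwhelm.sh (reconstruction,
# not the original source). Assumes SPDK's rpc.py and nvme-cli are available.
rpc=./scripts/rpc.py
hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e
hostid=${hostnqn#*uuid:}   # same UUID without the NQN prefix, as in the log

for i in $(seq 0 5); do
  # Create an NVMe-oF subsystem and back it with a 64 MiB malloc bdev
  # (512-byte blocks), then attach the bdev as a namespace.
  "$rpc" nvmf_create_subsystem "nqn.2016-06.io.spdk:cnode$i" -a -s "SPDK0000000000000$i"
  "$rpc" bdev_malloc_create 64 512 -b "Malloc$i"
  "$rpc" nvmf_subsystem_add_ns "nqn.2016-06.io.spdk:cnode$i" "Malloc$i"

  # Expose the subsystem over RDMA on the address/port used in this run.
  "$rpc" nvmf_subsystem_add_listener "nqn.2016-06.io.spdk:cnode$i" \
         -t rdma -a 10.0.0.2 -s 4420

  # Connect from the initiator and wait for the block device to show up,
  # mirroring the lsblk/grep polling seen in the trace.
  nvme connect -i 15 --hostnqn="$hostnqn" --hostid="$hostid" \
       -t rdma -n "nqn.2016-06.io.spdk:cnode$i" -a 10.0.0.2 -s 4420
  until lsblk -l -o NAME | grep -q -w "nvme${i}n1"; do sleep 1; done
done
```

After the six connects, the trace shows fio-wrapper launching the read workload (1 MiB blocks, iodepth 128, 13 jobs per device) whose per-job statistics follow.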
00:15:50.863 job0: (groupid=0, jobs=1): err= 0: pid=1798426: Fri Sep 27 15:20:51 2024 00:15:50.863 read: IOPS=47, BW=47.4MiB/s (49.7MB/s)(613MiB/12926msec) 00:15:50.863 slat (usec): min=57, max=2140.2k, avg=17596.39, stdev=150456.15 00:15:50.863 clat (msec): min=408, max=8396, avg=2259.93, stdev=2928.93 00:15:50.863 lat (msec): min=411, max=8417, avg=2277.53, stdev=2937.06 00:15:50.863 clat percentiles (msec): 00:15:50.863 | 1.00th=[ 414], 5.00th=[ 426], 10.00th=[ 443], 20.00th=[ 506], 00:15:50.863 | 30.00th=[ 609], 40.00th=[ 667], 50.00th=[ 726], 60.00th=[ 810], 00:15:50.863 | 70.00th=[ 986], 80.00th=[ 7483], 90.00th=[ 7953], 95.00th=[ 8221], 00:15:50.863 | 99.00th=[ 8356], 99.50th=[ 8423], 99.90th=[ 8423], 99.95th=[ 8423], 00:15:50.863 | 99.99th=[ 8423] 00:15:50.863 bw ( KiB/s): min= 1501, max=284672, per=4.36%, avg=110543.44, stdev=102822.76, samples=9 00:15:50.863 iops : min= 1, max= 278, avg=107.89, stdev=100.48, samples=9 00:15:50.863 lat (msec) : 500=18.76%, 750=34.58%, 1000=17.78%, 2000=3.43%, >=2000=25.45% 00:15:50.863 cpu : usr=0.02%, sys=1.01%, ctx=686, majf=0, minf=32769 00:15:50.863 IO depths : 1=0.2%, 2=0.3%, 4=0.7%, 8=1.3%, 16=2.6%, 32=5.2%, >=64=89.7% 00:15:50.863 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.863 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.863 issued rwts: total=613,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798427: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=97, BW=97.3MiB/s (102MB/s)(1250MiB/12852msec) 00:15:50.864 slat (usec): min=45, max=2128.7k, avg=8581.07, stdev=101788.24 00:15:50.864 clat (msec): min=108, max=10718, avg=843.21, stdev=1477.32 00:15:50.864 lat (msec): min=109, max=12667, avg=851.79, stdev=1503.24 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 109], 5.00th=[ 110], 10.00th=[ 110], 20.00th=[ 111], 00:15:50.864 | 30.00th=[ 144], 40.00th=[ 226], 50.00th=[ 230], 60.00th=[ 347], 00:15:50.864 | 70.00th=[ 609], 80.00th=[ 852], 90.00th=[ 4111], 95.00th=[ 5067], 00:15:50.864 | 99.00th=[ 5134], 99.50th=[ 5470], 99.90th=[10671], 99.95th=[10671], 00:15:50.864 | 99.99th=[10671] 00:15:50.864 bw ( KiB/s): min= 2048, max=692224, per=11.32%, avg=287440.38, stdev=279114.17, samples=8 00:15:50.864 iops : min= 2, max= 676, avg=280.62, stdev=272.60, samples=8 00:15:50.864 lat (msec) : 250=52.72%, 500=12.00%, 750=13.52%, 1000=7.12%, 2000=3.76% 00:15:50.864 lat (msec) : >=2000=10.88% 00:15:50.864 cpu : usr=0.04%, sys=1.23%, ctx=1294, majf=0, minf=32769 00:15:50.864 IO depths : 1=0.1%, 2=0.2%, 4=0.3%, 8=0.6%, 16=1.3%, 32=2.6%, >=64=95.0% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.864 issued rwts: total=1250,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798428: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=1, BW=1518KiB/s (1554kB/s)(19.0MiB/12819msec) 00:15:50.864 slat (usec): min=722, max=10542k, avg=562546.34, stdev=2416786.04 00:15:50.864 clat (msec): min=2130, max=12817, avg=12218.18, stdev=2443.48 00:15:50.864 lat (msec): min=12672, max=12818, avg=12780.72, stdev=54.17 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 2123], 5.00th=[ 2123], 10.00th=[12684], 20.00th=[12684], 
00:15:50.864 | 30.00th=[12818], 40.00th=[12818], 50.00th=[12818], 60.00th=[12818], 00:15:50.864 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.864 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.864 | 99.99th=[12818] 00:15:50.864 lat (msec) : >=2000=100.00% 00:15:50.864 cpu : usr=0.01%, sys=0.12%, ctx=19, majf=0, minf=4865 00:15:50.864 IO depths : 1=5.3%, 2=10.5%, 4=21.1%, 8=42.1%, 16=21.1%, 32=0.0%, >=64=0.0% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.864 issued rwts: total=19,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798429: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=4, BW=4188KiB/s (4288kB/s)(53.0MiB/12960msec) 00:15:50.864 slat (usec): min=708, max=4281.8k, avg=204326.63, stdev=747374.78 00:15:50.864 clat (msec): min=2129, max=12956, avg=11967.99, stdev=2094.95 00:15:50.864 lat (msec): min=6411, max=12959, avg=12172.31, stdev=1582.32 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 2123], 5.00th=[ 8557], 10.00th=[ 8658], 20.00th=[10805], 00:15:50.864 | 30.00th=[12818], 40.00th=[12818], 50.00th=[12953], 60.00th=[12953], 00:15:50.864 | 70.00th=[12953], 80.00th=[12953], 90.00th=[12953], 95.00th=[12953], 00:15:50.864 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.864 | 99.99th=[12953] 00:15:50.864 lat (msec) : >=2000=100.00% 00:15:50.864 cpu : usr=0.00%, sys=0.43%, ctx=55, majf=0, minf=13569 00:15:50.864 IO depths : 1=1.9%, 2=3.8%, 4=7.5%, 8=15.1%, 16=30.2%, 32=41.5%, >=64=0.0% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.864 issued rwts: total=53,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798430: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=67, BW=68.0MiB/s (71.3MB/s)(872MiB/12828msec) 00:15:50.864 slat (usec): min=45, max=2223.0k, avg=12252.26, stdev=146144.25 00:15:50.864 clat (msec): min=236, max=11060, avg=1815.95, stdev=3742.97 00:15:50.864 lat (msec): min=238, max=11062, avg=1828.20, stdev=3755.21 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 241], 5.00th=[ 245], 10.00th=[ 249], 20.00th=[ 253], 00:15:50.864 | 30.00th=[ 253], 40.00th=[ 255], 50.00th=[ 255], 60.00th=[ 257], 00:15:50.864 | 70.00th=[ 262], 80.00th=[ 292], 90.00th=[10939], 95.00th=[10939], 00:15:50.864 | 99.00th=[11073], 99.50th=[11073], 99.90th=[11073], 99.95th=[11073], 00:15:50.864 | 99.99th=[11073] 00:15:50.864 bw ( KiB/s): min= 2043, max=522240, per=7.51%, avg=190719.38, stdev=260641.78, samples=8 00:15:50.864 iops : min= 1, max= 510, avg=186.12, stdev=254.64, samples=8 00:15:50.864 lat (msec) : 250=11.12%, 500=73.85%, >=2000=15.02% 00:15:50.864 cpu : usr=0.01%, sys=1.02%, ctx=834, majf=0, minf=32769 00:15:50.864 IO depths : 1=0.1%, 2=0.2%, 4=0.5%, 8=0.9%, 16=1.8%, 32=3.7%, >=64=92.8% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.864 issued rwts: total=872,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, 
percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798431: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=140, BW=140MiB/s (147MB/s)(1810MiB/12885msec) 00:15:50.864 slat (usec): min=38, max=2100.3k, avg=5917.61, stdev=93200.39 00:15:50.864 clat (msec): min=110, max=8621, avg=777.35, stdev=1680.26 00:15:50.864 lat (msec): min=111, max=10661, avg=783.26, stdev=1695.98 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 112], 5.00th=[ 112], 10.00th=[ 113], 20.00th=[ 114], 00:15:50.864 | 30.00th=[ 115], 40.00th=[ 118], 50.00th=[ 122], 60.00th=[ 157], 00:15:50.864 | 70.00th=[ 205], 80.00th=[ 239], 90.00th=[ 4463], 95.00th=[ 4530], 00:15:50.864 | 99.00th=[ 6342], 99.50th=[ 6477], 99.90th=[ 8658], 99.95th=[ 8658], 00:15:50.864 | 99.99th=[ 8658] 00:15:50.864 bw ( KiB/s): min= 1563, max=1058816, per=19.40%, avg=492314.29, stdev=472126.90, samples=7 00:15:50.864 iops : min= 1, max= 1034, avg=480.43, stdev=461.47, samples=7 00:15:50.864 lat (msec) : 250=86.85%, 500=0.06%, >=2000=13.09% 00:15:50.864 cpu : usr=0.03%, sys=1.57%, ctx=1730, majf=0, minf=32769 00:15:50.864 IO depths : 1=0.1%, 2=0.1%, 4=0.2%, 8=0.4%, 16=0.9%, 32=1.8%, >=64=96.5% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.864 issued rwts: total=1810,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798432: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=1, BW=1992KiB/s (2040kB/s)(25.0MiB/12853msec) 00:15:50.864 slat (usec): min=763, max=8561.1k, avg=428753.76, stdev=1738377.98 00:15:50.864 clat (msec): min=2133, max=12851, avg=11965.69, stdev=2217.92 00:15:50.864 lat (msec): min=10694, max=12852, avg=12394.44, stdev=855.96 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 2140], 5.00th=[10671], 10.00th=[10671], 20.00th=[10671], 00:15:50.864 | 30.00th=[12684], 40.00th=[12684], 50.00th=[12818], 60.00th=[12818], 00:15:50.864 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.864 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.864 | 99.99th=[12818] 00:15:50.864 lat (msec) : >=2000=100.00% 00:15:50.864 cpu : usr=0.00%, sys=0.18%, ctx=26, majf=0, minf=6401 00:15:50.864 IO depths : 1=4.0%, 2=8.0%, 4=16.0%, 8=32.0%, 16=40.0%, 32=0.0%, >=64=0.0% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.864 issued rwts: total=25,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798433: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=20, BW=20.8MiB/s (21.8MB/s)(268MiB/12862msec) 00:15:50.864 slat (usec): min=59, max=4271.9k, avg=40027.65, stdev=328245.31 00:15:50.864 clat (msec): min=249, max=12387, avg=5963.41, stdev=5899.59 00:15:50.864 lat (msec): min=251, max=12387, avg=6003.44, stdev=5906.24 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 251], 5.00th=[ 255], 10.00th=[ 255], 20.00th=[ 259], 00:15:50.864 | 30.00th=[ 268], 40.00th=[ 330], 50.00th=[ 447], 60.00th=[12147], 00:15:50.864 | 70.00th=[12281], 80.00th=[12281], 90.00th=[12281], 95.00th=[12416], 00:15:50.864 | 99.00th=[12416], 99.50th=[12416], 99.90th=[12416], 99.95th=[12416], 00:15:50.864 
| 99.99th=[12416] 00:15:50.864 bw ( KiB/s): min= 2043, max=219575, per=1.90%, avg=48200.33, stdev=86722.32, samples=6 00:15:50.864 iops : min= 1, max= 214, avg=46.83, stdev=84.63, samples=6 00:15:50.864 lat (msec) : 250=0.37%, 500=50.00%, 2000=0.75%, >=2000=48.88% 00:15:50.864 cpu : usr=0.02%, sys=0.97%, ctx=202, majf=0, minf=32769 00:15:50.864 IO depths : 1=0.4%, 2=0.7%, 4=1.5%, 8=3.0%, 16=6.0%, 32=11.9%, >=64=76.5% 00:15:50.864 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.864 complete : 0=0.0%, 4=99.3%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.7% 00:15:50.864 issued rwts: total=268,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.864 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.864 job0: (groupid=0, jobs=1): err= 0: pid=1798434: Fri Sep 27 15:20:51 2024 00:15:50.864 read: IOPS=2, BW=2790KiB/s (2857kB/s)(35.0MiB/12847msec) 00:15:50.864 slat (usec): min=962, max=6425.9k, avg=305774.18, stdev=1165325.18 00:15:50.864 clat (msec): min=2143, max=12843, avg=12053.34, stdev=2045.58 00:15:50.864 lat (msec): min=8569, max=12846, avg=12359.11, stdev=1103.84 00:15:50.864 clat percentiles (msec): 00:15:50.864 | 1.00th=[ 2140], 5.00th=[ 8557], 10.00th=[10671], 20.00th=[12684], 00:15:50.864 | 30.00th=[12684], 40.00th=[12684], 50.00th=[12684], 60.00th=[12818], 00:15:50.864 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.865 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.865 | 99.99th=[12818] 00:15:50.865 lat (msec) : >=2000=100.00% 00:15:50.865 cpu : usr=0.00%, sys=0.29%, ctx=33, majf=0, minf=8961 00:15:50.865 IO depths : 1=2.9%, 2=5.7%, 4=11.4%, 8=22.9%, 16=45.7%, 32=11.4%, >=64=0.0% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.865 issued rwts: total=35,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798435: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=98, BW=98.7MiB/s (103MB/s)(1275MiB/12921msec) 00:15:50.865 slat (usec): min=38, max=2209.5k, avg=8434.01, stdev=116276.89 00:15:50.865 clat (msec): min=107, max=6656, avg=955.80, stdev=2025.95 00:15:50.865 lat (msec): min=108, max=6657, avg=964.23, stdev=2034.84 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 118], 5.00th=[ 120], 10.00th=[ 120], 20.00th=[ 120], 00:15:50.865 | 30.00th=[ 121], 40.00th=[ 121], 50.00th=[ 122], 60.00th=[ 130], 00:15:50.865 | 70.00th=[ 228], 80.00th=[ 247], 90.00th=[ 5067], 95.00th=[ 6611], 00:15:50.865 | 99.00th=[ 6611], 99.50th=[ 6678], 99.90th=[ 6678], 99.95th=[ 6678], 00:15:50.865 | 99.99th=[ 6678] 00:15:50.865 bw ( KiB/s): min= 1501, max=1064960, per=18.52%, avg=470111.40, stdev=472126.60, samples=5 00:15:50.865 iops : min= 1, max= 1040, avg=459.00, stdev=461.18, samples=5 00:15:50.865 lat (msec) : 250=80.31%, 500=2.82%, 750=2.82%, 1000=0.31%, >=2000=13.73% 00:15:50.865 cpu : usr=0.00%, sys=1.43%, ctx=1521, majf=0, minf=32769 00:15:50.865 IO depths : 1=0.1%, 2=0.2%, 4=0.3%, 8=0.6%, 16=1.3%, 32=2.5%, >=64=95.1% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.865 issued rwts: total=1275,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: 
(groupid=0, jobs=1): err= 0: pid=1798436: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=99, BW=99.1MiB/s (104MB/s)(1061MiB/10703msec) 00:15:50.865 slat (usec): min=45, max=2076.8k, avg=10017.19, stdev=124649.26 00:15:50.865 clat (msec): min=70, max=8588, avg=825.34, stdev=1513.77 00:15:50.865 lat (msec): min=127, max=10554, avg=835.36, stdev=1538.43 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 128], 5.00th=[ 129], 10.00th=[ 129], 20.00th=[ 130], 00:15:50.865 | 30.00th=[ 130], 40.00th=[ 131], 50.00th=[ 153], 60.00th=[ 226], 00:15:50.865 | 70.00th=[ 241], 80.00th=[ 709], 90.00th=[ 4396], 95.00th=[ 4665], 00:15:50.865 | 99.00th=[ 4933], 99.50th=[ 5000], 99.90th=[ 8557], 99.95th=[ 8557], 00:15:50.865 | 99.99th=[ 8557] 00:15:50.865 bw ( KiB/s): min= 1622, max=931840, per=10.76%, avg=273160.43, stdev=330158.79, samples=7 00:15:50.865 iops : min= 1, max= 910, avg=266.57, stdev=322.55, samples=7 00:15:50.865 lat (msec) : 100=0.09%, 250=72.57%, 500=3.39%, 750=5.09%, 1000=4.34% 00:15:50.865 lat (msec) : >=2000=14.51% 00:15:50.865 cpu : usr=0.05%, sys=1.40%, ctx=1243, majf=0, minf=32769 00:15:50.865 IO depths : 1=0.1%, 2=0.2%, 4=0.4%, 8=0.8%, 16=1.5%, 32=3.0%, >=64=94.1% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.865 issued rwts: total=1061,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798437: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=4, BW=4329KiB/s (4433kB/s)(54.0MiB/12773msec) 00:15:50.865 slat (usec): min=862, max=2082.2k, avg=196781.67, stdev=592590.36 00:15:50.865 clat (msec): min=2145, max=12770, avg=9493.29, stdev=3427.19 00:15:50.865 lat (msec): min=4216, max=12772, avg=9690.07, stdev=3300.05 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 2140], 5.00th=[ 4212], 10.00th=[ 4245], 20.00th=[ 6342], 00:15:50.865 | 30.00th=[ 6477], 40.00th=[ 8557], 50.00th=[10671], 60.00th=[12684], 00:15:50.865 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.865 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.865 | 99.99th=[12818] 00:15:50.865 lat (msec) : >=2000=100.00% 00:15:50.865 cpu : usr=0.02%, sys=0.39%, ctx=42, majf=0, minf=13825 00:15:50.865 IO depths : 1=1.9%, 2=3.7%, 4=7.4%, 8=14.8%, 16=29.6%, 32=42.6%, >=64=0.0% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.865 issued rwts: total=54,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798438: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=4, BW=5097KiB/s (5219kB/s)(64.0MiB/12858msec) 00:15:50.865 slat (usec): min=958, max=2107.7k, avg=167221.24, stdev=548202.29 00:15:50.865 clat (msec): min=2155, max=12855, avg=9594.35, stdev=3109.41 00:15:50.865 lat (msec): min=4214, max=12857, avg=9761.57, stdev=2988.43 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 2165], 5.00th=[ 4245], 10.00th=[ 4279], 20.00th=[ 6409], 00:15:50.865 | 30.00th=[ 8557], 40.00th=[ 8658], 50.00th=[10671], 60.00th=[10805], 00:15:50.865 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.865 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 
99.95th=[12818], 00:15:50.865 | 99.99th=[12818] 00:15:50.865 lat (msec) : >=2000=100.00% 00:15:50.865 cpu : usr=0.02%, sys=0.49%, ctx=63, majf=0, minf=16385 00:15:50.865 IO depths : 1=1.6%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.0%, >=64=1.6% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.865 issued rwts: total=64,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798439: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=5, BW=5531KiB/s (5664kB/s)(70.0MiB/12960msec) 00:15:50.865 slat (usec): min=1048, max=4221.1k, avg=154264.66, stdev=644847.04 00:15:50.865 clat (msec): min=2160, max=12955, avg=11557.61, stdev=2436.09 00:15:50.865 lat (msec): min=6381, max=12959, avg=11711.87, stdev=2158.50 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 2165], 5.00th=[ 6409], 10.00th=[ 6477], 20.00th=[10671], 00:15:50.865 | 30.00th=[12684], 40.00th=[12818], 50.00th=[12818], 60.00th=[12953], 00:15:50.865 | 70.00th=[12953], 80.00th=[12953], 90.00th=[12953], 95.00th=[12953], 00:15:50.865 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.865 | 99.99th=[12953] 00:15:50.865 lat (msec) : >=2000=100.00% 00:15:50.865 cpu : usr=0.00%, sys=0.56%, ctx=76, majf=0, minf=17921 00:15:50.865 IO depths : 1=1.4%, 2=2.9%, 4=5.7%, 8=11.4%, 16=22.9%, 32=45.7%, >=64=10.0% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.865 issued rwts: total=70,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798440: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=45, BW=45.8MiB/s (48.0MB/s)(586MiB/12804msec) 00:15:50.865 slat (usec): min=34, max=2138.8k, avg=18201.80, stdev=168236.38 00:15:50.865 clat (msec): min=136, max=6651, avg=1982.79, stdev=2542.37 00:15:50.865 lat (msec): min=137, max=6652, avg=2000.99, stdev=2550.19 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 140], 5.00th=[ 171], 10.00th=[ 192], 20.00th=[ 330], 00:15:50.865 | 30.00th=[ 502], 40.00th=[ 575], 50.00th=[ 600], 60.00th=[ 625], 00:15:50.865 | 70.00th=[ 676], 80.00th=[ 6544], 90.00th=[ 6611], 95.00th=[ 6611], 00:15:50.865 | 99.00th=[ 6611], 99.50th=[ 6678], 99.90th=[ 6678], 99.95th=[ 6678], 00:15:50.865 | 99.99th=[ 6678] 00:15:50.865 bw ( KiB/s): min= 2048, max=411648, per=6.17%, avg=156672.00, stdev=159485.16, samples=6 00:15:50.865 iops : min= 2, max= 402, avg=153.00, stdev=155.75, samples=6 00:15:50.865 lat (msec) : 250=17.06%, 500=12.80%, 750=41.13%, 1000=1.19%, >=2000=27.82% 00:15:50.865 cpu : usr=0.01%, sys=1.23%, ctx=520, majf=0, minf=32769 00:15:50.865 IO depths : 1=0.2%, 2=0.3%, 4=0.7%, 8=1.4%, 16=2.7%, 32=5.5%, >=64=89.2% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.865 issued rwts: total=586,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798441: Fri Sep 27 15:20:51 2024 00:15:50.865 read: IOPS=27, BW=27.3MiB/s (28.6MB/s)(352MiB/12895msec) 00:15:50.865 slat (usec): min=48, 
max=2175.9k, avg=30388.38, stdev=217349.50 00:15:50.865 clat (msec): min=645, max=7364, avg=3326.99, stdev=2725.08 00:15:50.865 lat (msec): min=667, max=7366, avg=3357.38, stdev=2728.36 00:15:50.865 clat percentiles (msec): 00:15:50.865 | 1.00th=[ 667], 5.00th=[ 701], 10.00th=[ 735], 20.00th=[ 810], 00:15:50.865 | 30.00th=[ 894], 40.00th=[ 969], 50.00th=[ 986], 60.00th=[ 4279], 00:15:50.865 | 70.00th=[ 6544], 80.00th=[ 6678], 90.00th=[ 6946], 95.00th=[ 7148], 00:15:50.865 | 99.00th=[ 7349], 99.50th=[ 7349], 99.90th=[ 7349], 99.95th=[ 7349], 00:15:50.865 | 99.99th=[ 7349] 00:15:50.865 bw ( KiB/s): min= 1501, max=194171, per=3.62%, avg=91972.80, stdev=81795.59, samples=5 00:15:50.865 iops : min= 1, max= 189, avg=89.60, stdev=79.81, samples=5 00:15:50.865 lat (msec) : 750=11.36%, 1000=40.06%, >=2000=48.58% 00:15:50.865 cpu : usr=0.02%, sys=1.05%, ctx=447, majf=0, minf=32769 00:15:50.865 IO depths : 1=0.3%, 2=0.6%, 4=1.1%, 8=2.3%, 16=4.5%, 32=9.1%, >=64=82.1% 00:15:50.865 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.865 complete : 0=0.0%, 4=99.6%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.4% 00:15:50.865 issued rwts: total=352,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.865 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.865 job1: (groupid=0, jobs=1): err= 0: pid=1798442: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=3, BW=3283KiB/s (3362kB/s)(41.0MiB/12787msec) 00:15:50.866 slat (usec): min=892, max=2130.4k, avg=259480.93, stdev=673690.15 00:15:50.866 clat (msec): min=2147, max=12772, avg=8355.39, stdev=2612.07 00:15:50.866 lat (msec): min=4220, max=12786, avg=8614.87, stdev=2506.24 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 2140], 5.00th=[ 4245], 10.00th=[ 4245], 20.00th=[ 6409], 00:15:50.866 | 30.00th=[ 6477], 40.00th=[ 8490], 50.00th=[ 8557], 60.00th=[ 8658], 00:15:50.866 | 70.00th=[10671], 80.00th=[10805], 90.00th=[10805], 95.00th=[12684], 00:15:50.866 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.866 | 99.99th=[12818] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.02%, sys=0.27%, ctx=43, majf=0, minf=10497 00:15:50.866 IO depths : 1=2.4%, 2=4.9%, 4=9.8%, 8=19.5%, 16=39.0%, 32=24.4%, >=64=0.0% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.866 issued rwts: total=41,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job1: (groupid=0, jobs=1): err= 0: pid=1798443: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=3, BW=3840KiB/s (3932kB/s)(48.0MiB/12799msec) 00:15:50.866 slat (usec): min=994, max=2060.5k, avg=221846.49, stdev=619649.36 00:15:50.866 clat (msec): min=2149, max=12795, avg=7836.59, stdev=2992.97 00:15:50.866 lat (msec): min=4200, max=12798, avg=8058.44, stdev=2956.92 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 2165], 5.00th=[ 4212], 10.00th=[ 4245], 20.00th=[ 4279], 00:15:50.866 | 30.00th=[ 6409], 40.00th=[ 6409], 50.00th=[ 6477], 60.00th=[ 8557], 00:15:50.866 | 70.00th=[10671], 80.00th=[10805], 90.00th=[12684], 95.00th=[12818], 00:15:50.866 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.866 | 99.99th=[12818] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.00%, sys=0.35%, ctx=56, majf=0, minf=12289 00:15:50.866 IO depths : 1=2.1%, 2=4.2%, 4=8.3%, 8=16.7%, 16=33.3%, 32=35.4%, 
>=64=0.0% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.866 issued rwts: total=48,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job1: (groupid=0, jobs=1): err= 0: pid=1798444: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=7, BW=7431KiB/s (7609kB/s)(94.0MiB/12954msec) 00:15:50.866 slat (usec): min=970, max=2084.9k, avg=114846.29, stdev=459729.32 00:15:50.866 clat (msec): min=2157, max=12951, avg=11298.30, stdev=2819.59 00:15:50.866 lat (msec): min=4238, max=12952, avg=11413.14, stdev=2658.51 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 2165], 5.00th=[ 4279], 10.00th=[ 6409], 20.00th=[ 8557], 00:15:50.866 | 30.00th=[12684], 40.00th=[12684], 50.00th=[12818], 60.00th=[12818], 00:15:50.866 | 70.00th=[12953], 80.00th=[12953], 90.00th=[12953], 95.00th=[12953], 00:15:50.866 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.866 | 99.99th=[12953] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.02%, sys=0.76%, ctx=88, majf=0, minf=24065 00:15:50.866 IO depths : 1=1.1%, 2=2.1%, 4=4.3%, 8=8.5%, 16=17.0%, 32=34.0%, >=64=33.0% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.866 issued rwts: total=94,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job1: (groupid=0, jobs=1): err= 0: pid=1798445: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=48, BW=48.3MiB/s (50.6MB/s)(521MiB/10792msec) 00:15:50.866 slat (usec): min=46, max=2125.8k, avg=20580.95, stdev=178150.20 00:15:50.866 clat (msec): min=66, max=10643, avg=2270.21, stdev=2569.19 00:15:50.866 lat (msec): min=407, max=10657, avg=2290.80, stdev=2573.12 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 405], 5.00th=[ 414], 10.00th=[ 418], 20.00th=[ 456], 00:15:50.866 | 30.00th=[ 542], 40.00th=[ 625], 50.00th=[ 693], 60.00th=[ 768], 00:15:50.866 | 70.00th=[ 2769], 80.00th=[ 6477], 90.00th=[ 6678], 95.00th=[ 6745], 00:15:50.866 | 99.00th=[ 6812], 99.50th=[ 6879], 99.90th=[10671], 99.95th=[10671], 00:15:50.866 | 99.99th=[10671] 00:15:50.866 bw ( KiB/s): min= 2048, max=284672, per=5.29%, avg=134144.00, stdev=112477.91, samples=6 00:15:50.866 iops : min= 2, max= 278, avg=131.00, stdev=109.84, samples=6 00:15:50.866 lat (msec) : 100=0.19%, 500=23.80%, 750=34.55%, 1000=6.33%, >=2000=35.12% 00:15:50.866 cpu : usr=0.03%, sys=1.28%, ctx=491, majf=0, minf=32769 00:15:50.866 IO depths : 1=0.2%, 2=0.4%, 4=0.8%, 8=1.5%, 16=3.1%, 32=6.1%, >=64=87.9% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=99.7%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.3% 00:15:50.866 issued rwts: total=521,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job1: (groupid=0, jobs=1): err= 0: pid=1798446: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=6, BW=7049KiB/s (7218kB/s)(89.0MiB/12929msec) 00:15:50.866 slat (usec): min=965, max=4253.0k, avg=121099.04, stdev=564551.65 00:15:50.866 clat (msec): min=2150, max=12925, avg=10394.72, stdev=2404.84 00:15:50.866 lat (msec): min=6403, max=12928, avg=10515.82, stdev=2251.44 00:15:50.866 clat percentiles (msec): 
00:15:50.866 | 1.00th=[ 2165], 5.00th=[ 8221], 10.00th=[ 8221], 20.00th=[ 8288], 00:15:50.866 | 30.00th=[ 8423], 40.00th=[ 8490], 50.00th=[ 8658], 60.00th=[12684], 00:15:50.866 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12953], 95.00th=[12953], 00:15:50.866 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.866 | 99.99th=[12953] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.00%, sys=0.71%, ctx=130, majf=0, minf=22785 00:15:50.866 IO depths : 1=1.1%, 2=2.2%, 4=4.5%, 8=9.0%, 16=18.0%, 32=36.0%, >=64=29.2% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.866 issued rwts: total=89,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job1: (groupid=0, jobs=1): err= 0: pid=1798447: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=5, BW=5718KiB/s (5856kB/s)(72.0MiB/12893msec) 00:15:50.866 slat (usec): min=581, max=4253.0k, avg=149206.50, stdev=640141.42 00:15:50.866 clat (msec): min=2149, max=12891, avg=12061.68, stdev=1832.23 00:15:50.866 lat (msec): min=6402, max=12892, avg=12210.88, stdev=1400.17 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 2165], 5.00th=[ 8658], 10.00th=[ 8658], 20.00th=[12684], 00:15:50.866 | 30.00th=[12684], 40.00th=[12684], 50.00th=[12684], 60.00th=[12684], 00:15:50.866 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12953], 00:15:50.866 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.866 | 99.99th=[12953] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.01%, sys=0.50%, ctx=41, majf=0, minf=18433 00:15:50.866 IO depths : 1=1.4%, 2=2.8%, 4=5.6%, 8=11.1%, 16=22.2%, 32=44.4%, >=64=12.5% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.866 issued rwts: total=72,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job2: (groupid=0, jobs=1): err= 0: pid=1798448: Fri Sep 27 15:20:51 2024 00:15:50.866 read: IOPS=6, BW=6972KiB/s (7140kB/s)(88.0MiB/12924msec) 00:15:50.866 slat (usec): min=991, max=2082.7k, avg=122592.72, stdev=473553.09 00:15:50.866 clat (msec): min=2135, max=12922, avg=10578.46, stdev=3309.25 00:15:50.866 lat (msec): min=4218, max=12923, avg=10701.06, stdev=3190.58 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 2140], 5.00th=[ 4245], 10.00th=[ 4279], 20.00th=[ 6409], 00:15:50.866 | 30.00th=[10671], 40.00th=[12684], 50.00th=[12818], 60.00th=[12818], 00:15:50.866 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12953], 95.00th=[12953], 00:15:50.866 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.866 | 99.99th=[12953] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.01%, sys=0.73%, ctx=92, majf=0, minf=22529 00:15:50.866 IO depths : 1=1.1%, 2=2.3%, 4=4.5%, 8=9.1%, 16=18.2%, 32=36.4%, >=64=28.4% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.866 issued rwts: total=88,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.866 job2: (groupid=0, jobs=1): err= 0: pid=1798449: Fri Sep 27 15:20:51 2024 
00:15:50.866 read: IOPS=5, BW=5812KiB/s (5951kB/s)(73.0MiB/12862msec) 00:15:50.866 slat (usec): min=872, max=2060.7k, avg=147146.29, stdev=511959.97 00:15:50.866 clat (msec): min=2119, max=12859, avg=10874.63, stdev=3112.03 00:15:50.866 lat (msec): min=4179, max=12861, avg=11021.77, stdev=2941.58 00:15:50.866 clat percentiles (msec): 00:15:50.866 | 1.00th=[ 2123], 5.00th=[ 4212], 10.00th=[ 4279], 20.00th=[ 8490], 00:15:50.866 | 30.00th=[10671], 40.00th=[12684], 50.00th=[12818], 60.00th=[12818], 00:15:50.866 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.866 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.866 | 99.99th=[12818] 00:15:50.866 lat (msec) : >=2000=100.00% 00:15:50.866 cpu : usr=0.01%, sys=0.59%, ctx=108, majf=0, minf=18689 00:15:50.866 IO depths : 1=1.4%, 2=2.7%, 4=5.5%, 8=11.0%, 16=21.9%, 32=43.8%, >=64=13.7% 00:15:50.866 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.866 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.866 issued rwts: total=73,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.866 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798450: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=2, BW=2491KiB/s (2550kB/s)(31.0MiB/12746msec) 00:15:50.867 slat (usec): min=1103, max=2061.6k, avg=342615.93, stdev=759753.97 00:15:50.867 clat (msec): min=2124, max=10698, avg=7421.61, stdev=2584.04 00:15:50.867 lat (msec): min=4179, max=12745, avg=7764.23, stdev=2562.28 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 2123], 5.00th=[ 4178], 10.00th=[ 4212], 20.00th=[ 4245], 00:15:50.867 | 30.00th=[ 6342], 40.00th=[ 6409], 50.00th=[ 8490], 60.00th=[ 8557], 00:15:50.867 | 70.00th=[ 8557], 80.00th=[10671], 90.00th=[10671], 95.00th=[10671], 00:15:50.867 | 99.00th=[10671], 99.50th=[10671], 99.90th=[10671], 99.95th=[10671], 00:15:50.867 | 99.99th=[10671] 00:15:50.867 lat (msec) : >=2000=100.00% 00:15:50.867 cpu : usr=0.00%, sys=0.20%, ctx=44, majf=0, minf=7937 00:15:50.867 IO depths : 1=3.2%, 2=6.5%, 4=12.9%, 8=25.8%, 16=51.6%, 32=0.0%, >=64=0.0% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.867 issued rwts: total=31,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798451: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=60, BW=60.8MiB/s (63.8MB/s)(648MiB/10653msec) 00:15:50.867 slat (usec): min=47, max=2079.9k, avg=16321.09, stdev=147619.31 00:15:50.867 clat (msec): min=73, max=8629, avg=2016.73, stdev=2223.27 00:15:50.867 lat (msec): min=402, max=9843, avg=2033.06, stdev=2234.23 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 405], 5.00th=[ 409], 10.00th=[ 422], 20.00th=[ 502], 00:15:50.867 | 30.00th=[ 600], 40.00th=[ 659], 50.00th=[ 735], 60.00th=[ 1821], 00:15:50.867 | 70.00th=[ 1955], 80.00th=[ 2601], 90.00th=[ 6611], 95.00th=[ 6678], 00:15:50.867 | 99.00th=[ 6812], 99.50th=[ 6812], 99.90th=[ 8658], 99.95th=[ 8658], 00:15:50.867 | 99.99th=[ 8658] 00:15:50.867 bw ( KiB/s): min= 2048, max=288768, per=4.20%, avg=106663.60, stdev=99418.39, samples=10 00:15:50.867 iops : min= 2, max= 282, avg=104.10, stdev=97.03, samples=10 00:15:50.867 lat (msec) : 100=0.15%, 500=19.14%, 750=32.41%, 1000=5.56%, 2000=16.82% 00:15:50.867 lat 
(msec) : >=2000=25.93% 00:15:50.867 cpu : usr=0.02%, sys=1.44%, ctx=570, majf=0, minf=32769 00:15:50.867 IO depths : 1=0.2%, 2=0.3%, 4=0.6%, 8=1.2%, 16=2.5%, 32=4.9%, >=64=90.3% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.867 issued rwts: total=648,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798452: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=2, BW=2656KiB/s (2720kB/s)(33.0MiB/12723msec) 00:15:50.867 slat (usec): min=1174, max=2052.8k, avg=321403.39, stdev=727138.57 00:15:50.867 clat (msec): min=2115, max=12711, avg=7596.13, stdev=2976.05 00:15:50.867 lat (msec): min=4158, max=12722, avg=7917.53, stdev=2938.19 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 2123], 5.00th=[ 4144], 10.00th=[ 4178], 20.00th=[ 4212], 00:15:50.867 | 30.00th=[ 6342], 40.00th=[ 6409], 50.00th=[ 6409], 60.00th=[ 8490], 00:15:50.867 | 70.00th=[10671], 80.00th=[10671], 90.00th=[10671], 95.00th=[12684], 00:15:50.867 | 99.00th=[12684], 99.50th=[12684], 99.90th=[12684], 99.95th=[12684], 00:15:50.867 | 99.99th=[12684] 00:15:50.867 lat (msec) : >=2000=100.00% 00:15:50.867 cpu : usr=0.02%, sys=0.26%, ctx=59, majf=0, minf=8449 00:15:50.867 IO depths : 1=3.0%, 2=6.1%, 4=12.1%, 8=24.2%, 16=48.5%, 32=6.1%, >=64=0.0% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.867 issued rwts: total=33,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798453: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=3, BW=3433KiB/s (3515kB/s)(43.0MiB/12827msec) 00:15:50.867 slat (usec): min=1077, max=2107.9k, avg=248813.54, stdev=662389.87 00:15:50.867 clat (msec): min=2127, max=12817, avg=9904.61, stdev=3812.08 00:15:50.867 lat (msec): min=4188, max=12826, avg=10153.43, stdev=3637.56 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 2123], 5.00th=[ 4212], 10.00th=[ 4212], 20.00th=[ 4279], 00:15:50.867 | 30.00th=[ 6409], 40.00th=[10671], 50.00th=[12684], 60.00th=[12818], 00:15:50.867 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.867 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.867 | 99.99th=[12818] 00:15:50.867 lat (msec) : >=2000=100.00% 00:15:50.867 cpu : usr=0.01%, sys=0.35%, ctx=62, majf=0, minf=11009 00:15:50.867 IO depths : 1=2.3%, 2=4.7%, 4=9.3%, 8=18.6%, 16=37.2%, 32=27.9%, >=64=0.0% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.867 issued rwts: total=43,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798454: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=7, BW=7500KiB/s (7680kB/s)(94.0MiB/12834msec) 00:15:50.867 slat (usec): min=582, max=2072.9k, avg=113824.13, stdev=458496.40 00:15:50.867 clat (msec): min=2133, max=12830, avg=9977.77, stdev=2572.14 00:15:50.867 lat (msec): min=4206, max=12833, avg=10091.59, stdev=2455.36 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 2140], 5.00th=[ 4245], 
10.00th=[ 6342], 20.00th=[ 8490], 00:15:50.867 | 30.00th=[ 8658], 40.00th=[10671], 50.00th=[10671], 60.00th=[10671], 00:15:50.867 | 70.00th=[10805], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.867 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.867 | 99.99th=[12818] 00:15:50.867 lat (msec) : >=2000=100.00% 00:15:50.867 cpu : usr=0.00%, sys=0.63%, ctx=57, majf=0, minf=24065 00:15:50.867 IO depths : 1=1.1%, 2=2.1%, 4=4.3%, 8=8.5%, 16=17.0%, 32=34.0%, >=64=33.0% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.867 issued rwts: total=94,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798455: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=4, BW=4288KiB/s (4391kB/s)(54.0MiB/12895msec) 00:15:50.867 slat (usec): min=677, max=2100.4k, avg=199194.67, stdev=596402.59 00:15:50.867 clat (msec): min=2138, max=12894, avg=9473.95, stdev=3519.33 00:15:50.867 lat (msec): min=4238, max=12894, avg=9673.14, stdev=3398.63 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 2140], 5.00th=[ 4245], 10.00th=[ 4279], 20.00th=[ 6342], 00:15:50.867 | 30.00th=[ 6477], 40.00th=[ 8557], 50.00th=[10671], 60.00th=[12684], 00:15:50.867 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12953], 95.00th=[12953], 00:15:50.867 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.867 | 99.99th=[12953] 00:15:50.867 lat (msec) : >=2000=100.00% 00:15:50.867 cpu : usr=0.01%, sys=0.38%, ctx=65, majf=0, minf=13825 00:15:50.867 IO depths : 1=1.9%, 2=3.7%, 4=7.4%, 8=14.8%, 16=29.6%, 32=42.6%, >=64=0.0% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.867 issued rwts: total=54,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798456: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=4, BW=4817KiB/s (4932kB/s)(60.0MiB/12756msec) 00:15:50.867 slat (usec): min=976, max=2077.3k, avg=177198.38, stdev=563016.06 00:15:50.867 clat (msec): min=2123, max=12753, avg=9718.33, stdev=3279.78 00:15:50.867 lat (msec): min=4189, max=12755, avg=9895.53, stdev=3147.01 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 2123], 5.00th=[ 4212], 10.00th=[ 4245], 20.00th=[ 6342], 00:15:50.867 | 30.00th=[ 6409], 40.00th=[ 8557], 50.00th=[10671], 60.00th=[12550], 00:15:50.867 | 70.00th=[12684], 80.00th=[12684], 90.00th=[12684], 95.00th=[12684], 00:15:50.867 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.867 | 99.99th=[12818] 00:15:50.867 lat (msec) : >=2000=100.00% 00:15:50.867 cpu : usr=0.01%, sys=0.49%, ctx=55, majf=0, minf=15361 00:15:50.867 IO depths : 1=1.7%, 2=3.3%, 4=6.7%, 8=13.3%, 16=26.7%, 32=48.3%, >=64=0.0% 00:15:50.867 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.867 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.867 issued rwts: total=60,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.867 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.867 job2: (groupid=0, jobs=1): err= 0: pid=1798457: Fri Sep 27 15:20:51 2024 00:15:50.867 read: IOPS=128, BW=129MiB/s 
(135MB/s)(1661MiB/12890msec) 00:15:50.867 slat (usec): min=47, max=2055.9k, avg=6474.47, stdev=98363.48 00:15:50.867 clat (msec): min=121, max=12761, avg=894.26, stdev=2237.08 00:15:50.867 lat (msec): min=122, max=12776, avg=900.74, stdev=2250.54 00:15:50.867 clat percentiles (msec): 00:15:50.867 | 1.00th=[ 123], 5.00th=[ 124], 10.00th=[ 124], 20.00th=[ 124], 00:15:50.867 | 30.00th=[ 125], 40.00th=[ 125], 50.00th=[ 126], 60.00th=[ 126], 00:15:50.867 | 70.00th=[ 186], 80.00th=[ 264], 90.00th=[ 2366], 95.00th=[ 8658], 00:15:50.867 | 99.00th=[ 8658], 99.50th=[ 8658], 99.90th=[12684], 99.95th=[12818], 00:15:50.867 | 99.99th=[12818] 00:15:50.867 bw ( KiB/s): min= 1501, max=991232, per=13.75%, avg=349009.44, stdev=414771.07, samples=9 00:15:50.867 iops : min= 1, max= 968, avg=340.78, stdev=405.10, samples=9 00:15:50.867 lat (msec) : 250=73.57%, 500=13.73%, >=2000=12.70% 00:15:50.867 cpu : usr=0.02%, sys=1.64%, ctx=1561, majf=0, minf=32769 00:15:50.867 IO depths : 1=0.1%, 2=0.1%, 4=0.2%, 8=0.5%, 16=1.0%, 32=1.9%, >=64=96.2% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.868 issued rwts: total=1661,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job2: (groupid=0, jobs=1): err= 0: pid=1798458: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=1, BW=1363KiB/s (1396kB/s)(17.0MiB/12771msec) 00:15:50.868 slat (msec): min=11, max=2090, avg=626.67, stdev=958.32 00:15:50.868 clat (msec): min=2116, max=10731, avg=7018.42, stdev=2921.24 00:15:50.868 lat (msec): min=4172, max=12770, avg=7645.08, stdev=2946.65 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 2123], 5.00th=[ 2123], 10.00th=[ 4178], 20.00th=[ 4212], 00:15:50.868 | 30.00th=[ 4279], 40.00th=[ 6342], 50.00th=[ 6409], 60.00th=[ 8490], 00:15:50.868 | 70.00th=[ 8557], 80.00th=[10671], 90.00th=[10671], 95.00th=[10671], 00:15:50.868 | 99.00th=[10671], 99.50th=[10671], 99.90th=[10671], 99.95th=[10671], 00:15:50.868 | 99.99th=[10671] 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.00%, sys=0.13%, ctx=43, majf=0, minf=4353 00:15:50.868 IO depths : 1=5.9%, 2=11.8%, 4=23.5%, 8=47.1%, 16=11.8%, 32=0.0%, >=64=0.0% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.868 issued rwts: total=17,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job2: (groupid=0, jobs=1): err= 0: pid=1798459: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=1, BW=1284KiB/s (1315kB/s)(16.0MiB/12762msec) 00:15:50.868 slat (msec): min=2, max=2106, avg=664.18, stdev=984.88 00:15:50.868 clat (msec): min=2134, max=12755, avg=7852.62, stdev=3522.75 00:15:50.868 lat (msec): min=4220, max=12761, avg=8516.80, stdev=3371.25 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 2140], 5.00th=[ 2140], 10.00th=[ 4212], 20.00th=[ 4245], 00:15:50.868 | 30.00th=[ 6342], 40.00th=[ 6409], 50.00th=[ 6409], 60.00th=[ 8557], 00:15:50.868 | 70.00th=[10671], 80.00th=[12684], 90.00th=[12818], 95.00th=[12818], 00:15:50.868 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.868 | 99.99th=[12818] 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.02%, sys=0.10%, ctx=46, majf=0, minf=4097 00:15:50.868 IO depths : 1=6.2%, 2=12.5%, 
4=25.0%, 8=50.0%, 16=6.2%, 32=0.0%, >=64=0.0% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=100.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 issued rwts: total=16,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job2: (groupid=0, jobs=1): err= 0: pid=1798460: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=3, BW=3772KiB/s (3863kB/s)(47.0MiB/12758msec) 00:15:50.868 slat (usec): min=875, max=2091.9k, avg=226009.33, stdev=637779.65 00:15:50.868 clat (msec): min=2134, max=12755, avg=8748.94, stdev=3404.01 00:15:50.868 lat (msec): min=4226, max=12757, avg=8974.94, stdev=3306.57 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 2140], 5.00th=[ 4245], 10.00th=[ 4245], 20.00th=[ 4329], 00:15:50.868 | 30.00th=[ 6409], 40.00th=[ 6477], 50.00th=[ 8557], 60.00th=[10671], 00:15:50.868 | 70.00th=[12684], 80.00th=[12684], 90.00th=[12684], 95.00th=[12684], 00:15:50.868 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.868 | 99.99th=[12818] 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.01%, sys=0.34%, ctx=39, majf=0, minf=12033 00:15:50.868 IO depths : 1=2.1%, 2=4.3%, 4=8.5%, 8=17.0%, 16=34.0%, 32=34.0%, >=64=0.0% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.868 issued rwts: total=47,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798461: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=10, BW=10.1MiB/s (10.6MB/s)(129MiB/12797msec) 00:15:50.868 slat (usec): min=403, max=2143.0k, avg=82767.47, stdev=390037.90 00:15:50.868 clat (msec): min=2118, max=12789, avg=10744.52, stdev=3022.58 00:15:50.868 lat (msec): min=4154, max=12791, avg=10827.29, stdev=2929.23 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 4144], 5.00th=[ 4212], 10.00th=[ 4245], 20.00th=[ 8557], 00:15:50.868 | 30.00th=[10537], 40.00th=[12550], 50.00th=[12550], 60.00th=[12550], 00:15:50.868 | 70.00th=[12550], 80.00th=[12684], 90.00th=[12818], 95.00th=[12818], 00:15:50.868 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.868 | 99.99th=[12818] 00:15:50.868 bw ( KiB/s): min= 1563, max= 2048, per=0.07%, avg=1805.50, stdev=342.95, samples=2 00:15:50.868 iops : min= 1, max= 2, avg= 1.50, stdev= 0.71, samples=2 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.00%, sys=0.81%, ctx=77, majf=0, minf=32769 00:15:50.868 IO depths : 1=0.8%, 2=1.6%, 4=3.1%, 8=6.2%, 16=12.4%, 32=24.8%, >=64=51.2% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=66.7%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=33.3% 00:15:50.868 issued rwts: total=129,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798462: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=6, BW=6298KiB/s (6450kB/s)(79.0MiB/12844msec) 00:15:50.868 slat (usec): min=450, max=2103.4k, avg=135743.86, stdev=480533.34 00:15:50.868 clat (msec): min=2119, max=12839, avg=10960.28, stdev=2656.90 00:15:50.868 lat (msec): min=4222, max=12843, avg=11096.02, stdev=2466.55 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 
1.00th=[ 2123], 5.00th=[ 4279], 10.00th=[ 6477], 20.00th=[ 8557], 00:15:50.868 | 30.00th=[10671], 40.00th=[12281], 50.00th=[12416], 60.00th=[12416], 00:15:50.868 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.868 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.868 | 99.99th=[12818] 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.00%, sys=0.55%, ctx=139, majf=0, minf=20225 00:15:50.868 IO depths : 1=1.3%, 2=2.5%, 4=5.1%, 8=10.1%, 16=20.3%, 32=40.5%, >=64=20.3% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.868 issued rwts: total=79,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798463: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=32, BW=32.4MiB/s (34.0MB/s)(325MiB/10024msec) 00:15:50.868 slat (usec): min=112, max=2063.8k, avg=30768.62, stdev=223184.26 00:15:50.868 clat (msec): min=22, max=9412, avg=887.23, stdev=1936.15 00:15:50.868 lat (msec): min=24, max=9414, avg=918.00, stdev=1993.37 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 27], 5.00th=[ 50], 10.00th=[ 74], 20.00th=[ 169], 00:15:50.868 | 30.00th=[ 247], 40.00th=[ 305], 50.00th=[ 368], 60.00th=[ 418], 00:15:50.868 | 70.00th=[ 535], 80.00th=[ 625], 90.00th=[ 869], 95.00th=[ 5201], 00:15:50.868 | 99.00th=[ 9463], 99.50th=[ 9463], 99.90th=[ 9463], 99.95th=[ 9463], 00:15:50.868 | 99.99th=[ 9463] 00:15:50.868 bw ( KiB/s): min=40960, max=364544, per=7.99%, avg=202752.00, stdev=228808.44, samples=2 00:15:50.868 iops : min= 40, max= 356, avg=198.00, stdev=223.45, samples=2 00:15:50.868 lat (msec) : 50=5.23%, 100=7.69%, 250=17.54%, 500=37.54%, 750=17.54% 00:15:50.868 lat (msec) : 1000=6.77%, >=2000=7.69% 00:15:50.868 cpu : usr=0.00%, sys=1.21%, ctx=711, majf=0, minf=32769 00:15:50.868 IO depths : 1=0.3%, 2=0.6%, 4=1.2%, 8=2.5%, 16=4.9%, 32=9.8%, >=64=80.6% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=99.5%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.5% 00:15:50.868 issued rwts: total=325,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798464: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=5, BW=6127KiB/s (6274kB/s)(77.0MiB/12869msec) 00:15:50.868 slat (usec): min=953, max=2065.5k, avg=139612.15, stdev=500261.97 00:15:50.868 clat (msec): min=2118, max=12866, avg=9979.36, stdev=3385.54 00:15:50.868 lat (msec): min=4167, max=12868, avg=10118.97, stdev=3277.02 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 2123], 5.00th=[ 4178], 10.00th=[ 4245], 20.00th=[ 6342], 00:15:50.868 | 30.00th=[ 8490], 40.00th=[10671], 50.00th=[12550], 60.00th=[12684], 00:15:50.868 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.868 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.868 | 99.99th=[12818] 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.00%, sys=0.64%, ctx=86, majf=0, minf=19713 00:15:50.868 IO depths : 1=1.3%, 2=2.6%, 4=5.2%, 8=10.4%, 16=20.8%, 32=41.6%, >=64=18.2% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 
00:15:50.868 issued rwts: total=77,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798465: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=46, BW=46.1MiB/s (48.4MB/s)(591MiB/12806msec) 00:15:50.868 slat (usec): min=44, max=2072.3k, avg=18069.14, stdev=163432.01 00:15:50.868 clat (msec): min=334, max=10884, avg=2633.54, stdev=3979.22 00:15:50.868 lat (msec): min=335, max=10886, avg=2651.61, stdev=3992.04 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 334], 5.00th=[ 368], 10.00th=[ 376], 20.00th=[ 380], 00:15:50.868 | 30.00th=[ 380], 40.00th=[ 405], 50.00th=[ 422], 60.00th=[ 489], 00:15:50.868 | 70.00th=[ 634], 80.00th=[ 6678], 90.00th=[10805], 95.00th=[10805], 00:15:50.868 | 99.00th=[10805], 99.50th=[10939], 99.90th=[10939], 99.95th=[10939], 00:15:50.868 | 99.99th=[10939] 00:15:50.868 bw ( KiB/s): min= 2048, max=360448, per=4.16%, avg=105583.00, stdev=143636.46, samples=9 00:15:50.868 iops : min= 2, max= 352, avg=103.00, stdev=140.35, samples=9 00:15:50.868 lat (msec) : 500=60.07%, 750=12.01%, 1000=2.54%, >=2000=25.38% 00:15:50.868 cpu : usr=0.00%, sys=1.15%, ctx=542, majf=0, minf=32769 00:15:50.868 IO depths : 1=0.2%, 2=0.3%, 4=0.7%, 8=1.4%, 16=2.7%, 32=5.4%, >=64=89.3% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.868 issued rwts: total=591,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798466: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=15, BW=15.4MiB/s (16.2MB/s)(197MiB/12781msec) 00:15:50.868 slat (usec): min=417, max=2075.3k, avg=54090.60, stdev=296679.00 00:15:50.868 clat (msec): min=633, max=9154, avg=3496.61, stdev=2057.42 00:15:50.868 lat (msec): min=638, max=9173, avg=3550.70, stdev=2083.51 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 634], 5.00th=[ 651], 10.00th=[ 659], 20.00th=[ 768], 00:15:50.868 | 30.00th=[ 3339], 40.00th=[ 3473], 50.00th=[ 3574], 60.00th=[ 3675], 00:15:50.868 | 70.00th=[ 3775], 80.00th=[ 3876], 90.00th=[ 6342], 95.00th=[ 9060], 00:15:50.868 | 99.00th=[ 9194], 99.50th=[ 9194], 99.90th=[ 9194], 99.95th=[ 9194], 00:15:50.868 | 99.99th=[ 9194] 00:15:50.868 bw ( KiB/s): min= 2048, max=141029, per=2.82%, avg=71538.50, stdev=98274.41, samples=2 00:15:50.868 iops : min= 2, max= 137, avg=69.50, stdev=95.46, samples=2 00:15:50.868 lat (msec) : 750=19.29%, 1000=1.02%, 2000=0.51%, >=2000=79.19% 00:15:50.868 cpu : usr=0.02%, sys=0.69%, ctx=459, majf=0, minf=32769 00:15:50.868 IO depths : 1=0.5%, 2=1.0%, 4=2.0%, 8=4.1%, 16=8.1%, 32=16.2%, >=64=68.0% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=98.6%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=1.4% 00:15:50.868 issued rwts: total=197,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798467: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=22, BW=22.9MiB/s (24.0MB/s)(247MiB/10777msec) 00:15:50.868 slat (usec): min=586, max=2090.0k, avg=40483.96, stdev=256735.59 00:15:50.868 clat (msec): min=599, max=9514, avg=1991.85, stdev=2786.33 00:15:50.868 lat (msec): min=605, max=9526, avg=2032.34, stdev=2825.43 00:15:50.868 clat percentiles (msec): 00:15:50.868 
| 1.00th=[ 609], 5.00th=[ 634], 10.00th=[ 659], 20.00th=[ 701], 00:15:50.868 | 30.00th=[ 793], 40.00th=[ 852], 50.00th=[ 919], 60.00th=[ 1028], 00:15:50.868 | 70.00th=[ 1133], 80.00th=[ 1284], 90.00th=[ 9463], 95.00th=[ 9463], 00:15:50.868 | 99.00th=[ 9463], 99.50th=[ 9463], 99.90th=[ 9463], 99.95th=[ 9463], 00:15:50.868 | 99.99th=[ 9463] 00:15:50.868 bw ( KiB/s): min=24576, max=221184, per=4.84%, avg=122880.00, stdev=139022.85, samples=2 00:15:50.868 iops : min= 24, max= 216, avg=120.00, stdev=135.76, samples=2 00:15:50.868 lat (msec) : 750=25.51%, 1000=31.17%, 2000=29.55%, >=2000=13.77% 00:15:50.868 cpu : usr=0.00%, sys=1.19%, ctx=685, majf=0, minf=32769 00:15:50.868 IO depths : 1=0.4%, 2=0.8%, 4=1.6%, 8=3.2%, 16=6.5%, 32=13.0%, >=64=74.5% 00:15:50.868 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.868 complete : 0=0.0%, 4=99.2%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.8% 00:15:50.868 issued rwts: total=247,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.868 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.868 job3: (groupid=0, jobs=1): err= 0: pid=1798468: Fri Sep 27 15:20:51 2024 00:15:50.868 read: IOPS=7, BW=7413KiB/s (7591kB/s)(93.0MiB/12846msec) 00:15:50.868 slat (usec): min=855, max=2040.0k, avg=115348.96, stdev=453021.97 00:15:50.868 clat (msec): min=2117, max=12841, avg=9427.31, stdev=3312.22 00:15:50.868 lat (msec): min=4149, max=12845, avg=9542.66, stdev=3240.92 00:15:50.868 clat percentiles (msec): 00:15:50.868 | 1.00th=[ 2123], 5.00th=[ 4178], 10.00th=[ 4245], 20.00th=[ 6342], 00:15:50.868 | 30.00th=[ 6409], 40.00th=[ 8557], 50.00th=[10671], 60.00th=[10671], 00:15:50.868 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.868 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.868 | 99.99th=[12818] 00:15:50.868 lat (msec) : >=2000=100.00% 00:15:50.868 cpu : usr=0.00%, sys=0.74%, ctx=90, majf=0, minf=23809 00:15:50.869 IO depths : 1=1.1%, 2=2.2%, 4=4.3%, 8=8.6%, 16=17.2%, 32=34.4%, >=64=32.3% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.869 issued rwts: total=93,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job3: (groupid=0, jobs=1): err= 0: pid=1798469: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=4, BW=4214KiB/s (4315kB/s)(53.0MiB/12878msec) 00:15:50.869 slat (usec): min=648, max=2107.5k, avg=202897.80, stdev=601657.43 00:15:50.869 clat (msec): min=2123, max=12871, avg=10795.56, stdev=3295.63 00:15:50.869 lat (msec): min=4205, max=12877, avg=10998.46, stdev=3075.15 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2123], 5.00th=[ 4245], 10.00th=[ 4279], 20.00th=[ 8490], 00:15:50.869 | 30.00th=[10537], 40.00th=[12818], 50.00th=[12818], 60.00th=[12818], 00:15:50.869 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.869 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.869 | 99.99th=[12818] 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.01%, sys=0.43%, ctx=77, majf=0, minf=13569 00:15:50.869 IO depths : 1=1.9%, 2=3.8%, 4=7.5%, 8=15.1%, 16=30.2%, 32=41.5%, >=64=0.0% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.869 issued rwts: total=53,0,0,0 
short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job3: (groupid=0, jobs=1): err= 0: pid=1798470: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=9, BW=9.93MiB/s (10.4MB/s)(128MiB/12887msec) 00:15:50.869 slat (usec): min=905, max=2095.3k, avg=84050.09, stdev=382514.21 00:15:50.869 clat (msec): min=2127, max=12882, avg=11938.69, stdev=1936.27 00:15:50.869 lat (msec): min=4195, max=12883, avg=12022.74, stdev=1729.48 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 4212], 5.00th=[ 6409], 10.00th=[10671], 20.00th=[12281], 00:15:50.869 | 30.00th=[12416], 40.00th=[12416], 50.00th=[12416], 60.00th=[12550], 00:15:50.869 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.869 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.869 | 99.99th=[12818] 00:15:50.869 bw ( KiB/s): min= 1501, max= 1501, per=0.06%, avg=1501.00, stdev= 0.00, samples=1 00:15:50.869 iops : min= 1, max= 1, avg= 1.00, stdev= 0.00, samples=1 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.00%, sys=1.01%, ctx=149, majf=0, minf=32769 00:15:50.869 IO depths : 1=0.8%, 2=1.6%, 4=3.1%, 8=6.2%, 16=12.5%, 32=25.0%, >=64=50.8% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=50.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=50.0% 00:15:50.869 issued rwts: total=128,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job3: (groupid=0, jobs=1): err= 0: pid=1798471: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=11, BW=11.4MiB/s (12.0MB/s)(146MiB/12793msec) 00:15:50.869 slat (usec): min=130, max=2055.8k, avg=73071.93, stdev=351432.12 00:15:50.869 clat (msec): min=2123, max=10662, avg=5164.33, stdev=2015.61 00:15:50.869 lat (msec): min=3853, max=10704, avg=5237.40, stdev=2079.22 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 3842], 5.00th=[ 3876], 10.00th=[ 3876], 20.00th=[ 3943], 00:15:50.869 | 30.00th=[ 3977], 40.00th=[ 4010], 50.00th=[ 4077], 60.00th=[ 4111], 00:15:50.869 | 70.00th=[ 4212], 80.00th=[ 6745], 90.00th=[ 8658], 95.00th=[ 8926], 00:15:50.869 | 99.00th=[10537], 99.50th=[10671], 99.90th=[10671], 99.95th=[10671], 00:15:50.869 | 99.99th=[10671] 00:15:50.869 bw ( KiB/s): min= 2048, max=36864, per=0.77%, avg=19456.00, stdev=24618.63, samples=2 00:15:50.869 iops : min= 2, max= 36, avg=19.00, stdev=24.04, samples=2 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.01%, sys=0.88%, ctx=154, majf=0, minf=32769 00:15:50.869 IO depths : 1=0.7%, 2=1.4%, 4=2.7%, 8=5.5%, 16=11.0%, 32=21.9%, >=64=56.8% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=95.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=5.0% 00:15:50.869 issued rwts: total=146,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job3: (groupid=0, jobs=1): err= 0: pid=1798472: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=1, BW=1283KiB/s (1314kB/s)(16.0MiB/12771msec) 00:15:50.869 slat (msec): min=6, max=2090, avg=666.02, stdev=974.87 00:15:50.869 clat (msec): min=2114, max=12764, avg=7584.97, stdev=3549.54 00:15:50.869 lat (msec): min=4202, max=12770, avg=8250.99, stdev=3453.03 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2123], 5.00th=[ 2123], 10.00th=[ 4212], 20.00th=[ 4245], 00:15:50.869 | 30.00th=[ 4279], 
40.00th=[ 6409], 50.00th=[ 6409], 60.00th=[ 8557], 00:15:50.869 | 70.00th=[10671], 80.00th=[10671], 90.00th=[12684], 95.00th=[12818], 00:15:50.869 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.869 | 99.99th=[12818] 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.00%, sys=0.12%, ctx=55, majf=0, minf=4097 00:15:50.869 IO depths : 1=6.2%, 2=12.5%, 4=25.0%, 8=50.0%, 16=6.2%, 32=0.0%, >=64=0.0% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=100.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 issued rwts: total=16,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job3: (groupid=0, jobs=1): err= 0: pid=1798473: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=3, BW=3813KiB/s (3904kB/s)(48.0MiB/12892msec) 00:15:50.869 slat (usec): min=795, max=2079.1k, avg=224413.42, stdev=623093.86 00:15:50.869 clat (msec): min=2119, max=12890, avg=10326.45, stdev=3112.32 00:15:50.869 lat (msec): min=4199, max=12891, avg=10550.86, stdev=2888.29 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2123], 5.00th=[ 4212], 10.00th=[ 4279], 20.00th=[ 8423], 00:15:50.869 | 30.00th=[10537], 40.00th=[10671], 50.00th=[10671], 60.00th=[12684], 00:15:50.869 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.869 | 99.00th=[12953], 99.50th=[12953], 99.90th=[12953], 99.95th=[12953], 00:15:50.869 | 99.99th=[12953] 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.01%, sys=0.39%, ctx=91, majf=0, minf=12289 00:15:50.869 IO depths : 1=2.1%, 2=4.2%, 4=8.3%, 8=16.7%, 16=33.3%, 32=35.4%, >=64=0.0% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.869 issued rwts: total=48,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798474: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=1, BW=1844KiB/s (1888kB/s)(23.0MiB/12771msec) 00:15:50.869 slat (usec): min=824, max=2098.7k, avg=463142.75, stdev=850112.65 00:15:50.869 clat (msec): min=2118, max=12768, avg=8682.62, stdev=3575.70 00:15:50.869 lat (msec): min=4172, max=12770, avg=9145.76, stdev=3370.79 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2123], 5.00th=[ 4178], 10.00th=[ 4212], 20.00th=[ 4279], 00:15:50.869 | 30.00th=[ 6342], 40.00th=[ 6342], 50.00th=[ 8490], 60.00th=[10671], 00:15:50.869 | 70.00th=[12684], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.869 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.869 | 99.99th=[12818] 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.01%, sys=0.19%, ctx=51, majf=0, minf=5889 00:15:50.869 IO depths : 1=4.3%, 2=8.7%, 4=17.4%, 8=34.8%, 16=34.8%, 32=0.0%, >=64=0.0% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.869 issued rwts: total=23,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798475: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=1, BW=1687KiB/s (1727kB/s)(21.0MiB/12748msec) 00:15:50.869 slat (usec): min=1062, max=2116.6k, 
avg=506472.93, stdev=888757.83 00:15:50.869 clat (msec): min=2111, max=12702, avg=9171.42, stdev=3800.58 00:15:50.869 lat (msec): min=4170, max=12747, avg=9677.89, stdev=3510.38 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2106], 5.00th=[ 4178], 10.00th=[ 4212], 20.00th=[ 4245], 00:15:50.869 | 30.00th=[ 6409], 40.00th=[ 8557], 50.00th=[10671], 60.00th=[12684], 00:15:50.869 | 70.00th=[12684], 80.00th=[12684], 90.00th=[12684], 95.00th=[12684], 00:15:50.869 | 99.00th=[12684], 99.50th=[12684], 99.90th=[12684], 99.95th=[12684], 00:15:50.869 | 99.99th=[12684] 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.00%, sys=0.14%, ctx=54, majf=0, minf=5377 00:15:50.869 IO depths : 1=4.8%, 2=9.5%, 4=19.0%, 8=38.1%, 16=28.6%, 32=0.0%, >=64=0.0% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.869 issued rwts: total=21,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798476: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=134, BW=134MiB/s (141MB/s)(1446MiB/10783msec) 00:15:50.869 slat (usec): min=49, max=2059.8k, avg=7380.99, stdev=107471.70 00:15:50.869 clat (msec): min=105, max=8982, avg=922.64, stdev=2317.66 00:15:50.869 lat (msec): min=122, max=8982, avg=930.02, stdev=2326.89 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 123], 5.00th=[ 124], 10.00th=[ 124], 20.00th=[ 125], 00:15:50.869 | 30.00th=[ 125], 40.00th=[ 126], 50.00th=[ 126], 60.00th=[ 127], 00:15:50.869 | 70.00th=[ 153], 80.00th=[ 255], 90.00th=[ 2500], 95.00th=[ 8926], 00:15:50.869 | 99.00th=[ 8926], 99.50th=[ 8926], 99.90th=[ 8926], 99.95th=[ 8926], 00:15:50.869 | 99.99th=[ 8926] 00:15:50.869 bw ( KiB/s): min=14336, max=1038336, per=13.29%, avg=337408.00, stdev=440513.53, samples=8 00:15:50.869 iops : min= 14, max= 1014, avg=329.50, stdev=430.19, samples=8 00:15:50.869 lat (msec) : 250=77.39%, 500=11.41%, >=2000=11.20% 00:15:50.869 cpu : usr=0.10%, sys=1.96%, ctx=1311, majf=0, minf=32770 00:15:50.869 IO depths : 1=0.1%, 2=0.1%, 4=0.3%, 8=0.6%, 16=1.1%, 32=2.2%, >=64=95.6% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.869 issued rwts: total=1446,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798477: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=79, BW=79.5MiB/s (83.4MB/s)(1012MiB/12725msec) 00:15:50.869 slat (usec): min=42, max=2117.9k, avg=10481.62, stdev=125560.99 00:15:50.869 clat (msec): min=260, max=10583, avg=1330.01, stdev=2473.19 00:15:50.869 lat (msec): min=261, max=10585, avg=1340.49, stdev=2483.37 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 262], 5.00th=[ 264], 10.00th=[ 264], 20.00th=[ 266], 00:15:50.869 | 30.00th=[ 266], 40.00th=[ 268], 50.00th=[ 271], 60.00th=[ 275], 00:15:50.869 | 70.00th=[ 284], 80.00th=[ 363], 90.00th=[ 6275], 95.00th=[ 8658], 00:15:50.869 | 99.00th=[ 8658], 99.50th=[ 8658], 99.90th=[10537], 99.95th=[10537], 00:15:50.869 | 99.99th=[10537] 00:15:50.869 bw ( KiB/s): min= 2048, max=485376, per=7.14%, avg=181152.70, stdev=211183.81, samples=10 00:15:50.869 iops : min= 2, max= 474, avg=176.90, stdev=206.22, samples=10 00:15:50.869 lat (msec) : 500=81.32%, 
>=2000=18.68% 00:15:50.869 cpu : usr=0.00%, sys=1.28%, ctx=915, majf=0, minf=32769 00:15:50.869 IO depths : 1=0.1%, 2=0.2%, 4=0.4%, 8=0.8%, 16=1.6%, 32=3.2%, >=64=93.8% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.869 issued rwts: total=1012,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798478: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=5, BW=5207KiB/s (5332kB/s)(65.0MiB/12782msec) 00:15:50.869 slat (usec): min=717, max=2056.4k, avg=164073.73, stdev=539626.93 00:15:50.869 clat (msec): min=2116, max=12778, avg=8633.98, stdev=3282.56 00:15:50.869 lat (msec): min=4173, max=12781, avg=8798.06, stdev=3217.62 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2123], 5.00th=[ 4178], 10.00th=[ 4212], 20.00th=[ 4279], 00:15:50.869 | 30.00th=[ 6342], 40.00th=[ 6409], 50.00th=[ 8557], 60.00th=[10671], 00:15:50.869 | 70.00th=[10671], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.869 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.869 | 99.99th=[12818] 00:15:50.869 lat (msec) : >=2000=100.00% 00:15:50.869 cpu : usr=0.00%, sys=0.54%, ctx=61, majf=0, minf=16641 00:15:50.869 IO depths : 1=1.5%, 2=3.1%, 4=6.2%, 8=12.3%, 16=24.6%, 32=49.2%, >=64=3.1% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.869 issued rwts: total=65,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798479: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=3, BW=3733KiB/s (3823kB/s)(39.0MiB/10698msec) 00:15:50.869 slat (usec): min=1009, max=2056.2k, avg=271506.67, stdev=678110.13 00:15:50.869 clat (msec): min=108, max=10694, avg=6924.00, stdev=3123.89 00:15:50.869 lat (msec): min=2164, max=10697, avg=7195.50, stdev=2972.41 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 109], 5.00th=[ 2165], 10.00th=[ 2198], 20.00th=[ 4329], 00:15:50.869 | 30.00th=[ 4396], 40.00th=[ 6544], 50.00th=[ 8658], 60.00th=[ 8658], 00:15:50.869 | 70.00th=[ 8658], 80.00th=[10537], 90.00th=[10671], 95.00th=[10671], 00:15:50.869 | 99.00th=[10671], 99.50th=[10671], 99.90th=[10671], 99.95th=[10671], 00:15:50.869 | 99.99th=[10671] 00:15:50.869 lat (msec) : 250=2.56%, >=2000=97.44% 00:15:50.869 cpu : usr=0.00%, sys=0.38%, ctx=60, majf=0, minf=9985 00:15:50.869 IO depths : 1=2.6%, 2=5.1%, 4=10.3%, 8=20.5%, 16=41.0%, 32=20.5%, >=64=0.0% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=100.0%, >=64=0.0% 00:15:50.869 issued rwts: total=39,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798480: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=10, BW=10.1MiB/s (10.6MB/s)(109MiB/10818msec) 00:15:50.869 slat (usec): min=647, max=2032.9k, avg=98410.50, stdev=418113.47 00:15:50.869 clat (msec): min=90, max=10814, avg=7736.99, stdev=3349.41 00:15:50.869 lat (msec): min=2063, max=10817, avg=7835.40, stdev=3279.53 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2072], 5.00th=[ 2198], 
10.00th=[ 2198], 20.00th=[ 4329], 00:15:50.869 | 30.00th=[ 4396], 40.00th=[ 8490], 50.00th=[ 8658], 60.00th=[10671], 00:15:50.869 | 70.00th=[10671], 80.00th=[10805], 90.00th=[10805], 95.00th=[10805], 00:15:50.869 | 99.00th=[10805], 99.50th=[10805], 99.90th=[10805], 99.95th=[10805], 00:15:50.869 | 99.99th=[10805] 00:15:50.869 lat (msec) : 100=0.92%, >=2000=99.08% 00:15:50.869 cpu : usr=0.02%, sys=1.00%, ctx=108, majf=0, minf=27905 00:15:50.869 IO depths : 1=0.9%, 2=1.8%, 4=3.7%, 8=7.3%, 16=14.7%, 32=29.4%, >=64=42.2% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.869 issued rwts: total=109,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798481: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=14, BW=14.2MiB/s (14.9MB/s)(154MiB/10825msec) 00:15:50.869 slat (usec): min=124, max=2092.1k, avg=69694.55, stdev=345587.87 00:15:50.869 clat (msec): min=90, max=10769, avg=5971.27, stdev=2845.56 00:15:50.869 lat (msec): min=2063, max=10770, avg=6040.96, stdev=2831.60 00:15:50.869 clat percentiles (msec): 00:15:50.869 | 1.00th=[ 2072], 5.00th=[ 2198], 10.00th=[ 3943], 20.00th=[ 3977], 00:15:50.869 | 30.00th=[ 4044], 40.00th=[ 4077], 50.00th=[ 4144], 60.00th=[ 6477], 00:15:50.869 | 70.00th=[ 6879], 80.00th=[ 8658], 90.00th=[10805], 95.00th=[10805], 00:15:50.869 | 99.00th=[10805], 99.50th=[10805], 99.90th=[10805], 99.95th=[10805], 00:15:50.869 | 99.99th=[10805] 00:15:50.869 bw ( KiB/s): min=18432, max=34816, per=1.05%, avg=26624.00, stdev=11585.24, samples=2 00:15:50.869 iops : min= 18, max= 34, avg=26.00, stdev=11.31, samples=2 00:15:50.869 lat (msec) : 100=0.65%, >=2000=99.35% 00:15:50.869 cpu : usr=0.00%, sys=1.18%, ctx=147, majf=0, minf=32769 00:15:50.869 IO depths : 1=0.6%, 2=1.3%, 4=2.6%, 8=5.2%, 16=10.4%, 32=20.8%, >=64=59.1% 00:15:50.869 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.869 complete : 0=0.0%, 4=96.4%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=3.6% 00:15:50.869 issued rwts: total=154,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.869 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.869 job4: (groupid=0, jobs=1): err= 0: pid=1798482: Fri Sep 27 15:20:51 2024 00:15:50.869 read: IOPS=35, BW=35.7MiB/s (37.4MB/s)(385MiB/10796msec) 00:15:50.869 slat (usec): min=52, max=2058.9k, avg=27747.05, stdev=213617.28 00:15:50.870 clat (msec): min=111, max=10547, avg=3011.02, stdev=3423.70 00:15:50.870 lat (msec): min=253, max=10548, avg=3038.76, stdev=3438.38 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 253], 5.00th=[ 255], 10.00th=[ 257], 20.00th=[ 259], 00:15:50.870 | 30.00th=[ 262], 40.00th=[ 266], 50.00th=[ 330], 60.00th=[ 2333], 00:15:50.870 | 70.00th=[ 5873], 80.00th=[ 7953], 90.00th=[ 8020], 95.00th=[ 8087], 00:15:50.870 | 99.00th=[ 8658], 99.50th=[10537], 99.90th=[10537], 99.95th=[10537], 00:15:50.870 | 99.99th=[10537] 00:15:50.870 bw ( KiB/s): min=10240, max=393216, per=4.15%, avg=105267.20, stdev=164775.57, samples=5 00:15:50.870 iops : min= 10, max= 384, avg=102.80, stdev=160.91, samples=5 00:15:50.870 lat (msec) : 250=0.26%, 500=53.25%, 2000=1.30%, >=2000=45.19% 00:15:50.870 cpu : usr=0.00%, sys=1.41%, ctx=344, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.3%, 2=0.5%, 4=1.0%, 8=2.1%, 16=4.2%, 32=8.3%, >=64=83.6% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 
64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.6%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.4% 00:15:50.870 issued rwts: total=385,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job4: (groupid=0, jobs=1): err= 0: pid=1798483: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=1, BW=1760KiB/s (1802kB/s)(22.0MiB/12801msec) 00:15:50.870 slat (msec): min=2, max=2108, avg=485.73, stdev=863.58 00:15:50.870 clat (msec): min=2114, max=12703, avg=8470.66, stdev=3630.28 00:15:50.870 lat (msec): min=4159, max=12800, avg=8956.38, stdev=3449.76 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 2123], 5.00th=[ 4144], 10.00th=[ 4178], 20.00th=[ 4245], 00:15:50.870 | 30.00th=[ 6409], 40.00th=[ 6409], 50.00th=[ 8490], 60.00th=[10671], 00:15:50.870 | 70.00th=[12550], 80.00th=[12684], 90.00th=[12684], 95.00th=[12684], 00:15:50.870 | 99.00th=[12684], 99.50th=[12684], 99.90th=[12684], 99.95th=[12684], 00:15:50.870 | 99.99th=[12684] 00:15:50.870 lat (msec) : >=2000=100.00% 00:15:50.870 cpu : usr=0.00%, sys=0.19%, ctx=57, majf=0, minf=5633 00:15:50.870 IO depths : 1=4.5%, 2=9.1%, 4=18.2%, 8=36.4%, 16=31.8%, 32=0.0%, >=64=0.0% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=100.0%, 64=0.0%, >=64=0.0% 00:15:50.870 issued rwts: total=22,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job4: (groupid=0, jobs=1): err= 0: pid=1798484: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=5, BW=5664KiB/s (5800kB/s)(71.0MiB/12836msec) 00:15:50.870 slat (usec): min=507, max=2127.8k, avg=150949.45, stdev=519714.86 00:15:50.870 clat (msec): min=2117, max=12832, avg=10614.28, stdev=2980.73 00:15:50.870 lat (msec): min=4172, max=12835, avg=10765.23, stdev=2810.82 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 2123], 5.00th=[ 4212], 10.00th=[ 6342], 20.00th=[ 8557], 00:15:50.870 | 30.00th=[10671], 40.00th=[10671], 50.00th=[12684], 60.00th=[12818], 00:15:50.870 | 70.00th=[12818], 80.00th=[12818], 90.00th=[12818], 95.00th=[12818], 00:15:50.870 | 99.00th=[12818], 99.50th=[12818], 99.90th=[12818], 99.95th=[12818], 00:15:50.870 | 99.99th=[12818] 00:15:50.870 lat (msec) : >=2000=100.00% 00:15:50.870 cpu : usr=0.00%, sys=0.55%, ctx=82, majf=0, minf=18177 00:15:50.870 IO depths : 1=1.4%, 2=2.8%, 4=5.6%, 8=11.3%, 16=22.5%, 32=45.1%, >=64=11.3% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.870 issued rwts: total=71,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job4: (groupid=0, jobs=1): err= 0: pid=1798485: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=82, BW=82.5MiB/s (86.5MB/s)(884MiB/10710msec) 00:15:50.870 slat (usec): min=42, max=2050.4k, avg=11307.89, stdev=120377.56 00:15:50.870 clat (msec): min=150, max=6446, avg=1125.45, stdev=1938.83 00:15:50.870 lat (msec): min=153, max=8350, avg=1136.75, stdev=1956.92 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 157], 5.00th=[ 186], 10.00th=[ 197], 20.00th=[ 215], 00:15:50.870 | 30.00th=[ 232], 40.00th=[ 234], 50.00th=[ 236], 60.00th=[ 262], 00:15:50.870 | 70.00th=[ 430], 80.00th=[ 793], 90.00th=[ 5671], 95.00th=[ 5940], 00:15:50.870 | 99.00th=[ 6342], 99.50th=[ 6409], 99.90th=[ 6477], 
99.95th=[ 6477], 00:15:50.870 | 99.99th=[ 6477] 00:15:50.870 bw ( KiB/s): min= 8192, max=537548, per=11.26%, avg=285738.80, stdev=258659.70, samples=5 00:15:50.870 iops : min= 8, max= 524, avg=278.60, stdev=252.14, samples=5 00:15:50.870 lat (msec) : 250=58.37%, 500=12.67%, 750=5.77%, 1000=8.14%, >=2000=15.05% 00:15:50.870 cpu : usr=0.05%, sys=1.57%, ctx=851, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.1%, 2=0.2%, 4=0.5%, 8=0.9%, 16=1.8%, 32=3.6%, >=64=92.9% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.870 issued rwts: total=884,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job4: (groupid=0, jobs=1): err= 0: pid=1798486: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=36, BW=36.2MiB/s (38.0MB/s)(465MiB/12835msec) 00:15:50.870 slat (usec): min=43, max=2108.2k, avg=23048.82, stdev=187428.40 00:15:50.870 clat (msec): min=356, max=8938, avg=2930.69, stdev=3477.16 00:15:50.870 lat (msec): min=357, max=8939, avg=2953.74, stdev=3484.75 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 388], 5.00th=[ 405], 10.00th=[ 405], 20.00th=[ 409], 00:15:50.870 | 30.00th=[ 409], 40.00th=[ 439], 50.00th=[ 676], 60.00th=[ 1028], 00:15:50.870 | 70.00th=[ 3239], 80.00th=[ 8658], 90.00th=[ 8792], 95.00th=[ 8792], 00:15:50.870 | 99.00th=[ 8926], 99.50th=[ 8926], 99.90th=[ 8926], 99.95th=[ 8926], 00:15:50.870 | 99.99th=[ 8926] 00:15:50.870 bw ( KiB/s): min= 1501, max=317440, per=3.89%, avg=98856.71, stdev=123544.00, samples=7 00:15:50.870 iops : min= 1, max= 310, avg=96.43, stdev=120.68, samples=7 00:15:50.870 lat (msec) : 500=46.02%, 750=6.67%, 1000=7.10%, 2000=3.01%, >=2000=37.20% 00:15:50.870 cpu : usr=0.02%, sys=1.08%, ctx=502, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.2%, 2=0.4%, 4=0.9%, 8=1.7%, 16=3.4%, 32=6.9%, >=64=86.5% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.7%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.3% 00:15:50.870 issued rwts: total=465,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798487: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=23, BW=23.1MiB/s (24.2MB/s)(248MiB/10736msec) 00:15:50.870 slat (usec): min=112, max=2043.5k, avg=42739.66, stdev=262518.82 00:15:50.870 clat (msec): min=134, max=5969, avg=3353.23, stdev=2379.66 00:15:50.870 lat (msec): min=299, max=5972, avg=3395.97, stdev=2369.13 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 296], 5.00th=[ 334], 10.00th=[ 351], 20.00th=[ 435], 00:15:50.870 | 30.00th=[ 584], 40.00th=[ 2232], 50.00th=[ 3742], 60.00th=[ 5738], 00:15:50.870 | 70.00th=[ 5805], 80.00th=[ 5873], 90.00th=[ 5940], 95.00th=[ 5940], 00:15:50.870 | 99.00th=[ 5940], 99.50th=[ 5940], 99.90th=[ 5940], 99.95th=[ 5940], 00:15:50.870 | 99.99th=[ 5940] 00:15:50.870 bw ( KiB/s): min= 1635, max=190464, per=1.95%, avg=49470.00, stdev=79204.72, samples=5 00:15:50.870 iops : min= 1, max= 186, avg=48.00, stdev=77.52, samples=5 00:15:50.870 lat (msec) : 250=0.40%, 500=27.02%, 750=4.84%, 2000=4.44%, >=2000=63.31% 00:15:50.870 cpu : usr=0.00%, sys=0.79%, ctx=481, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.4%, 2=0.8%, 4=1.6%, 8=3.2%, 16=6.5%, 32=12.9%, >=64=74.6% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 
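In the bandwidth lines above, fio's per= field is the job's approximate share of the aggregate bandwidth of its reporting group. Using the 465MiB job above (avg bw 98856.71 KiB/s, per=3.89%) and the 2479 MiB/s aggregate READ bandwidth reported in the run summary further below, the figure can be cross-checked:

# Cross-check of fio's "per=" field with numbers taken from the blocks above and the
# run summary below: 98856.71 KiB/s out of a 2479 MiB/s aggregate READ bandwidth.
awk 'BEGIN { printf "%.2f%%\n", 98856.71 / 1024 / 2479 * 100 }'   # prints 3.89%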
00:15:50.870 complete : 0=0.0%, 4=99.2%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.8% 00:15:50.870 issued rwts: total=248,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798488: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=42, BW=42.4MiB/s (44.4MB/s)(454MiB/10711msec) 00:15:50.870 slat (usec): min=58, max=2169.8k, avg=23290.41, stdev=193455.52 00:15:50.870 clat (msec): min=134, max=9031, avg=2892.42, stdev=3535.34 00:15:50.870 lat (msec): min=357, max=9033, avg=2915.71, stdev=3542.49 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 359], 5.00th=[ 380], 10.00th=[ 384], 20.00th=[ 464], 00:15:50.870 | 30.00th=[ 609], 40.00th=[ 684], 50.00th=[ 726], 60.00th=[ 768], 00:15:50.870 | 70.00th=[ 2567], 80.00th=[ 8792], 90.00th=[ 8926], 95.00th=[ 8926], 00:15:50.870 | 99.00th=[ 9060], 99.50th=[ 9060], 99.90th=[ 9060], 99.95th=[ 9060], 00:15:50.870 | 99.99th=[ 9060] 00:15:50.870 bw ( KiB/s): min= 2048, max=321536, per=3.30%, avg=83707.88, stdev=119211.83, samples=8 00:15:50.870 iops : min= 2, max= 314, avg=81.62, stdev=116.50, samples=8 00:15:50.870 lat (msec) : 250=0.22%, 500=22.25%, 750=34.14%, 1000=10.79%, >=2000=32.60% 00:15:50.870 cpu : usr=0.02%, sys=1.21%, ctx=519, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.2%, 2=0.4%, 4=0.9%, 8=1.8%, 16=3.5%, 32=7.0%, >=64=86.1% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.7%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.3% 00:15:50.870 issued rwts: total=454,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798489: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=176, BW=176MiB/s (185MB/s)(1900MiB/10794msec) 00:15:50.870 slat (usec): min=41, max=2129.2k, avg=5605.11, stdev=90235.63 00:15:50.870 clat (msec): min=105, max=4542, avg=489.77, stdev=1134.88 00:15:50.870 lat (msec): min=105, max=4544, avg=495.37, stdev=1142.03 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 118], 5.00th=[ 121], 10.00th=[ 127], 20.00th=[ 130], 00:15:50.870 | 30.00th=[ 131], 40.00th=[ 133], 50.00th=[ 140], 60.00th=[ 142], 00:15:50.870 | 70.00th=[ 144], 80.00th=[ 148], 90.00th=[ 253], 95.00th=[ 4463], 00:15:50.870 | 99.00th=[ 4530], 99.50th=[ 4530], 99.90th=[ 4530], 99.95th=[ 4530], 00:15:50.870 | 99.99th=[ 4530] 00:15:50.870 bw ( KiB/s): min=38912, max=1007616, per=23.83%, avg=604842.67, stdev=449117.89, samples=6 00:15:50.870 iops : min= 38, max= 984, avg=590.67, stdev=438.59, samples=6 00:15:50.870 lat (msec) : 250=89.95%, 500=0.84%, >=2000=9.21% 00:15:50.870 cpu : usr=0.06%, sys=1.62%, ctx=1785, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.1%, 2=0.1%, 4=0.2%, 8=0.4%, 16=0.8%, 32=1.7%, >=64=96.7% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.870 issued rwts: total=1900,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798490: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=190, BW=190MiB/s (199MB/s)(2037MiB/10717msec) 00:15:50.870 slat (usec): min=35, max=1996.7k, avg=5208.11, stdev=62868.41 00:15:50.870 clat (msec): min=94, max=2894, avg=512.89, stdev=569.52 00:15:50.870 lat (msec): min=94, max=4653, 
avg=518.10, stdev=576.74 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 110], 5.00th=[ 116], 10.00th=[ 118], 20.00th=[ 209], 00:15:50.870 | 30.00th=[ 226], 40.00th=[ 234], 50.00th=[ 255], 60.00th=[ 330], 00:15:50.870 | 70.00th=[ 514], 80.00th=[ 634], 90.00th=[ 1318], 95.00th=[ 2265], 00:15:50.870 | 99.00th=[ 2333], 99.50th=[ 2366], 99.90th=[ 2869], 99.95th=[ 2903], 00:15:50.870 | 99.99th=[ 2903] 00:15:50.870 bw ( KiB/s): min= 1580, max=892928, per=11.01%, avg=279344.79, stdev=265665.79, samples=14 00:15:50.870 iops : min= 1, max= 872, avg=272.71, stdev=259.50, samples=14 00:15:50.870 lat (msec) : 100=0.34%, 250=47.52%, 500=21.60%, 750=16.94%, 1000=0.79% 00:15:50.870 lat (msec) : 2000=6.14%, >=2000=6.68% 00:15:50.870 cpu : usr=0.11%, sys=2.26%, ctx=2062, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.1%, 2=0.1%, 4=0.2%, 8=0.4%, 16=0.8%, 32=1.6%, >=64=96.9% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.870 issued rwts: total=2037,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798491: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=65, BW=65.0MiB/s (68.2MB/s)(703MiB/10810msec) 00:15:50.870 slat (usec): min=37, max=2168.5k, avg=15172.86, stdev=154081.99 00:15:50.870 clat (msec): min=139, max=9065, avg=1886.10, stdev=3090.73 00:15:50.870 lat (msec): min=244, max=9070, avg=1901.27, stdev=3100.95 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 249], 5.00th=[ 253], 10.00th=[ 253], 20.00th=[ 257], 00:15:50.870 | 30.00th=[ 259], 40.00th=[ 300], 50.00th=[ 376], 60.00th=[ 443], 00:15:50.870 | 70.00th=[ 502], 80.00th=[ 2702], 90.00th=[ 8792], 95.00th=[ 8926], 00:15:50.870 | 99.00th=[ 9060], 99.50th=[ 9060], 99.90th=[ 9060], 99.95th=[ 9060], 00:15:50.870 | 99.99th=[ 9060] 00:15:50.870 bw ( KiB/s): min= 8192, max=505856, per=5.80%, avg=147200.00, stdev=184516.37, samples=8 00:15:50.870 iops : min= 8, max= 494, avg=143.75, stdev=180.19, samples=8 00:15:50.870 lat (msec) : 250=1.71%, 500=68.42%, 750=7.82%, >=2000=22.05% 00:15:50.870 cpu : usr=0.01%, sys=1.38%, ctx=900, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.1%, 2=0.3%, 4=0.6%, 8=1.1%, 16=2.3%, 32=4.6%, >=64=91.0% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.870 issued rwts: total=703,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798492: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=77, BW=77.9MiB/s (81.7MB/s)(991MiB/12715msec) 00:15:50.870 slat (usec): min=45, max=2134.6k, avg=10659.30, stdev=130353.44 00:15:50.870 clat (msec): min=262, max=8802, avg=1382.56, stdev=2696.59 00:15:50.870 lat (msec): min=263, max=8804, avg=1393.22, stdev=2706.01 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 264], 5.00th=[ 266], 10.00th=[ 268], 20.00th=[ 268], 00:15:50.870 | 30.00th=[ 271], 40.00th=[ 271], 50.00th=[ 275], 60.00th=[ 279], 00:15:50.870 | 70.00th=[ 284], 80.00th=[ 296], 90.00th=[ 8557], 95.00th=[ 8658], 00:15:50.870 | 99.00th=[ 8792], 99.50th=[ 8792], 99.90th=[ 8792], 99.95th=[ 8792], 00:15:50.870 | 99.99th=[ 8792] 00:15:50.870 bw ( KiB/s): min= 2048, max=483328, per=8.71%, avg=221066.88, stdev=218999.88, samples=8 
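Because the per-job blocks are interleaved with timestamps in this console output, a quick way to skim the headline numbers from a saved copy of the log is to grep only the job headers, per-job read lines, and the final aggregate READ line; "console.log" below is just a placeholder for wherever this console text was saved.

# Skim headline per-job and aggregate lines from a saved copy of this console output.
grep -E 'job[0-9]+: \(groupid=|read: IOPS=|READ: bw=' console.log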
00:15:50.870 iops : min= 2, max= 472, avg=215.88, stdev=213.85, samples=8 00:15:50.870 lat (msec) : 500=83.65%, >=2000=16.35% 00:15:50.870 cpu : usr=0.02%, sys=1.56%, ctx=868, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.1%, 2=0.2%, 4=0.4%, 8=0.8%, 16=1.6%, 32=3.2%, >=64=93.6% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.870 issued rwts: total=991,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798493: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=34, BW=34.7MiB/s (36.3MB/s)(441MiB/12723msec) 00:15:50.870 slat (usec): min=429, max=2101.9k, avg=23975.38, stdev=194641.25 00:15:50.870 clat (msec): min=460, max=9160, avg=2995.94, stdev=3577.17 00:15:50.870 lat (msec): min=463, max=9162, avg=3019.91, stdev=3585.04 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 468], 5.00th=[ 481], 10.00th=[ 498], 20.00th=[ 542], 00:15:50.870 | 30.00th=[ 617], 40.00th=[ 701], 50.00th=[ 718], 60.00th=[ 735], 00:15:50.870 | 70.00th=[ 4279], 80.00th=[ 8658], 90.00th=[ 8926], 95.00th=[ 9060], 00:15:50.870 | 99.00th=[ 9194], 99.50th=[ 9194], 99.90th=[ 9194], 99.95th=[ 9194], 00:15:50.870 | 99.99th=[ 9194] 00:15:50.870 bw ( KiB/s): min= 2048, max=190464, per=3.62%, avg=91867.43, stdev=85946.31, samples=7 00:15:50.870 iops : min= 2, max= 186, avg=89.71, stdev=83.93, samples=7 00:15:50.870 lat (msec) : 500=10.88%, 750=55.33%, 1000=0.91%, >=2000=32.88% 00:15:50.870 cpu : usr=0.02%, sys=0.78%, ctx=761, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.2%, 2=0.5%, 4=0.9%, 8=1.8%, 16=3.6%, 32=7.3%, >=64=85.7% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=99.7%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.3% 00:15:50.870 issued rwts: total=441,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.870 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.870 job5: (groupid=0, jobs=1): err= 0: pid=1798494: Fri Sep 27 15:20:51 2024 00:15:50.870 read: IOPS=198, BW=199MiB/s (208MB/s)(2534MiB/12744msec) 00:15:50.870 slat (usec): min=42, max=2028.6k, avg=4178.03, stdev=69171.85 00:15:50.870 clat (msec): min=98, max=4397, avg=528.68, stdev=1070.55 00:15:50.870 lat (msec): min=98, max=4398, avg=532.86, stdev=1074.93 00:15:50.870 clat percentiles (msec): 00:15:50.870 | 1.00th=[ 100], 5.00th=[ 101], 10.00th=[ 101], 20.00th=[ 111], 00:15:50.870 | 30.00th=[ 118], 40.00th=[ 125], 50.00th=[ 126], 60.00th=[ 126], 00:15:50.870 | 70.00th=[ 127], 80.00th=[ 180], 90.00th=[ 2467], 95.00th=[ 3138], 00:15:50.870 | 99.00th=[ 4396], 99.50th=[ 4396], 99.90th=[ 4396], 99.95th=[ 4396], 00:15:50.870 | 99.99th=[ 4396] 00:15:50.870 bw ( KiB/s): min= 1622, max=1155072, per=17.65%, avg=448077.45, stdev=500521.97, samples=11 00:15:50.870 iops : min= 1, max= 1128, avg=437.45, stdev=488.89, samples=11 00:15:50.870 lat (msec) : 100=6.51%, 250=73.80%, 500=0.04%, 750=6.35%, 1000=2.68% 00:15:50.870 lat (msec) : 2000=0.04%, >=2000=10.58% 00:15:50.870 cpu : usr=0.09%, sys=1.89%, ctx=2738, majf=0, minf=32769 00:15:50.870 IO depths : 1=0.1%, 2=0.1%, 4=0.2%, 8=0.3%, 16=0.6%, 32=1.3%, >=64=97.5% 00:15:50.870 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.870 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.871 issued rwts: total=2534,0,0,0 short=0,0,0,0 
dropped=0,0,0,0 00:15:50.871 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.871 job5: (groupid=0, jobs=1): err= 0: pid=1798495: Fri Sep 27 15:20:51 2024 00:15:50.871 read: IOPS=149, BW=149MiB/s (157MB/s)(1496MiB/10011msec) 00:15:50.871 slat (usec): min=40, max=2119.4k, avg=6681.89, stdev=93178.75 00:15:50.871 clat (msec): min=10, max=6863, avg=796.23, stdev=1759.44 00:15:50.871 lat (msec): min=11, max=6865, avg=802.91, stdev=1766.30 00:15:50.871 clat percentiles (msec): 00:15:50.871 | 1.00th=[ 22], 5.00th=[ 66], 10.00th=[ 94], 20.00th=[ 96], 00:15:50.871 | 30.00th=[ 97], 40.00th=[ 115], 50.00th=[ 211], 60.00th=[ 241], 00:15:50.871 | 70.00th=[ 284], 80.00th=[ 617], 90.00th=[ 1083], 95.00th=[ 6745], 00:15:50.871 | 99.00th=[ 6879], 99.50th=[ 6879], 99.90th=[ 6879], 99.95th=[ 6879], 00:15:50.871 | 99.99th=[ 6879] 00:15:50.871 bw ( KiB/s): min=10219, max=541636, per=7.74%, avg=196469.88, stdev=218262.30, samples=8 00:15:50.871 iops : min= 9, max= 528, avg=191.62, stdev=213.05, samples=8 00:15:50.871 lat (msec) : 20=0.87%, 50=2.74%, 100=34.02%, 250=28.94%, 500=11.43% 00:15:50.871 lat (msec) : 750=7.82%, 1000=3.81%, 2000=0.53%, >=2000=9.83% 00:15:50.871 cpu : usr=0.07%, sys=1.56%, ctx=1569, majf=0, minf=32769 00:15:50.871 IO depths : 1=0.1%, 2=0.1%, 4=0.3%, 8=0.5%, 16=1.1%, 32=2.1%, >=64=95.8% 00:15:50.871 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.871 complete : 0=0.0%, 4=99.9%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:15:50.871 issued rwts: total=1496,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.871 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.871 job5: (groupid=0, jobs=1): err= 0: pid=1798496: Fri Sep 27 15:20:51 2024 00:15:50.871 read: IOPS=10, BW=10.0MiB/s (10.5MB/s)(108MiB/10766msec) 00:15:50.871 slat (usec): min=811, max=2026.4k, avg=98388.37, stdev=417897.76 00:15:50.871 clat (msec): min=138, max=10762, avg=7337.98, stdev=2814.56 00:15:50.871 lat (msec): min=2116, max=10765, avg=7436.37, stdev=2745.43 00:15:50.871 clat percentiles (msec): 00:15:50.871 | 1.00th=[ 2123], 5.00th=[ 2265], 10.00th=[ 2265], 20.00th=[ 4329], 00:15:50.871 | 30.00th=[ 6409], 40.00th=[ 6544], 50.00th=[ 8557], 60.00th=[ 8557], 00:15:50.871 | 70.00th=[ 8658], 80.00th=[10671], 90.00th=[10805], 95.00th=[10805], 00:15:50.871 | 99.00th=[10805], 99.50th=[10805], 99.90th=[10805], 99.95th=[10805], 00:15:50.871 | 99.99th=[10805] 00:15:50.871 lat (msec) : 250=0.93%, >=2000=99.07% 00:15:50.871 cpu : usr=0.04%, sys=0.97%, ctx=70, majf=0, minf=27649 00:15:50.871 IO depths : 1=0.9%, 2=1.9%, 4=3.7%, 8=7.4%, 16=14.8%, 32=29.6%, >=64=41.7% 00:15:50.871 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.871 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.871 issued rwts: total=108,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.871 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.871 job5: (groupid=0, jobs=1): err= 0: pid=1798497: Fri Sep 27 15:20:51 2024 00:15:50.871 read: IOPS=60, BW=60.0MiB/s (63.0MB/s)(648MiB/10792msec) 00:15:50.871 slat (usec): min=36, max=1998.8k, avg=16435.34, stdev=145921.90 00:15:50.871 clat (msec): min=138, max=8366, avg=2034.07, stdev=2045.19 00:15:50.871 lat (msec): min=275, max=8423, avg=2050.51, stdev=2047.95 00:15:50.871 clat percentiles (msec): 00:15:50.871 | 1.00th=[ 275], 5.00th=[ 288], 10.00th=[ 456], 20.00th=[ 542], 00:15:50.871 | 30.00th=[ 625], 40.00th=[ 693], 50.00th=[ 768], 60.00th=[ 2198], 00:15:50.871 | 70.00th=[ 2500], 
80.00th=[ 2601], 90.00th=[ 6141], 95.00th=[ 6208], 00:15:50.871 | 99.00th=[ 6275], 99.50th=[ 6275], 99.90th=[ 8356], 99.95th=[ 8356], 00:15:50.871 | 99.99th=[ 8356] 00:15:50.871 bw ( KiB/s): min= 2048, max=266240, per=4.20%, avg=106496.00, stdev=98009.63, samples=10 00:15:50.871 iops : min= 2, max= 260, avg=104.00, stdev=95.71, samples=10 00:15:50.871 lat (msec) : 250=0.15%, 500=17.59%, 750=30.71%, 1000=8.80%, 2000=1.23% 00:15:50.871 lat (msec) : >=2000=41.51% 00:15:50.871 cpu : usr=0.01%, sys=1.25%, ctx=967, majf=0, minf=32769 00:15:50.871 IO depths : 1=0.2%, 2=0.3%, 4=0.6%, 8=1.2%, 16=2.5%, 32=4.9%, >=64=90.3% 00:15:50.871 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.871 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.871 issued rwts: total=648,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.871 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.871 job5: (groupid=0, jobs=1): err= 0: pid=1798498: Fri Sep 27 15:20:51 2024 00:15:50.871 read: IOPS=6, BW=6981KiB/s (7148kB/s)(74.0MiB/10855msec) 00:15:50.871 slat (usec): min=879, max=2162.0k, avg=144872.21, stdev=512333.42 00:15:50.871 clat (msec): min=133, max=10852, avg=9002.08, stdev=3100.13 00:15:50.871 lat (msec): min=2183, max=10853, avg=9146.95, stdev=2925.58 00:15:50.871 clat percentiles (msec): 00:15:50.871 | 1.00th=[ 133], 5.00th=[ 2232], 10.00th=[ 4279], 20.00th=[ 6477], 00:15:50.871 | 30.00th=[10537], 40.00th=[10671], 50.00th=[10671], 60.00th=[10805], 00:15:50.871 | 70.00th=[10805], 80.00th=[10805], 90.00th=[10805], 95.00th=[10805], 00:15:50.871 | 99.00th=[10805], 99.50th=[10805], 99.90th=[10805], 99.95th=[10805], 00:15:50.871 | 99.99th=[10805] 00:15:50.871 lat (msec) : 250=1.35%, >=2000=98.65% 00:15:50.871 cpu : usr=0.00%, sys=0.75%, ctx=103, majf=0, minf=18945 00:15:50.871 IO depths : 1=1.4%, 2=2.7%, 4=5.4%, 8=10.8%, 16=21.6%, 32=43.2%, >=64=14.9% 00:15:50.871 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:50.871 complete : 0=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=100.0% 00:15:50.871 issued rwts: total=74,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.871 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.871 job5: (groupid=0, jobs=1): err= 0: pid=1798499: Fri Sep 27 15:20:51 2024 00:15:50.871 read: IOPS=60, BW=60.8MiB/s (63.8MB/s)(655MiB/10767msec) 00:15:50.871 slat (usec): min=47, max=2084.3k, avg=16221.93, stdev=139026.12 00:15:50.871 clat (msec): min=138, max=4980, avg=1945.55, stdev=1643.97 00:15:50.871 lat (msec): min=518, max=4982, avg=1961.77, stdev=1644.90 00:15:50.871 clat percentiles (msec): 00:15:50.871 | 1.00th=[ 523], 5.00th=[ 531], 10.00th=[ 542], 20.00th=[ 592], 00:15:50.871 | 30.00th=[ 617], 40.00th=[ 634], 50.00th=[ 894], 60.00th=[ 1301], 00:15:50.871 | 70.00th=[ 3205], 80.00th=[ 3339], 90.00th=[ 4665], 95.00th=[ 4866], 00:15:50.871 | 99.00th=[ 5000], 99.50th=[ 5000], 99.90th=[ 5000], 99.95th=[ 5000], 00:15:50.871 | 99.99th=[ 5000] 00:15:50.871 bw ( KiB/s): min= 2048, max=245760, per=4.25%, avg=107897.20, stdev=91576.10, samples=10 00:15:50.871 iops : min= 2, max= 240, avg=105.20, stdev=89.51, samples=10 00:15:50.871 lat (msec) : 250=0.15%, 500=0.15%, 750=48.40%, 1000=5.65%, 2000=5.95% 00:15:50.871 lat (msec) : >=2000=39.69% 00:15:50.871 cpu : usr=0.03%, sys=1.11%, ctx=887, majf=0, minf=32769 00:15:50.871 IO depths : 1=0.2%, 2=0.3%, 4=0.6%, 8=1.2%, 16=2.4%, 32=4.9%, >=64=90.4% 00:15:50.871 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 
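After the aggregate run summary and per-device disk statistics that follow, the test script resumes: it syncs, then for each of the six subsystems disconnects the NVMe-oF controller, polls lsblk until the block device with the matching serial disappears, and deletes the subsystem over the SPDK RPC interface. The shell trace below shows this loop running through the script's helpers; a rough, simplified reconstruction (without the helpers' error handling and timeouts) looks like the sketch below.

# Rough reconstruction of the teardown loop visible in the trace below; the actual
# srq_overwhelm.sh drives this through its rpc_cmd/waitforserial_disconnect helpers.
sync
for i in $(seq 0 5); do
    nvme disconnect -n "nqn.2016-06.io.spdk:cnode${i}"
    serial=$(printf 'SPDK%014d' "$i")
    # wait until no block device with this serial is visible any more
    while lsblk -l -o NAME,SERIAL | grep -q -w "$serial"; do
        sleep 1
    done
    # the test issues this through its rpc_cmd wrapper; the rpc.py path is an assumption
    ./scripts/rpc.py nvmf_delete_subsystem "nqn.2016-06.io.spdk:cnode${i}"
done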
00:15:50.871 complete : 0=0.0%, 4=99.8%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.2% 00:15:50.871 issued rwts: total=655,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:50.871 latency : target=0, window=0, percentile=100.00%, depth=128 00:15:50.871 00:15:50.871 Run status group 0 (all jobs): 00:15:50.871 READ: bw=2479MiB/s (2599MB/s), 320KiB/s-199MiB/s (328kB/s-208MB/s), io=31.4GiB (33.7GB), run=10011-12960msec 00:15:50.871 00:15:50.871 Disk stats (read/write): 00:15:50.871 nvme0n1: ios=45728/0, merge=0/0, ticks=5834077/0, in_queue=5834077, util=98.19% 00:15:50.871 nvme1n1: ios=33900/0, merge=0/0, ticks=12391998/0, in_queue=12391998, util=98.70% 00:15:50.871 nvme2n1: ios=22406/0, merge=0/0, ticks=11342762/0, in_queue=11342762, util=98.81% 00:15:50.871 nvme3n1: ios=16786/0, merge=0/0, ticks=8189103/0, in_queue=8189103, util=98.84% 00:15:50.871 nvme4n1: ios=37437/0, merge=0/0, ticks=10368711/0, in_queue=10368711, util=99.15% 00:15:50.871 nvme5n1: ios=98302/0, merge=0/0, ticks=10189921/0, in_queue=10189921, util=99.21% 00:15:50.871 15:20:51 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@38 -- # sync 00:15:50.871 15:20:51 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # seq 0 5 00:15:50.871 15:20:51 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # for i in $(seq 0 5) 00:15:50.871 15:20:51 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@41 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode0 00:15:50.871 NQN:nqn.2016-06.io.spdk:cnode0 disconnected 1 controller(s) 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@42 -- # waitforserial_disconnect SPDK00000000000000 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1219 -- # local i=0 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # grep -q -w SPDK00000000000000 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # grep -q -w SPDK00000000000000 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1231 -- # return 0 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@43 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode0 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # for i in $(seq 0 5) 00:15:50.871 15:20:52 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@41 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode1 00:15:51.809 NQN:nqn.2016-06.io.spdk:cnode1 disconnected 1 controller(s) 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@42 -- # waitforserial_disconnect SPDK00000000000001 00:15:51.809 15:20:53 
nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1219 -- # local i=0 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # grep -q -w SPDK00000000000001 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # grep -q -w SPDK00000000000001 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1231 -- # return 0 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@43 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # for i in $(seq 0 5) 00:15:51.809 15:20:53 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@41 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode2 00:15:52.747 NQN:nqn.2016-06.io.spdk:cnode2 disconnected 1 controller(s) 00:15:52.747 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@42 -- # waitforserial_disconnect SPDK00000000000002 00:15:52.747 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1219 -- # local i=0 00:15:52.747 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:15:52.747 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # grep -q -w SPDK00000000000002 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # grep -q -w SPDK00000000000002 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1231 -- # return 0 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@43 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode2 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # for i in $(seq 0 5) 00:15:53.006 15:20:54 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@41 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode3 00:15:53.945 NQN:nqn.2016-06.io.spdk:cnode3 disconnected 1 controller(s) 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@42 -- # waitforserial_disconnect SPDK00000000000003 
00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1219 -- # local i=0 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # grep -q -w SPDK00000000000003 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # grep -q -w SPDK00000000000003 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1231 -- # return 0 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@43 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode3 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # for i in $(seq 0 5) 00:15:53.945 15:20:55 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@41 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode4 00:15:54.886 NQN:nqn.2016-06.io.spdk:cnode4 disconnected 1 controller(s) 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@42 -- # waitforserial_disconnect SPDK00000000000004 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1219 -- # local i=0 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # grep -q -w SPDK00000000000004 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # grep -q -w SPDK00000000000004 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1231 -- # return 0 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@43 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode4 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@40 -- # for i in $(seq 0 5) 00:15:54.886 15:20:56 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@41 -- # nvme disconnect -n nqn.2016-06.io.spdk:cnode5 00:15:55.823 NQN:nqn.2016-06.io.spdk:cnode5 disconnected 1 controller(s) 00:15:55.823 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@42 -- # waitforserial_disconnect 
SPDK00000000000005 00:15:55.823 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1219 -- # local i=0 00:15:55.823 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # lsblk -o NAME,SERIAL 00:15:55.823 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1220 -- # grep -q -w SPDK00000000000005 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # lsblk -l -o NAME,SERIAL 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1227 -- # grep -q -w SPDK00000000000005 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1231 -- # return 0 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@43 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode5 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@561 -- # xtrace_disable 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- target/srq_overwhelm.sh@48 -- # nvmftestfini 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@331 -- # nvmfcleanup 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@99 -- # sync 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@102 -- # set +e 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@103 -- # for i in {1..20} 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:15:56.082 rmmod nvme_rdma 00:15:56.082 rmmod nvme_fabrics 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@106 -- # set -e 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@107 -- # return 0 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@332 -- # '[' -n 1797268 ']' 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@333 -- # killprocess 1797268 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@950 -- # '[' -z 1797268 ']' 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@954 -- # kill -0 1797268 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@955 -- # uname 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@956 -- # 
ps --no-headers -o comm= 1797268 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1797268' 00:15:56.082 killing process with pid 1797268 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@969 -- # kill 1797268 00:15:56.082 15:20:57 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@974 -- # wait 1797268 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@338 -- # nvmf_fini 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@264 -- # local dev 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@267 -- # remove_target_ns 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@22 -- # _remove_target_ns 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@268 -- # delete_main_bridge 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@130 -- # return 0 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:15:56.342 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- 
nvmf/setup.sh@222 -- # [[ -n '' ]] 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@41 -- # _dev=0 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@41 -- # dev_map=() 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/setup.sh@284 -- # iptr 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@538 -- # iptables-save 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- nvmf/common.sh@538 -- # iptables-restore 00:15:56.602 00:15:56.602 real 0m35.463s 00:15:56.602 user 1m55.501s 00:15:56.602 sys 0m16.021s 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_srq_overwhelm -- common/autotest_common.sh@10 -- # set +x 00:15:56.602 ************************************ 00:15:56.602 END TEST nvmf_srq_overwhelm 00:15:56.602 ************************************ 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@65 -- # run_test nvmf_shutdown /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/shutdown.sh --transport=rdma 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:15:56.602 ************************************ 00:15:56.602 START TEST nvmf_shutdown 00:15:56.602 ************************************ 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/shutdown.sh --transport=rdma 00:15:56.602 * Looking for test storage... 
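Between the two tests, the trace above runs nvmftestfini: it syncs, retries unloading nvme-rdma/nvme-fabrics, checks that PID 1797268 still names an SPDK reactor before killing it, flushes the test addresses from both mlx ports, and restores iptables minus the SPDK_NVMF rules. The following is a condensed, illustrative version of that sequence; the real logic is spread across nvmf/common.sh and nvmf/setup.sh and handles more cases than shown here.

    # Condensed sketch of the nvmftestfini path traced above (illustrative only).
    nvmf_teardown_sketch() {
        local nvmfpid=$1              # e.g. 1797268 in the log above

        sync
        set +e
        # nvme-rdma refuses to unload while controllers still exist, hence the retries.
        for _ in {1..20}; do
            modprobe -v -r nvme-rdma && modprobe -v -r nvme-fabrics && break
            sleep 1
        done
        set -e

        # Only kill the PID if it still belongs to an SPDK reactor (guards against PID reuse).
        if [[ "$(ps --no-headers -o comm= "$nvmfpid")" == reactor_0 ]]; then
            kill "$nvmfpid"
            wait "$nvmfpid"           # valid here only because the target was launched from this shell
        fi

        # Drop the test IPs from the physical ports and strip only the SPDK_NVMF
        # iptables rules, leaving the rest of the firewall untouched.
        ip addr flush dev mlx_0_0
        ip addr flush dev mlx_0_1
        iptables-save | grep -v SPDK_NVMF | iptables-restore
    }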
00:15:56.602 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1681 -- # lcov --version 00:15:56.602 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@333 -- # local ver1 ver1_l 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@334 -- # local ver2 ver2_l 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@336 -- # IFS=.-: 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@336 -- # read -ra ver1 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@337 -- # IFS=.-: 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@337 -- # read -ra ver2 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@338 -- # local 'op=<' 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@340 -- # ver1_l=2 00:15:56.862 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@341 -- # ver2_l=1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@344 -- # case "$op" in 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@345 -- # : 1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@364 -- # (( v = 0 )) 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@365 -- # decimal 1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@353 -- # local d=1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@355 -- # echo 1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@365 -- # ver1[v]=1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@366 -- # decimal 2 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@353 -- # local d=2 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@355 -- # echo 2 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@366 -- # ver2[v]=2 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@368 -- # return 0 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:15:56.863 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:56.863 --rc genhtml_branch_coverage=1 00:15:56.863 --rc genhtml_function_coverage=1 00:15:56.863 --rc genhtml_legend=1 00:15:56.863 --rc geninfo_all_blocks=1 00:15:56.863 --rc geninfo_unexecuted_blocks=1 00:15:56.863 00:15:56.863 ' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:15:56.863 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:56.863 --rc genhtml_branch_coverage=1 00:15:56.863 --rc genhtml_function_coverage=1 00:15:56.863 --rc genhtml_legend=1 00:15:56.863 --rc geninfo_all_blocks=1 00:15:56.863 --rc geninfo_unexecuted_blocks=1 00:15:56.863 00:15:56.863 ' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:15:56.863 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:56.863 --rc genhtml_branch_coverage=1 00:15:56.863 --rc genhtml_function_coverage=1 00:15:56.863 --rc genhtml_legend=1 00:15:56.863 --rc geninfo_all_blocks=1 00:15:56.863 --rc geninfo_unexecuted_blocks=1 00:15:56.863 00:15:56.863 ' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:15:56.863 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:56.863 --rc genhtml_branch_coverage=1 00:15:56.863 --rc genhtml_function_coverage=1 00:15:56.863 --rc genhtml_legend=1 00:15:56.863 --rc geninfo_all_blocks=1 00:15:56.863 --rc geninfo_unexecuted_blocks=1 00:15:56.863 00:15:56.863 ' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@10 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@7 -- # 
uname -s 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@15 -- # shopt -s extglob 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- paths/export.sh@5 -- # export PATH 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@50 -- # : 0 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:15:56.863 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown 
-- nvmf/common.sh@35 -- # '[' -n '' ']' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:15:56.863 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- nvmf/common.sh@54 -- # have_pci_nics=0 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@12 -- # MALLOC_BDEV_SIZE=64 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@13 -- # MALLOC_BLOCK_SIZE=512 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@162 -- # run_test nvmf_shutdown_tc1 nvmf_shutdown_tc1 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@10 -- # set +x 00:15:56.864 ************************************ 00:15:56.864 START TEST nvmf_shutdown_tc1 00:15:56.864 ************************************ 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@1125 -- # nvmf_shutdown_tc1 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@75 -- # starttarget 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@16 -- # nvmftestinit 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@292 -- # prepare_net_devs 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@254 -- # local -g is_hw=no 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@256 -- # remove_target_ns 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@125 -- # xtrace_disable 00:15:56.864 15:20:58 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@131 -- # pci_devs=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@131 -- # local -a pci_devs 00:16:04.992 15:21:05 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@132 -- # pci_net_devs=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@133 -- # pci_drivers=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@133 -- # local -A pci_drivers 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@135 -- # net_devs=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@135 -- # local -ga net_devs 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@136 -- # e810=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@136 -- # local -ga e810 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@137 -- # x722=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@137 -- # local -ga x722 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@138 -- # mlx=() 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@138 -- # local -ga mlx 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:16:04.992 15:21:05 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:16:04.992 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:04.992 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:16:04.993 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:04.993 15:21:05 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:16:04.993 Found net devices under 0000:18:00.0: mlx_0_0 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:16:04.993 Found net devices under 0000:18:00.1: mlx_0_1 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@249 -- # get_rdma_if_list 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@75 -- # rdma_devs=() 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@89 -- # 
continue 2 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@89 -- # continue 2 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@258 -- # is_hw=yes 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@61 -- # uname 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@65 -- # modprobe ib_cm 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@66 -- # modprobe ib_core 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@67 -- # modprobe ib_umad 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@69 -- # modprobe iw_cm 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 
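With the mlx5 ports discovered and the RDMA module stack loaded, setup_interfaces starts from the integer pool 0x0a000001 (167772161) and, as the entries below show, turns it into the 10.0.0.1/10.0.0.2 pair written onto mlx_0_0 and mlx_0_1 (for the rdma transport both ports end up keyed as targets in dev_map, hence key_initiator=target1 in the trace). The helpers below sketch that conversion and assignment; the names and the /24 prefix mirror the trace, but the implementation details of test/nvmf/setup.sh are simplified and assumed.

    # Sketch of the integer-pool -> dotted-quad assignment traced below (illustrative).
    val_to_ip() {
        local val=$1
        # 0x0a000001 -> "10.0.0.1", 0x0a000002 -> "10.0.0.2", ...
        printf '%u.%u.%u.%u\n' \
            $((val >> 24 & 0xff)) $((val >> 16 & 0xff)) \
            $((val >> 8 & 0xff)) $((val & 0xff))
    }

    set_ip_sketch() {
        local dev=$1 ip
        ip=$(val_to_ip "$2")
        ip addr add "$ip/24" dev "$dev"                   # e.g. 10.0.0.1/24 on mlx_0_0
        echo "$ip" | tee "/sys/class/net/$dev/ifalias"    # read back later by get_ip_address
        ip link set "$dev" up
    }

    ip_pool=$((0x0a000001))
    set_ip_sketch mlx_0_0 "$ip_pool"           # 10.0.0.1
    set_ip_sketch mlx_0_1 "$((ip_pool + 1))"   # 10.0.0.2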
00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@27 -- # local -gA dev_map 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@28 -- # local -g _dev 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@44 -- # ips=() 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:16:04.993 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@58 -- # key_initiator=target1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@11 -- # local val=167772161 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@207 -- 
# ip=10.0.0.1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:16:04.994 10.0.0.1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@11 -- # local val=167772162 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:16:04.994 10.0.0.2 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@217 -- # 
eval ' ip link set mlx_0_1 up' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@38 -- # ping_ips 1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:04.994 15:21:05 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:04.994 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:16:04.994 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:16:04.994 00:16:04.994 --- 10.0.0.2 ping statistics --- 00:16:04.994 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:04.994 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:16:04.994 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- 
nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:04.995 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:16:04.995 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.030 ms 00:16:04.995 00:16:04.995 --- 10.0.0.2 ping statistics --- 00:16:04.995 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:04.995 rtt min/avg/max/mdev = 0.030/0.030/0.030/0.000 ms 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@98 -- # (( pair++ )) 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@266 -- # return 0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 
-- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- 
nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:04.995 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:04.996 15:21:05 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@19 -- # nvmfappstart -m 0x1E 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@324 -- # nvmfpid=1804448 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@325 -- # waitforlisten 1804448 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@831 -- # '[' -z 1804448 ']' 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:04.996 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:04.996 15:21:05 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:04.996 [2024-09-27 15:21:05.757072] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
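The address discovery traced above reduces to reading the interface alias that the earlier setup stage stored on each mlx device and ping-testing the result. A minimal standalone sketch of that lookup, assuming the mlx_0_0/mlx_0_1 device names seen in this run (the get_ifalias_ip helper is illustrative, not part of setup.sh):

# Read the IP recorded in the interface alias, as the cat /sys/class/net/<dev>/ifalias steps do.
get_ifalias_ip() {
    local dev=$1
    cat "/sys/class/net/${dev}/ifalias"
}

NVMF_FIRST_TARGET_IP=$(get_ifalias_ip mlx_0_1)    # 10.0.0.2 in this run
NVMF_SECOND_TARGET_IP=$(get_ifalias_ip mlx_0_0)   # 10.0.0.1 in this run
ping -c 1 "$NVMF_FIRST_TARGET_IP"                 # reachability check, as setup.sh@92 does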
00:16:04.996 [2024-09-27 15:21:05.757136] [ DPDK EAL parameters: nvmf -c 0x1E --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:04.996 [2024-09-27 15:21:05.844443] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:16:04.996 [2024-09-27 15:21:05.933207] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:16:04.996 [2024-09-27 15:21:05.933249] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:16:04.996 [2024-09-27 15:21:05.933259] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:16:04.996 [2024-09-27 15:21:05.933267] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:16:04.996 [2024-09-27 15:21:05.933275] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:16:04.996 [2024-09-27 15:21:05.933364] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:16:04.996 [2024-09-27 15:21:05.933381] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:16:04.996 [2024-09-27 15:21:05.933486] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:16:04.996 [2024-09-27 15:21:05.933488] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@864 -- # return 0 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@21 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:04.996 [2024-09-27 15:21:06.699116] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1af17a0/0x1af5c90) succeed. 00:16:04.996 [2024-09-27 15:21:06.709776] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1af2de0/0x1b37330) succeed. 
00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@23 -- # num_subsystems=({1..10}) 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@25 -- # timing_enter create_subsystems 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:04.996 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@27 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@29 -- # cat 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- 
target/shutdown.sh@36 -- # rpc_cmd 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:05.257 15:21:06 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:05.257 Malloc1 00:16:05.257 [2024-09-27 15:21:06.948871] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:16:05.257 Malloc2 00:16:05.257 Malloc3 00:16:05.257 Malloc4 00:16:05.517 Malloc5 00:16:05.517 Malloc6 00:16:05.517 Malloc7 00:16:05.517 Malloc8 00:16:05.517 Malloc9 00:16:05.517 Malloc10 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@37 -- # timing_exit create_subsystems 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@79 -- # perfpid=1804775 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@80 -- # waitforlisten 1804775 /var/tmp/bdevperf.sock 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@831 -- # '[' -z 1804775 ']' 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@78 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -m 0x1 -i 1 -r /var/tmp/bdevperf.sock --json /dev/fd/63 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@78 -- # gen_nvmf_target_json 1 2 3 4 5 6 7 8 9 10 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:16:05.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 
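The rpcs.txt batch assembled by the create_subsystems loop above is not echoed into the log; the following is a plausible reconstruction of a single iteration using standard SPDK rpc.py calls and the names visible in the surrounding output (the Malloc bdev size, block size and serial number are assumptions):

# Illustrative equivalent of one loop iteration (i=1); the real test batches these lines via rpcs.txt.
./scripts/rpc.py bdev_malloc_create -b Malloc1 128 512
./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK1
./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1
./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420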
00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@368 -- # config=() 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@368 -- # local subsystem config 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.777 { 00:16:05.777 "params": { 00:16:05.777 "name": "Nvme$subsystem", 00:16:05.777 "trtype": "$TEST_TRANSPORT", 00:16:05.777 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.777 "adrfam": "ipv4", 00:16:05.777 "trsvcid": "$NVMF_PORT", 00:16:05.777 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.777 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.777 "hdgst": ${hdgst:-false}, 00:16:05.777 "ddgst": ${ddgst:-false} 00:16:05.777 }, 00:16:05.777 "method": "bdev_nvme_attach_controller" 00:16:05.777 } 00:16:05.777 EOF 00:16:05.777 )") 00:16:05.777 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- 
nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 [2024-09-27 15:21:07.467510] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
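The config generation interleaved through this stretch of the trace follows one bash pattern: accumulate a JSON fragment per subsystem with a command-substituted here-doc, then join the fragments with commas. A stripped-down sketch of that pattern, keeping only the fields that appear in the trace and substituting the concrete values from this run (the real gen_nvmf_target_json in nvmf/common.sh parameterizes them via $TEST_TRANSPORT, $NVMF_FIRST_TARGET_IP and $NVMF_PORT, and pretty-prints the result with jq):

config=()
for subsystem in "$@"; do
    config+=("$(cat <<EOF
{
  "params": {
    "name": "Nvme$subsystem",
    "trtype": "rdma",
    "traddr": "10.0.0.2",
    "adrfam": "ipv4",
    "trsvcid": "4420",
    "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem",
    "hostnqn": "nqn.2016-06.io.spdk:host$subsystem",
    "hdgst": false,
    "ddgst": false
  },
  "method": "bdev_nvme_attach_controller"
}
EOF
)")
done
IFS=','
printf '%s\n' "${config[*]}"   # "${config[*]}" joins the per-subsystem fragments with the first IFS character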
00:16:05.778 [2024-09-27 15:21:07.467566] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk1 --proc-type=auto ] 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:05.778 { 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme$subsystem", 00:16:05.778 "trtype": "$TEST_TRANSPORT", 00:16:05.778 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:05.778 "adrfam": 
"ipv4", 00:16:05.778 "trsvcid": "$NVMF_PORT", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:05.778 "hdgst": ${hdgst:-false}, 00:16:05.778 "ddgst": ${ddgst:-false} 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 } 00:16:05.778 EOF 00:16:05.778 )") 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@392 -- # jq . 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@393 -- # IFS=, 00:16:05.778 15:21:07 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme1", 00:16:05.778 "trtype": "rdma", 00:16:05.778 "traddr": "10.0.0.2", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "4420", 00:16:05.778 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:16:05.778 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:16:05.778 "hdgst": false, 00:16:05.778 "ddgst": false 00:16:05.778 }, 00:16:05.778 "method": "bdev_nvme_attach_controller" 00:16:05.778 },{ 00:16:05.778 "params": { 00:16:05.778 "name": "Nvme2", 00:16:05.778 "trtype": "rdma", 00:16:05.778 "traddr": "10.0.0.2", 00:16:05.778 "adrfam": "ipv4", 00:16:05.778 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode2", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host2", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme3", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode3", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host3", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme4", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode4", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host4", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme5", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode5", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host5", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme6", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode6", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host6", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme7", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 
00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode7", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host7", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme8", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode8", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host8", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme9", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode9", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host9", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 },{ 00:16:05.779 "params": { 00:16:05.779 "name": "Nvme10", 00:16:05.779 "trtype": "rdma", 00:16:05.779 "traddr": "10.0.0.2", 00:16:05.779 "adrfam": "ipv4", 00:16:05.779 "trsvcid": "4420", 00:16:05.779 "subnqn": "nqn.2016-06.io.spdk:cnode10", 00:16:05.779 "hostnqn": "nqn.2016-06.io.spdk:host10", 00:16:05.779 "hdgst": false, 00:16:05.779 "ddgst": false 00:16:05.779 }, 00:16:05.779 "method": "bdev_nvme_attach_controller" 00:16:05.779 }' 00:16:05.779 [2024-09-27 15:21:07.549207] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:06.038 [2024-09-27 15:21:07.633851] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@864 -- # return 0 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@81 -- # rpc_cmd -s /var/tmp/bdevperf.sock framework_wait_init 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@84 -- # kill -9 1804775 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@85 -- # rm -f /var/run/spdk_bdev1 00:16:06.982 15:21:08 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@88 -- # sleep 1 00:16:07.920 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/shutdown.sh: line 74: 1804775 Killed $rootdir/test/app/bdev_svc/bdev_svc -m 0x1 -i 1 -r /var/tmp/bdevperf.sock --json <(gen_nvmf_target_json "${num_subsystems[@]}") 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@89 -- # kill -0 1804448 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@92 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -o 65536 -w verify -t 1 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@92 -- # gen_nvmf_target_json 1 2 3 4 5 6 7 8 9 10 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@368 -- # config=() 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@368 -- # local subsystem config 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.920 { 00:16:07.920 "params": { 00:16:07.920 "name": "Nvme$subsystem", 00:16:07.920 "trtype": "$TEST_TRANSPORT", 00:16:07.920 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.920 "adrfam": "ipv4", 00:16:07.920 "trsvcid": "$NVMF_PORT", 00:16:07.920 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.920 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.920 "hdgst": ${hdgst:-false}, 00:16:07.920 "ddgst": ${ddgst:-false} 00:16:07.920 }, 00:16:07.920 "method": "bdev_nvme_attach_controller" 00:16:07.920 } 00:16:07.920 EOF 00:16:07.920 )") 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.920 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.920 { 00:16:07.920 "params": { 00:16:07.920 "name": "Nvme$subsystem", 00:16:07.920 "trtype": "$TEST_TRANSPORT", 00:16:07.920 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 
-- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 [2024-09-27 15:21:09.558615] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:16:07.921 [2024-09-27 15:21:09.558678] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1805028 ] 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:07.921 { 00:16:07.921 "params": { 00:16:07.921 "name": 
"Nvme$subsystem", 00:16:07.921 "trtype": "$TEST_TRANSPORT", 00:16:07.921 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "$NVMF_PORT", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:07.921 "hdgst": ${hdgst:-false}, 00:16:07.921 "ddgst": ${ddgst:-false} 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 } 00:16:07.921 EOF 00:16:07.921 )") 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@390 -- # cat 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@392 -- # jq . 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@393 -- # IFS=, 00:16:07.921 15:21:09 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme1", 00:16:07.921 "trtype": "rdma", 00:16:07.921 "traddr": "10.0.0.2", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "4420", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:16:07.921 "hdgst": false, 00:16:07.921 "ddgst": false 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 },{ 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme2", 00:16:07.921 "trtype": "rdma", 00:16:07.921 "traddr": "10.0.0.2", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "4420", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode2", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host2", 00:16:07.921 "hdgst": false, 00:16:07.921 "ddgst": false 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 },{ 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme3", 00:16:07.921 "trtype": "rdma", 00:16:07.921 "traddr": "10.0.0.2", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "4420", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode3", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host3", 00:16:07.921 "hdgst": false, 00:16:07.921 "ddgst": false 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 },{ 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme4", 00:16:07.921 "trtype": "rdma", 00:16:07.921 "traddr": "10.0.0.2", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "4420", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode4", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host4", 00:16:07.921 "hdgst": false, 00:16:07.921 "ddgst": false 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 },{ 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme5", 00:16:07.921 "trtype": "rdma", 00:16:07.921 "traddr": "10.0.0.2", 00:16:07.921 "adrfam": "ipv4", 00:16:07.921 "trsvcid": "4420", 00:16:07.921 "subnqn": "nqn.2016-06.io.spdk:cnode5", 00:16:07.921 "hostnqn": "nqn.2016-06.io.spdk:host5", 00:16:07.921 "hdgst": false, 00:16:07.921 "ddgst": false 00:16:07.921 }, 00:16:07.921 "method": "bdev_nvme_attach_controller" 00:16:07.921 },{ 00:16:07.921 "params": { 00:16:07.921 "name": "Nvme6", 00:16:07.921 "trtype": "rdma", 00:16:07.921 "traddr": "10.0.0.2", 00:16:07.921 "adrfam": "ipv4", 00:16:07.922 "trsvcid": "4420", 00:16:07.922 "subnqn": "nqn.2016-06.io.spdk:cnode6", 00:16:07.922 "hostnqn": "nqn.2016-06.io.spdk:host6", 00:16:07.922 "hdgst": false, 00:16:07.922 "ddgst": false 00:16:07.922 }, 00:16:07.922 "method": "bdev_nvme_attach_controller" 
00:16:07.922 },{ 00:16:07.922 "params": { 00:16:07.922 "name": "Nvme7", 00:16:07.922 "trtype": "rdma", 00:16:07.922 "traddr": "10.0.0.2", 00:16:07.922 "adrfam": "ipv4", 00:16:07.922 "trsvcid": "4420", 00:16:07.922 "subnqn": "nqn.2016-06.io.spdk:cnode7", 00:16:07.922 "hostnqn": "nqn.2016-06.io.spdk:host7", 00:16:07.922 "hdgst": false, 00:16:07.922 "ddgst": false 00:16:07.922 }, 00:16:07.922 "method": "bdev_nvme_attach_controller" 00:16:07.922 },{ 00:16:07.922 "params": { 00:16:07.922 "name": "Nvme8", 00:16:07.922 "trtype": "rdma", 00:16:07.922 "traddr": "10.0.0.2", 00:16:07.922 "adrfam": "ipv4", 00:16:07.922 "trsvcid": "4420", 00:16:07.922 "subnqn": "nqn.2016-06.io.spdk:cnode8", 00:16:07.922 "hostnqn": "nqn.2016-06.io.spdk:host8", 00:16:07.922 "hdgst": false, 00:16:07.922 "ddgst": false 00:16:07.922 }, 00:16:07.922 "method": "bdev_nvme_attach_controller" 00:16:07.922 },{ 00:16:07.922 "params": { 00:16:07.922 "name": "Nvme9", 00:16:07.922 "trtype": "rdma", 00:16:07.922 "traddr": "10.0.0.2", 00:16:07.922 "adrfam": "ipv4", 00:16:07.922 "trsvcid": "4420", 00:16:07.922 "subnqn": "nqn.2016-06.io.spdk:cnode9", 00:16:07.922 "hostnqn": "nqn.2016-06.io.spdk:host9", 00:16:07.922 "hdgst": false, 00:16:07.922 "ddgst": false 00:16:07.922 }, 00:16:07.922 "method": "bdev_nvme_attach_controller" 00:16:07.922 },{ 00:16:07.922 "params": { 00:16:07.922 "name": "Nvme10", 00:16:07.922 "trtype": "rdma", 00:16:07.922 "traddr": "10.0.0.2", 00:16:07.922 "adrfam": "ipv4", 00:16:07.922 "trsvcid": "4420", 00:16:07.922 "subnqn": "nqn.2016-06.io.spdk:cnode10", 00:16:07.922 "hostnqn": "nqn.2016-06.io.spdk:host10", 00:16:07.922 "hdgst": false, 00:16:07.922 "ddgst": false 00:16:07.922 }, 00:16:07.922 "method": "bdev_nvme_attach_controller" 00:16:07.922 }' 00:16:07.922 [2024-09-27 15:21:09.650489] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:07.922 [2024-09-27 15:21:09.734008] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:08.858 Running I/O for 1 seconds... 
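With the ten subsystems still exported, this second half of the test case drives them through the bdevperf example app using the same generated JSON, as the command traced above shows. A minimal standalone equivalent, assuming it is run from the SPDK repo root with nvmf/common.sh sourced so gen_nvmf_target_json is available ({1..10} matches the num_subsystems list from shutdown.sh@23):

# 64-deep, 64 KiB, 1-second verify workload against all ten attached controllers.
./build/examples/bdevperf --json <(gen_nvmf_target_json {1..10}) -q 64 -o 65536 -w verify -t 1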
00:16:10.236 3203.00 IOPS, 200.19 MiB/s
00:16:10.236 Latency(us)
00:16:10.236 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:16:10.236 Job: Nvme1n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme1n1 : 1.17 353.23 22.08 0.00 0.00 174835.08 7009.50 216097.84
00:16:10.236 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme2n1 : 1.18 356.26 22.27 0.00 0.00 171581.50 3732.70 208803.39
00:16:10.236 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme3n1 : 1.18 354.20 22.14 0.00 0.00 170188.88 51516.99 201508.95
00:16:10.236 Job: Nvme4n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme4n1 : 1.19 401.67 25.10 0.00 0.00 151170.15 6012.22 134035.37
00:16:10.236 Job: Nvme5n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme5n1 : 1.19 391.22 24.45 0.00 0.00 152956.77 6069.20 179625.63
00:16:10.236 Job: Nvme6n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme6n1 : 1.19 400.13 25.01 0.00 0.00 147563.95 6183.18 114431.55
00:16:10.236 Job: Nvme7n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.236 Verification LBA range: start 0x0 length 0x400
00:16:10.236 Nvme7n1 : 1.19 403.94 25.25 0.00 0.00 144167.15 6411.13 107137.11
00:16:10.237 Job: Nvme8n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.237 Verification LBA range: start 0x0 length 0x400
00:16:10.237 Nvme8n1 : 1.19 386.83 24.18 0.00 0.00 147998.07 6354.14 109872.53
00:16:10.237 Job: Nvme9n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.237 Verification LBA range: start 0x0 length 0x400
00:16:10.237 Nvme9n1 : 1.19 385.72 24.11 0.00 0.00 146273.69 6183.18 121270.09
00:16:10.237 Job: Nvme10n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:10.237 Verification LBA range: start 0x0 length 0x400
00:16:10.237 Nvme10n1 : 1.19 268.55 16.78 0.00 0.00 207936.65 6867.03 459549.83
00:16:10.237 ===================================================================================================================
00:16:10.237 Total : 3701.73 231.36 0.00 0.00 159497.94 3732.70 459549.83
00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@95 -- # stoptarget 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@42 -- # rm -f ./local-job0-0-verify.state 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@43 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevperf.conf 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@44 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- target/shutdown.sh@46 -- # nvmftestfini 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@331 -- # nvmfcleanup 00:16:10.495 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@99 -- # sync 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@102 -- # set +e 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@103 -- # for i in {1..20} 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:16:10.495 rmmod nvme_rdma 00:16:10.495 rmmod nvme_fabrics 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@106 -- # set -e 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@107 -- # return 0 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@332 -- # '[' -n 1804448 ']' 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@333 -- # killprocess 1804448 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@950 -- # '[' -z 1804448 ']' 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@954 -- # kill -0 1804448 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@955 -- # uname 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1804448 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:16:10.495 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1804448' 00:16:10.495 killing process with pid 1804448 00:16:10.496 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@969 -- # kill 1804448 00:16:10.496 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@974 -- # wait 1804448 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@338 -- # nvmf_fini 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@264 -- # local dev 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@267 -- # remove_target_ns 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:11.064 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@268 -- # delete_main_bridge 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@130 -- # return 0 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:11.064 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@41 -- # _dev=0 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@41 -- # dev_map=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/setup.sh@284 -- # iptr 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@538 -- # iptables-save 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:16:11.065 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- nvmf/common.sh@538 -- # iptables-restore 00:16:11.065 00:16:11.065 real 0m14.183s 00:16:11.065 user 0m32.130s 00:16:11.065 sys 0m6.674s 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc1 -- common/autotest_common.sh@10 -- # set +x 00:16:11.065 ************************************ 00:16:11.065 END TEST nvmf_shutdown_tc1 00:16:11.065 ************************************ 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@163 -- # run_test nvmf_shutdown_tc2 nvmf_shutdown_tc2 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@10 -- # set +x 00:16:11.065 ************************************ 00:16:11.065 START TEST nvmf_shutdown_tc2 00:16:11.065 ************************************ 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@1125 -- # nvmf_shutdown_tc2 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@100 -- # starttarget 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@16 -- # nvmftestinit 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@292 -- # prepare_net_devs 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@254 -- # local -g is_hw=no 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@256 -- # remove_target_ns 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@125 -- # xtrace_disable 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@131 -- # pci_devs=() 00:16:11.065 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@131 -- # local -a pci_devs 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@132 -- # pci_net_devs=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@133 -- # pci_drivers=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@133 -- # local -A pci_drivers 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@135 -- # net_devs=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@135 -- # local -ga net_devs 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@136 -- # e810=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@136 -- # local -ga e810 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@137 -- # x722=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@137 -- # local -ga x722 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@138 -- # mlx=() 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@138 -- # local -ga mlx 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:16:11.065 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:16:11.065 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:16:11.065 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:11.065 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:11.065 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:16:11.065 Found net devices under 0000:18:00.0: mlx_0_0 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:16:11.066 Found net devices under 0000:18:00.1: mlx_0_1 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@249 -- # get_rdma_if_list 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@75 -- # rdma_devs=() 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@88 -- # 
rdma_devs+=("$net_dev") 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@89 -- # continue 2 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@89 -- # continue 2 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@258 -- # is_hw=yes 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:16:11.066 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@61 -- # uname 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@65 -- # modprobe ib_cm 00:16:11.326 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@66 -- # modprobe ib_core 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@67 -- # modprobe ib_umad 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@69 -- # modprobe iw_cm 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:16:11.327 15:21:12 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@27 -- # local -gA dev_map 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@28 -- # local -g _dev 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@44 -- # ips=() 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@58 -- # key_initiator=target1 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@11 -- # local val=167772161 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@13 
-- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:16:11.327 10.0.0.1 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:16:11.327 15:21:12 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@11 -- # local val=167772162 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:16:11.327 10.0.0.2 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- 
nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@38 -- # ping_ips 1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 
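The set_ip/set_up helpers traced above turn the integer pool values 167772161 and 167772162 into 10.0.0.1 and 10.0.0.2, pin them to the two mlx ports, and record each address in the interface's ifalias so later lookups (like the cat just above) can read it back. A minimal stand-alone sketch of that conversion and assignment, not the exact test code; the device name and the /24 prefix are taken from this run:

val_to_ip() {
  local val=$1
  printf '%u.%u.%u.%u\n' $(( val >> 24 )) $(( (val >> 16) & 0xff )) $(( (val >> 8) & 0xff )) $(( val & 0xff ))
}
dev=mlx_0_0                                            # first RDMA port in this run
addr=$(val_to_ip 167772161)                            # 0x0A000001 -> 10.0.0.1
ip addr add "${addr}/24" dev "$dev"                    # assign the address
echo "$addr" | tee "/sys/class/net/$dev/ifalias"       # stash it so get_ip_address can read it back
ip link set "$dev" up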
00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:16:11.327 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:11.328 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:16:11.328 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.042 ms 00:16:11.328 00:16:11.328 --- 10.0.0.2 ping statistics --- 00:16:11.328 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:11.328 rtt min/avg/max/mdev = 0.042/0.042/0.042/0.000 ms 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:11.328 15:21:13 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:11.328 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:16:11.328 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.027 ms 00:16:11.328 00:16:11.328 --- 10.0.0.2 ping statistics --- 00:16:11.328 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:11.328 rtt min/avg/max/mdev = 0.027/0.027/0.027/0.000 ms 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@98 -- # (( pair++ )) 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@266 -- # return 0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:11.328 15:21:13 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:11.328 15:21:13 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:11.328 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:11.329 15:21:13 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:16:11.329 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@19 -- # nvmfappstart -m 0x1E 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@324 -- # nvmfpid=1805647 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@325 -- # waitforlisten 1805647 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@831 -- # '[' -z 1805647 ']' 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:11.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:11.588 15:21:13 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:11.589 [2024-09-27 15:21:13.244626] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
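nvmfappstart above launches nvmf_tgt with core mask 0x1E (cores 1-4), and waitforlisten then blocks until pid 1805647 is answering on /var/tmp/spdk.sock before any configuration is attempted. A rough, simplified equivalent of that start-and-wait pattern, assuming SPDK's stock rpc.py and paths relative to the repository root:

build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E &
nvmfpid=$!
# poll the RPC socket until the target is ready to accept configuration calls
until scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1; do
  sleep 0.5
done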
00:16:11.589 [2024-09-27 15:21:13.244685] [ DPDK EAL parameters: nvmf -c 0x1E --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:11.589 [2024-09-27 15:21:13.331055] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:16:11.589 [2024-09-27 15:21:13.419473] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:16:11.589 [2024-09-27 15:21:13.419521] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:16:11.589 [2024-09-27 15:21:13.419530] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:16:11.589 [2024-09-27 15:21:13.419538] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:16:11.589 [2024-09-27 15:21:13.419545] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:16:11.589 [2024-09-27 15:21:13.419667] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:16:11.589 [2024-09-27 15:21:13.419751] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:16:11.589 [2024-09-27 15:21:13.419839] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:16:11.589 [2024-09-27 15:21:13.419840] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@864 -- # return 0 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@21 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:12.526 [2024-09-27 15:21:14.167512] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0xa217a0/0xa25c90) succeed. 00:16:12.526 [2024-09-27 15:21:14.177927] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0xa22de0/0xa67330) succeed. 
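With both IB devices up and the RDMA transport created above, the create_subsystems loop that follows writes one block of RPCs per subsystem (cnode1 through cnode10) into rpcs.txt. Purely for illustration, one such block presumably amounts to calls along these lines; the backing bdev size, block size and serial number are placeholders, not values taken from this log:

scripts/rpc.py bdev_malloc_create -b Malloc1 128 512                           # assumed: 128 MiB bdev, 512-byte blocks
scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK1    # serial number assumed
scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1
scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420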
00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@23 -- # num_subsystems=({1..10}) 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@25 -- # timing_enter create_subsystems 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@27 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@29 -- # cat 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- 
target/shutdown.sh@36 -- # rpc_cmd 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:12.526 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:12.785 Malloc1 00:16:12.785 [2024-09-27 15:21:14.410788] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:16:12.785 Malloc2 00:16:12.785 Malloc3 00:16:12.785 Malloc4 00:16:12.785 Malloc5 00:16:12.785 Malloc6 00:16:13.044 Malloc7 00:16:13.044 Malloc8 00:16:13.044 Malloc9 00:16:13.044 Malloc10 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@37 -- # timing_exit create_subsystems 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@104 -- # perfpid=1805884 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@105 -- # waitforlisten 1805884 /var/tmp/bdevperf.sock 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@831 -- # '[' -z 1805884 ']' 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@103 -- # gen_nvmf_target_json 1 2 3 4 5 6 7 8 9 10 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@103 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bdevperf.sock --json /dev/fd/63 -q 64 -o 65536 -w verify -t 10 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@368 -- # config=() 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@368 -- # local subsystem config 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:16:13.044 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 
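gen_nvmf_target_json hands bdevperf its controllers through the --json config on fd 63; the per-subsystem template echoed below expands, for the first controller in this run, to roughly the following (trtype rdma, traddr 10.0.0.2 and trsvcid 4420 come from this run's environment, hdgst/ddgst fall back to false when unset):

{
  "params": {
    "name": "Nvme1",
    "trtype": "rdma",
    "traddr": "10.0.0.2",
    "adrfam": "ipv4",
    "trsvcid": "4420",
    "subnqn": "nqn.2016-06.io.spdk:cnode1",
    "hostnqn": "nqn.2016-06.io.spdk:host1",
    "hdgst": false,
    "ddgst": false
  },
  "method": "bdev_nvme_attach_controller"
}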
00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.044 { 00:16:13.044 "params": { 00:16:13.044 "name": "Nvme$subsystem", 00:16:13.044 "trtype": "$TEST_TRANSPORT", 00:16:13.044 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.044 "adrfam": "ipv4", 00:16:13.044 "trsvcid": "$NVMF_PORT", 00:16:13.044 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.044 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.044 "hdgst": ${hdgst:-false}, 00:16:13.044 "ddgst": ${ddgst:-false} 00:16:13.044 }, 00:16:13.044 "method": "bdev_nvme_attach_controller" 00:16:13.044 } 00:16:13.044 EOF 00:16:13.044 )") 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.044 { 00:16:13.044 "params": { 00:16:13.044 "name": "Nvme$subsystem", 00:16:13.044 "trtype": "$TEST_TRANSPORT", 00:16:13.044 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.044 "adrfam": "ipv4", 00:16:13.044 "trsvcid": "$NVMF_PORT", 00:16:13.044 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.044 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.044 "hdgst": ${hdgst:-false}, 00:16:13.044 "ddgst": ${ddgst:-false} 00:16:13.044 }, 00:16:13.044 "method": "bdev_nvme_attach_controller" 00:16:13.044 } 00:16:13.044 EOF 00:16:13.044 )") 00:16:13.044 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.304 { 00:16:13.304 "params": { 00:16:13.304 "name": "Nvme$subsystem", 00:16:13.304 "trtype": "$TEST_TRANSPORT", 00:16:13.304 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.304 "adrfam": "ipv4", 00:16:13.304 "trsvcid": "$NVMF_PORT", 00:16:13.304 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.304 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.304 "hdgst": ${hdgst:-false}, 00:16:13.304 "ddgst": ${ddgst:-false} 00:16:13.304 }, 00:16:13.304 "method": "bdev_nvme_attach_controller" 00:16:13.304 } 00:16:13.304 EOF 00:16:13.304 )") 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.304 { 00:16:13.304 "params": { 00:16:13.304 "name": "Nvme$subsystem", 00:16:13.304 "trtype": "$TEST_TRANSPORT", 00:16:13.304 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.304 "adrfam": "ipv4", 00:16:13.304 "trsvcid": "$NVMF_PORT", 00:16:13.304 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.304 "hostnqn": 
"nqn.2016-06.io.spdk:host$subsystem", 00:16:13.304 "hdgst": ${hdgst:-false}, 00:16:13.304 "ddgst": ${ddgst:-false} 00:16:13.304 }, 00:16:13.304 "method": "bdev_nvme_attach_controller" 00:16:13.304 } 00:16:13.304 EOF 00:16:13.304 )") 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.304 { 00:16:13.304 "params": { 00:16:13.304 "name": "Nvme$subsystem", 00:16:13.304 "trtype": "$TEST_TRANSPORT", 00:16:13.304 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.304 "adrfam": "ipv4", 00:16:13.304 "trsvcid": "$NVMF_PORT", 00:16:13.304 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.304 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.304 "hdgst": ${hdgst:-false}, 00:16:13.304 "ddgst": ${ddgst:-false} 00:16:13.304 }, 00:16:13.304 "method": "bdev_nvme_attach_controller" 00:16:13.304 } 00:16:13.304 EOF 00:16:13.304 )") 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.304 { 00:16:13.304 "params": { 00:16:13.304 "name": "Nvme$subsystem", 00:16:13.304 "trtype": "$TEST_TRANSPORT", 00:16:13.304 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.304 "adrfam": "ipv4", 00:16:13.304 "trsvcid": "$NVMF_PORT", 00:16:13.304 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.304 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.304 "hdgst": ${hdgst:-false}, 00:16:13.304 "ddgst": ${ddgst:-false} 00:16:13.304 }, 00:16:13.304 "method": "bdev_nvme_attach_controller" 00:16:13.304 } 00:16:13.304 EOF 00:16:13.304 )") 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.304 [2024-09-27 15:21:14.925328] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:16:13.304 [2024-09-27 15:21:14.925399] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1805884 ] 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.304 { 00:16:13.304 "params": { 00:16:13.304 "name": "Nvme$subsystem", 00:16:13.304 "trtype": "$TEST_TRANSPORT", 00:16:13.304 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.304 "adrfam": "ipv4", 00:16:13.304 "trsvcid": "$NVMF_PORT", 00:16:13.304 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.304 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.304 "hdgst": ${hdgst:-false}, 00:16:13.304 "ddgst": ${ddgst:-false} 00:16:13.304 }, 00:16:13.304 "method": "bdev_nvme_attach_controller" 00:16:13.304 } 00:16:13.304 EOF 00:16:13.304 )") 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.304 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.304 { 00:16:13.304 "params": { 00:16:13.304 "name": "Nvme$subsystem", 00:16:13.304 "trtype": "$TEST_TRANSPORT", 00:16:13.304 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.304 "adrfam": "ipv4", 00:16:13.304 "trsvcid": "$NVMF_PORT", 00:16:13.304 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.304 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.304 "hdgst": ${hdgst:-false}, 00:16:13.304 "ddgst": ${ddgst:-false} 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 } 00:16:13.305 EOF 00:16:13.305 )") 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.305 { 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme$subsystem", 00:16:13.305 "trtype": "$TEST_TRANSPORT", 00:16:13.305 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "$NVMF_PORT", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.305 "hdgst": ${hdgst:-false}, 00:16:13.305 "ddgst": ${ddgst:-false} 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 } 00:16:13.305 EOF 00:16:13.305 )") 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:13.305 { 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme$subsystem", 00:16:13.305 "trtype": "$TEST_TRANSPORT", 00:16:13.305 "traddr": "$NVMF_FIRST_TARGET_IP", 
00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "$NVMF_PORT", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:13.305 "hdgst": ${hdgst:-false}, 00:16:13.305 "ddgst": ${ddgst:-false} 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 } 00:16:13.305 EOF 00:16:13.305 )") 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@390 -- # cat 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@392 -- # jq . 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@393 -- # IFS=, 00:16:13.305 15:21:14 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme1", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme2", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode2", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host2", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme3", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode3", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host3", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme4", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode4", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host4", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme5", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode5", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host5", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme6", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode6", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host6", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme7", 00:16:13.305 "trtype": "rdma", 00:16:13.305 
"traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode7", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host7", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme8", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode8", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host8", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme9", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode9", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host9", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 },{ 00:16:13.305 "params": { 00:16:13.305 "name": "Nvme10", 00:16:13.305 "trtype": "rdma", 00:16:13.305 "traddr": "10.0.0.2", 00:16:13.305 "adrfam": "ipv4", 00:16:13.305 "trsvcid": "4420", 00:16:13.305 "subnqn": "nqn.2016-06.io.spdk:cnode10", 00:16:13.305 "hostnqn": "nqn.2016-06.io.spdk:host10", 00:16:13.305 "hdgst": false, 00:16:13.305 "ddgst": false 00:16:13.305 }, 00:16:13.305 "method": "bdev_nvme_attach_controller" 00:16:13.305 }' 00:16:13.305 [2024-09-27 15:21:15.012944] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:13.305 [2024-09-27 15:21:15.094488] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:14.323 Running I/O for 10 seconds... 
00:16:14.323 15:21:15 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:14.323 15:21:15 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@864 -- # return 0 00:16:14.323 15:21:15 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@106 -- # rpc_cmd -s /var/tmp/bdevperf.sock framework_wait_init 00:16:14.323 15:21:15 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:14.323 15:21:15 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@108 -- # waitforio /var/tmp/bdevperf.sock Nvme1n1 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@51 -- # '[' -z /var/tmp/bdevperf.sock ']' 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@55 -- # '[' -z Nvme1n1 ']' 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@58 -- # local ret=1 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@59 -- # local i 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@60 -- # (( i = 10 )) 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@60 -- # (( i != 0 )) 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@61 -- # rpc_cmd -s /var/tmp/bdevperf.sock bdev_get_iostat -b Nvme1n1 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@61 -- # jq -r '.bdevs[0].num_read_ops' 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:14.323 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:14.582 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:14.582 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@61 -- # read_io_count=19 00:16:14.582 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@64 -- # '[' 19 -ge 100 ']' 00:16:14.582 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@68 -- # sleep 0.25 00:16:14.841 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@60 -- # (( i-- )) 00:16:14.841 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@60 -- # (( i != 0 )) 00:16:14.841 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@61 -- # rpc_cmd -s /var/tmp/bdevperf.sock bdev_get_iostat -b Nvme1n1 00:16:14.841 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@61 -- # jq -r '.bdevs[0].num_read_ops' 00:16:14.841 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:14.841 
15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@61 -- # read_io_count=178 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@64 -- # '[' 178 -ge 100 ']' 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@65 -- # ret=0 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@66 -- # break 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@70 -- # return 0 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@111 -- # killprocess 1805884 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@950 -- # '[' -z 1805884 ']' 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@954 -- # kill -0 1805884 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@955 -- # uname 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1805884 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1805884' 00:16:15.100 killing process with pid 1805884 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@969 -- # kill 1805884 00:16:15.100 15:21:16 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@974 -- # wait 1805884 00:16:15.100 Received shutdown signal, test time was about 0.852852 seconds 00:16:15.100 00:16:15.100 Latency(us) 00:16:15.100 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:16:15.100 Job: Nvme1n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme1n1 : 0.84 361.68 22.61 0.00 0.00 173250.42 7351.43 240716.58 00:16:15.100 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme2n1 : 0.84 381.30 23.83 0.00 0.00 161517.43 7265.95 164124.94 00:16:15.100 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme3n1 : 0.84 380.75 23.80 0.00 0.00 158264.81 8776.13 157742.30 00:16:15.100 Job: Nvme4n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme4n1 : 0.84 388.54 24.28 0.00 0.00 152074.72 4786.98 
146800.64 00:16:15.100 Job: Nvme5n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme5n1 : 0.84 379.58 23.72 0.00 0.00 152943.93 9573.95 140418.00 00:16:15.100 Job: Nvme6n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme6n1 : 0.84 379.06 23.69 0.00 0.00 149507.56 9915.88 132211.76 00:16:15.100 Job: Nvme7n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme7n1 : 0.85 378.53 23.66 0.00 0.00 146690.23 10200.82 124917.31 00:16:15.100 Job: Nvme8n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme8n1 : 0.85 377.97 23.62 0.00 0.00 144053.92 10542.75 116255.17 00:16:15.100 Job: Nvme9n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme9n1 : 0.85 377.35 23.58 0.00 0.00 141503.98 11055.64 105313.50 00:16:15.100 Job: Nvme10n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536) 00:16:15.100 Verification LBA range: start 0x0 length 0x400 00:16:15.100 Nvme10n1 : 0.85 300.40 18.78 0.00 0.00 174232.54 3006.11 242540.19 00:16:15.100 =================================================================================================================== 00:16:15.100 Total : 3705.17 231.57 0.00 0.00 154913.96 3006.11 242540.19 00:16:15.359 15:21:17 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@114 -- # sleep 1 00:16:16.296 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@115 -- # kill -0 1805647 00:16:16.296 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@117 -- # stoptarget 00:16:16.296 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@42 -- # rm -f ./local-job0-0-verify.state 00:16:16.296 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@43 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevperf.conf 00:16:16.296 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@44 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- target/shutdown.sh@46 -- # nvmftestfini 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@331 -- # nvmfcleanup 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@99 -- # sync 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@102 -- # set +e 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@103 -- # for i in {1..20} 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:16:16.556 rmmod nvme_rdma 00:16:16.556 rmmod nvme_fabrics 00:16:16.556 
15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@106 -- # set -e 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@107 -- # return 0 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@332 -- # '[' -n 1805647 ']' 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@333 -- # killprocess 1805647 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@950 -- # '[' -z 1805647 ']' 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@954 -- # kill -0 1805647 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@955 -- # uname 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1805647 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1805647' 00:16:16.556 killing process with pid 1805647 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@969 -- # kill 1805647 00:16:16.556 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@974 -- # wait 1805647 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@338 -- # nvmf_fini 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@264 -- # local dev 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@267 -- # remove_target_ns 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@268 -- # delete_main_bridge 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@130 -- # return 0 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:17.127 15:21:18 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@41 -- # _dev=0 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@41 -- # dev_map=() 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/setup.sh@284 -- # iptr 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@538 -- # iptables-save 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- nvmf/common.sh@538 -- # iptables-restore 00:16:17.127 00:16:17.127 real 0m5.925s 00:16:17.127 user 0m23.384s 00:16:17.127 sys 0m1.335s 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc2 -- common/autotest_common.sh@10 -- # set +x 00:16:17.127 ************************************ 00:16:17.127 END TEST nvmf_shutdown_tc2 00:16:17.127 ************************************ 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@164 -- # run_test nvmf_shutdown_tc3 nvmf_shutdown_tc3 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1101 -- # '[' 
2 -le 1 ']' 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@10 -- # set +x 00:16:17.127 ************************************ 00:16:17.127 START TEST nvmf_shutdown_tc3 00:16:17.127 ************************************ 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@1125 -- # nvmf_shutdown_tc3 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@122 -- # starttarget 00:16:17.127 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@16 -- # nvmftestinit 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@292 -- # prepare_net_devs 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@254 -- # local -g is_hw=no 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@256 -- # remove_target_ns 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@125 -- # xtrace_disable 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@131 -- # pci_devs=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@131 -- # local -a pci_devs 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@132 -- # pci_net_devs=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@133 -- # pci_drivers=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@133 -- # local -A pci_drivers 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@135 -- # net_devs=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@135 -- # local -ga net_devs 
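
nvmf_shutdown_tc3 now runs its own nvmftestinit, and the records that follow show how the physical NICs are discovered: the helper matches the Mellanox/Intel PCI IDs cached in pci_bus_cache, finds both 0x15b3:0x1015 functions at 0000:18:00.0/1, and maps each PCI function to its kernel net device. The mapping step boils down to something like the sketch below (simplified; it assumes pci_devs is already filtered to the matching devices).

    # Map each matching PCI function to its net device name via sysfs and remember it.
    for pci in "${pci_devs[@]}"; do
        pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*)   # e.g. .../0000:18:00.0/net/mlx_0_0
        pci_net_devs=("${pci_net_devs[@]##*/}")            # keep only the interface name
        echo "Found net devices under $pci: ${pci_net_devs[*]}"
        net_devs+=("${pci_net_devs[@]}")
    done
    # Result on this node: net_devs=(mlx_0_0 mlx_0_1)
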
00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@136 -- # e810=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@136 -- # local -ga e810 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@137 -- # x722=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@137 -- # local -ga x722 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@138 -- # mlx=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@138 -- # local -ga mlx 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@181 -- # 
echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:16:17.128 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:16:17.128 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:16:17.128 Found net devices under 0000:18:00.0: mlx_0_0 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:17.128 15:21:18 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:16:17.128 Found net devices under 0000:18:00.1: mlx_0_1 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@249 -- # get_rdma_if_list 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@75 -- # rdma_devs=() 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:17.128 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@89 -- # continue 2 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:16:17.129 15:21:18 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@89 -- # continue 2 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@258 -- # is_hw=yes 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@61 -- # uname 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@65 -- # modprobe ib_cm 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@66 -- # modprobe ib_core 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@67 -- # modprobe ib_umad 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@69 -- # modprobe iw_cm 00:16:17.129 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@27 -- # local -gA dev_map 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@28 -- # local -g _dev 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:17.390 15:21:18 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@44 -- # ips=() 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@58 -- # key_initiator=target1 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@11 -- # local val=167772161 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:16:17.390 15:21:18 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- 
nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:16:17.390 10.0.0.1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@11 -- # local val=167772162 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:16:17.390 10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 
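
The interface addressing traced above starts from the integer pool value 0x0a000001 (167772161): it is converted into dotted-quad form, assigned to the device, and mirrored into the interface's ifalias so later helpers can read the address back from sysfs. A sketch of that pair of helpers, with the bit arithmetic assumed (the trace only shows the already-split octets):

    # val_to_ip: 167772161 -> "10.0.0.1"; set_ip: assign it and record it in ifalias.
    val_to_ip() {
        local val=$1
        printf '%u.%u.%u.%u\n' \
            $(( (val >> 24) & 0xff )) $(( (val >> 16) & 0xff )) \
            $(( (val >> 8)  & 0xff )) $((  val        & 0xff ))
    }

    set_ip() {
        local dev=$1 ip
        ip=$(val_to_ip "$2")
        ip addr add "$ip/24" dev "$dev"
        echo "$ip" | tee "/sys/class/net/$dev/ifalias"
    }

    set_ip mlx_0_0 167772161   # -> 10.0.0.1
    set_ip mlx_0_1 167772162   # -> 10.0.0.2
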
00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@38 -- # ping_ips 1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:16:17.390 
15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:17.390 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:16:17.390 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.029 ms 00:16:17.390 00:16:17.390 --- 10.0.0.2 ping statistics --- 00:16:17.390 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:17.390 rtt min/avg/max/mdev = 0.029/0.029/0.029/0.000 ms 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:16:17.390 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:17.391 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:16:17.391 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.025 ms 00:16:17.391 00:16:17.391 --- 10.0.0.2 ping statistics --- 00:16:17.391 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:17.391 rtt min/avg/max/mdev = 0.025/0.025/0.025/0.000 ms 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@98 -- # (( pair++ )) 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@266 -- # return 0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:16:17.391 15:21:19 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:17.391 15:21:19 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:16:17.391 15:21:19 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@19 -- # nvmfappstart -m 0x1E 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@324 -- # nvmfpid=1806564 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@325 -- # waitforlisten 1806564 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@831 -- # '[' -z 1806564 ']' 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:17.391 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:17.391 15:21:19 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:17.651 [2024-09-27 15:21:19.259704] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:16:17.651 [2024-09-27 15:21:19.259772] [ DPDK EAL parameters: nvmf -c 0x1E --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:17.651 [2024-09-27 15:21:19.348658] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:16:17.651 [2024-09-27 15:21:19.440545] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:16:17.651 [2024-09-27 15:21:19.440589] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:16:17.651 [2024-09-27 15:21:19.440599] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:16:17.651 [2024-09-27 15:21:19.440607] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:16:17.651 [2024-09-27 15:21:19.440614] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:16:17.651 [2024-09-27 15:21:19.440686] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:16:17.651 [2024-09-27 15:21:19.440890] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:16:17.651 [2024-09-27 15:21:19.440790] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:16:17.651 [2024-09-27 15:21:19.440892] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@864 -- # return 0 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@21 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:18.590 [2024-09-27 15:21:20.201019] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x11e57a0/0x11e9c90) succeed. 00:16:18.590 [2024-09-27 15:21:20.211709] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x11e6de0/0x122b330) succeed. 
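
Condensed, the target bring-up logged above amounts to the following sketch. nvmfappstart and rpc_cmd are the test suite's own helpers; treating rpc_cmd as a thin wrapper over scripts/rpc.py against /var/tmp/spdk.sock, and using framework_wait_init in place of the suite's waitforlisten polling, are simplifications made here for illustration.

SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk

modprobe nvme-rdma                                    # common.sh@317 above

"$SPDK/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x1E &   # mask 0x1E = cores 1-4, matching the reactor messages
nvmfpid=$!

# Block until the app answers on its default RPC socket, then create the RDMA
# transport with the same options the trace shows.
"$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock framework_wait_init
"$SPDK/scripts/rpc.py" nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
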
00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@23 -- # num_subsystems=({1..10}) 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@25 -- # timing_enter create_subsystems 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@27 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@29 -- # cat 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- 
target/shutdown.sh@36 -- # rpc_cmd 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:18.590 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:18.590 Malloc1 00:16:18.590 [2024-09-27 15:21:20.435860] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:16:18.849 Malloc2 00:16:18.849 Malloc3 00:16:18.849 Malloc4 00:16:18.849 Malloc5 00:16:18.849 Malloc6 00:16:18.849 Malloc7 00:16:19.109 Malloc8 00:16:19.109 Malloc9 00:16:19.109 Malloc10 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@37 -- # timing_exit create_subsystems 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@126 -- # perfpid=1806870 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@127 -- # waitforlisten 1806870 /var/tmp/bdevperf.sock 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@831 -- # '[' -z 1806870 ']' 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@125 -- # gen_nvmf_target_json 1 2 3 4 5 6 7 8 9 10 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bdevperf.sock --json /dev/fd/63 -q 64 -o 65536 -w verify -t 10 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@368 -- # config=() 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:16:19.109 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 
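
The batched rpc_cmd at shutdown.sh@36 above is what produced the Malloc1..Malloc10 bdevs and the 10.0.0.2:4420 RDMA listener. The heredoc emitted per index at shutdown.sh@29 is not shown in the trace, so the expansion below is an assumption kept consistent with those messages (bdev size, block size, and serial numbers are illustrative values):

SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk

for i in {1..10}; do
    {
        echo "bdev_malloc_create 64 512 -b Malloc$i"
        echo "nvmf_create_subsystem nqn.2016-06.io.spdk:cnode$i -a -s SPDK$i"
        echo "nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode$i Malloc$i"
        echo "nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode$i -t rdma -a 10.0.0.2 -s 4420"
    } >> rpcs.txt
done

# The real test batches rpcs.txt through its rpc_cmd helper; issuing the lines
# one by one through rpc.py is an equivalent, if slower, way to apply them.
while read -r cmd; do
    "$SPDK/scripts/rpc.py" $cmd
done < rpcs.txt
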
00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@368 -- # local subsystem config 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.109 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.109 { 00:16:19.109 "params": { 00:16:19.109 "name": "Nvme$subsystem", 00:16:19.109 "trtype": "$TEST_TRANSPORT", 00:16:19.109 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.109 "adrfam": "ipv4", 00:16:19.109 "trsvcid": "$NVMF_PORT", 00:16:19.109 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.109 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.109 "hdgst": ${hdgst:-false}, 00:16:19.109 "ddgst": ${ddgst:-false} 00:16:19.109 }, 00:16:19.109 "method": "bdev_nvme_attach_controller" 00:16:19.109 } 00:16:19.109 EOF 00:16:19.109 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.110 { 00:16:19.110 "params": { 00:16:19.110 "name": "Nvme$subsystem", 00:16:19.110 "trtype": "$TEST_TRANSPORT", 00:16:19.110 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.110 "adrfam": "ipv4", 00:16:19.110 "trsvcid": "$NVMF_PORT", 00:16:19.110 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.110 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.110 "hdgst": ${hdgst:-false}, 00:16:19.110 "ddgst": ${ddgst:-false} 00:16:19.110 }, 00:16:19.110 "method": "bdev_nvme_attach_controller" 00:16:19.110 } 00:16:19.110 EOF 00:16:19.110 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.110 { 00:16:19.110 "params": { 00:16:19.110 "name": "Nvme$subsystem", 00:16:19.110 "trtype": "$TEST_TRANSPORT", 00:16:19.110 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.110 "adrfam": "ipv4", 00:16:19.110 "trsvcid": "$NVMF_PORT", 00:16:19.110 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.110 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.110 "hdgst": ${hdgst:-false}, 00:16:19.110 "ddgst": ${ddgst:-false} 00:16:19.110 }, 00:16:19.110 "method": "bdev_nvme_attach_controller" 00:16:19.110 } 00:16:19.110 EOF 00:16:19.110 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.110 { 00:16:19.110 "params": { 00:16:19.110 "name": "Nvme$subsystem", 
00:16:19.110 "trtype": "$TEST_TRANSPORT", 00:16:19.110 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.110 "adrfam": "ipv4", 00:16:19.110 "trsvcid": "$NVMF_PORT", 00:16:19.110 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.110 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.110 "hdgst": ${hdgst:-false}, 00:16:19.110 "ddgst": ${ddgst:-false} 00:16:19.110 }, 00:16:19.110 "method": "bdev_nvme_attach_controller" 00:16:19.110 } 00:16:19.110 EOF 00:16:19.110 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.110 { 00:16:19.110 "params": { 00:16:19.110 "name": "Nvme$subsystem", 00:16:19.110 "trtype": "$TEST_TRANSPORT", 00:16:19.110 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.110 "adrfam": "ipv4", 00:16:19.110 "trsvcid": "$NVMF_PORT", 00:16:19.110 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.110 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.110 "hdgst": ${hdgst:-false}, 00:16:19.110 "ddgst": ${ddgst:-false} 00:16:19.110 }, 00:16:19.110 "method": "bdev_nvme_attach_controller" 00:16:19.110 } 00:16:19.110 EOF 00:16:19.110 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.110 { 00:16:19.110 "params": { 00:16:19.110 "name": "Nvme$subsystem", 00:16:19.110 "trtype": "$TEST_TRANSPORT", 00:16:19.110 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.110 "adrfam": "ipv4", 00:16:19.110 "trsvcid": "$NVMF_PORT", 00:16:19.110 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.110 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.110 "hdgst": ${hdgst:-false}, 00:16:19.110 "ddgst": ${ddgst:-false} 00:16:19.110 }, 00:16:19.110 "method": "bdev_nvme_attach_controller" 00:16:19.110 } 00:16:19.110 EOF 00:16:19.110 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.110 [2024-09-27 15:21:20.949122] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:16:19.110 [2024-09-27 15:21:20.949187] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1806870 ] 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.110 { 00:16:19.110 "params": { 00:16:19.110 "name": "Nvme$subsystem", 00:16:19.110 "trtype": "$TEST_TRANSPORT", 00:16:19.110 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.110 "adrfam": "ipv4", 00:16:19.110 "trsvcid": "$NVMF_PORT", 00:16:19.110 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.110 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.110 "hdgst": ${hdgst:-false}, 00:16:19.110 "ddgst": ${ddgst:-false} 00:16:19.110 }, 00:16:19.110 "method": "bdev_nvme_attach_controller" 00:16:19.110 } 00:16:19.110 EOF 00:16:19.110 )") 00:16:19.110 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.369 { 00:16:19.369 "params": { 00:16:19.369 "name": "Nvme$subsystem", 00:16:19.369 "trtype": "$TEST_TRANSPORT", 00:16:19.369 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.369 "adrfam": "ipv4", 00:16:19.369 "trsvcid": "$NVMF_PORT", 00:16:19.369 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.369 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.369 "hdgst": ${hdgst:-false}, 00:16:19.369 "ddgst": ${ddgst:-false} 00:16:19.369 }, 00:16:19.369 "method": "bdev_nvme_attach_controller" 00:16:19.369 } 00:16:19.369 EOF 00:16:19.369 )") 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.369 { 00:16:19.369 "params": { 00:16:19.369 "name": "Nvme$subsystem", 00:16:19.369 "trtype": "$TEST_TRANSPORT", 00:16:19.369 "traddr": "$NVMF_FIRST_TARGET_IP", 00:16:19.369 "adrfam": "ipv4", 00:16:19.369 "trsvcid": "$NVMF_PORT", 00:16:19.369 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.369 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.369 "hdgst": ${hdgst:-false}, 00:16:19.369 "ddgst": ${ddgst:-false} 00:16:19.369 }, 00:16:19.369 "method": "bdev_nvme_attach_controller" 00:16:19.369 } 00:16:19.369 EOF 00:16:19.369 )") 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@370 -- # for subsystem in "${@:-1}" 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # config+=("$(cat <<-EOF 00:16:19.369 { 00:16:19.369 "params": { 00:16:19.369 "name": "Nvme$subsystem", 00:16:19.369 "trtype": "$TEST_TRANSPORT", 00:16:19.369 "traddr": "$NVMF_FIRST_TARGET_IP", 
00:16:19.369 "adrfam": "ipv4", 00:16:19.369 "trsvcid": "$NVMF_PORT", 00:16:19.369 "subnqn": "nqn.2016-06.io.spdk:cnode$subsystem", 00:16:19.369 "hostnqn": "nqn.2016-06.io.spdk:host$subsystem", 00:16:19.369 "hdgst": ${hdgst:-false}, 00:16:19.369 "ddgst": ${ddgst:-false} 00:16:19.369 }, 00:16:19.369 "method": "bdev_nvme_attach_controller" 00:16:19.369 } 00:16:19.369 EOF 00:16:19.369 )") 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@390 -- # cat 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@392 -- # jq . 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@393 -- # IFS=, 00:16:19.369 15:21:20 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@394 -- # printf '%s\n' '{ 00:16:19.369 "params": { 00:16:19.369 "name": "Nvme1", 00:16:19.369 "trtype": "rdma", 00:16:19.369 "traddr": "10.0.0.2", 00:16:19.369 "adrfam": "ipv4", 00:16:19.369 "trsvcid": "4420", 00:16:19.369 "subnqn": "nqn.2016-06.io.spdk:cnode1", 00:16:19.369 "hostnqn": "nqn.2016-06.io.spdk:host1", 00:16:19.369 "hdgst": false, 00:16:19.369 "ddgst": false 00:16:19.369 }, 00:16:19.369 "method": "bdev_nvme_attach_controller" 00:16:19.369 },{ 00:16:19.369 "params": { 00:16:19.369 "name": "Nvme2", 00:16:19.369 "trtype": "rdma", 00:16:19.369 "traddr": "10.0.0.2", 00:16:19.369 "adrfam": "ipv4", 00:16:19.369 "trsvcid": "4420", 00:16:19.369 "subnqn": "nqn.2016-06.io.spdk:cnode2", 00:16:19.369 "hostnqn": "nqn.2016-06.io.spdk:host2", 00:16:19.369 "hdgst": false, 00:16:19.369 "ddgst": false 00:16:19.369 }, 00:16:19.369 "method": "bdev_nvme_attach_controller" 00:16:19.369 },{ 00:16:19.369 "params": { 00:16:19.369 "name": "Nvme3", 00:16:19.369 "trtype": "rdma", 00:16:19.369 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode3", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host3", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme4", 00:16:19.370 "trtype": "rdma", 00:16:19.370 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode4", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host4", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme5", 00:16:19.370 "trtype": "rdma", 00:16:19.370 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode5", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host5", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme6", 00:16:19.370 "trtype": "rdma", 00:16:19.370 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode6", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host6", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme7", 00:16:19.370 "trtype": "rdma", 00:16:19.370 
"traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode7", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host7", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme8", 00:16:19.370 "trtype": "rdma", 00:16:19.370 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode8", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host8", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme9", 00:16:19.370 "trtype": "rdma", 00:16:19.370 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode9", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host9", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 },{ 00:16:19.370 "params": { 00:16:19.370 "name": "Nvme10", 00:16:19.370 "trtype": "rdma", 00:16:19.370 "traddr": "10.0.0.2", 00:16:19.370 "adrfam": "ipv4", 00:16:19.370 "trsvcid": "4420", 00:16:19.370 "subnqn": "nqn.2016-06.io.spdk:cnode10", 00:16:19.370 "hostnqn": "nqn.2016-06.io.spdk:host10", 00:16:19.370 "hdgst": false, 00:16:19.370 "ddgst": false 00:16:19.370 }, 00:16:19.370 "method": "bdev_nvme_attach_controller" 00:16:19.370 }' 00:16:19.370 [2024-09-27 15:21:21.041919] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:19.370 [2024-09-27 15:21:21.123955] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:20.307 Running I/O for 10 seconds... 
00:16:20.307 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:20.307 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@864 -- # return 0 00:16:20.307 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@128 -- # rpc_cmd -s /var/tmp/bdevperf.sock framework_wait_init 00:16:20.307 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:20.307 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@131 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; kill -9 $perfpid || true; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@133 -- # waitforio /var/tmp/bdevperf.sock Nvme1n1 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@51 -- # '[' -z /var/tmp/bdevperf.sock ']' 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@55 -- # '[' -z Nvme1n1 ']' 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@58 -- # local ret=1 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@59 -- # local i 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@60 -- # (( i = 10 )) 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@60 -- # (( i != 0 )) 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@61 -- # rpc_cmd -s /var/tmp/bdevperf.sock bdev_get_iostat -b Nvme1n1 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@61 -- # jq -r '.bdevs[0].num_read_ops' 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@61 -- # read_io_count=46 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@64 -- # '[' 46 -ge 100 ']' 00:16:20.566 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@68 -- # sleep 0.25 00:16:20.825 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@60 -- # (( i-- )) 00:16:20.825 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@60 -- # (( i != 0 )) 00:16:20.825 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@61 -- # rpc_cmd -s /var/tmp/bdevperf.sock bdev_get_iostat -b Nvme1n1 00:16:20.825 15:21:22 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@61 -- # jq -r '.bdevs[0].num_read_ops' 00:16:20.825 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:20.825 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@61 -- # read_io_count=200 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@64 -- # '[' 200 -ge 100 ']' 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@65 -- # ret=0 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@66 -- # break 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@70 -- # return 0 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@136 -- # killprocess 1806564 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@950 -- # '[' -z 1806564 ']' 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@954 -- # kill -0 1806564 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@955 -- # uname 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1806564 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1806564' 00:16:21.084 killing process with pid 1806564 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@969 -- # kill 1806564 00:16:21.084 15:21:22 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@974 -- # wait 1806564 00:16:21.601 2723.00 IOPS, 170.19 MiB/s 15:21:23 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@137 -- # sleep 1 00:16:22.172 [2024-09-27 15:21:23.845543] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.845584] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:8192 cdw0:0 sqhd:f200 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.845597] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.845606] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:8192 cdw0:0 
sqhd:f200 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.845616] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.845625] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:8192 cdw0:0 sqhd:f200 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.845635] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.845643] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:8192 cdw0:0 sqhd:f200 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.847776] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.172 [2024-09-27 15:21:23.847827] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode5] in failed state. 00:16:22.172 [2024-09-27 15:21:23.847892] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.847926] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.847960] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.847991] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.848025] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.848056] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.848090] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.848122] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.850300] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.172 [2024-09-27 15:21:23.850358] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode3] in failed state. 
00:16:22.172 [2024-09-27 15:21:23.850409] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.850442] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.850476] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.850506] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.850539] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.850569] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.850602] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.850632] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.852830] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.172 [2024-09-27 15:21:23.852873] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode2] in failed state. 00:16:22.172 [2024-09-27 15:21:23.852924] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.852957] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.852990] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.853020] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.853054] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.853085] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.853118] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.853149] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.172 [2024-09-27 15:21:23.855402] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.172 [2024-09-27 15:21:23.855444] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state. 
00:16:22.172 [2024-09-27 15:21:23.855500] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.172 [2024-09-27 15:21:23.855534] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.855567] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.855599] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.855641] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.855672] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.855705] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.855736] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.858361] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.173 [2024-09-27 15:21:23.858403] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode4] in failed state. 00:16:22.173 [2024-09-27 15:21:23.858450] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.858483] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.858517] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.858547] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.858580] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.858612] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.858644] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.858675] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.860946] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.173 [2024-09-27 15:21:23.860999] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode10] in failed state. 
00:16:22.173 [2024-09-27 15:21:23.861048] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.861092] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.861127] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.861159] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.861192] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.861222] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.861256] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.173 [2024-09-27 15:21:23.861287] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.863850] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.173 [2024-09-27 15:21:23.863900] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode6] in failed state. 00:16:22.173 [2024-09-27 15:21:23.866556] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200013802bc0 was disconnected and freed. reset controller. 00:16:22.173 [2024-09-27 15:21:23.866604] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.868574] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200013802900 was disconnected and freed. reset controller. 00:16:22.173 [2024-09-27 15:21:23.868616] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.871038] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200013802640 was disconnected and freed. reset controller. 00:16:22.173 [2024-09-27 15:21:23.871081] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.873578] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200013802380 was disconnected and freed. reset controller. 00:16:22.173 [2024-09-27 15:21:23.873621] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.875988] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x2000138020c0 was disconnected and freed. reset controller. 00:16:22.173 [2024-09-27 15:21:23.876030] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876261] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 
00:16:22.173 [2024-09-27 15:21:23.876312] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876373] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876414] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876459] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876506] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876547] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.876803] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] resetting controller 00:16:22.173 [2024-09-27 15:21:23.876852] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode2] resetting controller 00:16:22.173 [2024-09-27 15:21:23.876888] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode3] resetting controller 00:16:22.173 [2024-09-27 15:21:23.876923] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode4] resetting controller 00:16:22.173 [2024-09-27 15:21:23.876958] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode5] resetting controller 00:16:22.173 [2024-09-27 15:21:23.887632] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.173 [2024-09-27 15:21:23.887706] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 
00:16:22.173 [2024-09-27 15:21:23.887940] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8) 00:16:22.173 [2024-09-27 15:21:23.887978] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74 00:16:22.173 [2024-09-27 15:21:23.888006] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019aed000 00:16:22.173 [2024-09-27 15:21:23.888233] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8) 00:16:22.173 [2024-09-27 15:21:23.888279] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74 00:16:22.173 [2024-09-27 15:21:23.888305] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019ae5280 00:16:22.173 [2024-09-27 15:21:23.888445] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8) 00:16:22.173 [2024-09-27 15:21:23.888481] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74 00:16:22.173 [2024-09-27 15:21:23.888506] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019aba2c0 00:16:22.173 [2024-09-27 15:21:23.888626] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8) 00:16:22.173 [2024-09-27 15:21:23.888660] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74 00:16:22.173 [2024-09-27 15:21:23.888685] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019ab9ac0 00:16:22.173 [2024-09-27 15:21:23.888783] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8) 00:16:22.173 [2024-09-27 15:21:23.888818] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74 00:16:22.173 [2024-09-27 15:21:23.888843] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019ad20c0 00:16:22.173 [2024-09-27 15:21:23.895940] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:58 nsid:1 lba:24576 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001065f000 len:0x10000 key:0x1c2200 00:16:22.173 [2024-09-27 15:21:23.895987] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.896063] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:24704 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001063e000 len:0x10000 key:0x1c2200 00:16:22.173 [2024-09-27 15:21:23.896098] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.896146] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:60 nsid:1 lba:24832 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001061d000 len:0x10000 key:0x1c2200 00:16:22.173 [2024-09-27 15:21:23.896179] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.173 
[2024-09-27 15:21:23.896225] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:24960 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000105fc000 len:0x10000 key:0x1c2200 00:16:22.173 [2024-09-27 15:21:23.896258] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.173 [2024-09-27 15:21:23.896304] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:62 nsid:1 lba:25088 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000105db000 len:0x10000 key:0x1c2200 00:16:22.173 [2024-09-27 15:21:23.896338] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896397] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 lba:25216 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000105ba000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896430] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896477] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:64 nsid:1 lba:25344 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010599000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896513] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896568] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:1 lba:25472 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010578000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896602] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896648] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:1 lba:25600 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010557000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896681] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896728] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:1 lba:25728 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010536000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896760] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896807] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:1 lba:25856 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010515000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896840] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896886] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:1 lba:25984 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000104f4000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896919] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.896965] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:5 nsid:1 lba:26112 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000104d3000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.896998] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897043] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:6 nsid:1 lba:26240 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000104b2000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897075] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897121] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:7 nsid:1 lba:26368 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010491000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897154] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897200] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:8 nsid:1 lba:26496 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010470000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897233] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897278] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:9 nsid:1 lba:26624 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000110af000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897311] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897372] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:10 nsid:1 lba:26752 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ed9f000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897406] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897452] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:11 nsid:1 lba:26880 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ed7e000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897492] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897537] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:12 nsid:1 lba:27008 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ed5d000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897570] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897616] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:13 nsid:1 lba:27136 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ed3c000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897649] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897695] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ 
sqid:1 cid:14 nsid:1 lba:27264 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ed1b000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897728] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897774] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:15 nsid:1 lba:27392 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ecfa000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897806] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897851] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:16 nsid:1 lba:27520 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ecd9000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897884] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.897930] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:17 nsid:1 lba:27648 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ecb8000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.897962] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898007] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:18 nsid:1 lba:27776 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ec97000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898040] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898085] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:27904 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ec76000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898118] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898163] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:20 nsid:1 lba:28032 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ec55000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898197] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898243] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:21 nsid:1 lba:28160 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ec34000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898276] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898321] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:22 nsid:1 lba:28288 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000127a1000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898369] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898415] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:23 nsid:1 lba:28416 len:128 SGL 
KEYED DATA BLOCK ADDRESS 0x200012150000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898449] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898495] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:24 nsid:1 lba:28544 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000be2f000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898529] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898585] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:25 nsid:1 lba:28672 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000be0e000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898618] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898663] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:26 nsid:1 lba:28800 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bded000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898696] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898743] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:27 nsid:1 lba:28928 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bdcc000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898776] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898822] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:28 nsid:1 lba:29056 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bdab000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898856] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898901] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:29184 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bd8a000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.898934] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.898980] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:29312 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bd69000 len:0x10000 key:0x1c2200 00:16:22.174 [2024-09-27 15:21:23.899013] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.174 [2024-09-27 15:21:23.899060] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:29440 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bd48000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899093] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899138] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:32 nsid:1 lba:29568 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bd27000 
len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899171] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899217] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:29696 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bd06000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899251] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899302] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:34 nsid:1 lba:29824 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bce5000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899336] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899396] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:29952 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bcc4000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899429] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899474] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:30080 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bca3000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899508] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899553] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:30208 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bc82000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899586] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899631] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:30336 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bc61000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899664] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899709] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:30464 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bc40000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899742] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899788] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:30592 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012780000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899821] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899875] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:30720 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c24f000 len:0x10000 key:0x1c2200 00:16:22.175 
[2024-09-27 15:21:23.899908] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.899953] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:42 nsid:1 lba:30848 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c22e000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.899987] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900032] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:30976 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c20d000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900064] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900111] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:31104 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c1ec000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900144] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900194] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:31232 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c1cb000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900228] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900274] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:46 nsid:1 lba:31360 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c1aa000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900306] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900372] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:47 nsid:1 lba:31488 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c189000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900406] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900452] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:31616 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c168000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900484] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900529] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:49 nsid:1 lba:31744 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c147000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900561] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900609] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:31872 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c126000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900643] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900688] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:32000 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c105000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900721] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900767] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:32128 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c0e4000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900799] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900845] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:32256 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c0c3000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900877] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.900924] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:54 nsid:1 lba:32384 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c0a2000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.900957] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.901004] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:55 nsid:1 lba:32512 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c081000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.901036] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.901082] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:56 nsid:1 lba:32640 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c060000 len:0x10000 key:0x1c2200 00:16:22.175 [2024-09-27 15:21:23.901120] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.905300] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode10] resetting controller 00:16:22.175 [2024-09-27 15:21:23.905446] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 
00:16:22.175 [2024-09-27 15:21:23.905502] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.175 [2024-09-27 15:21:23.905537] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:50399 cdw0:86e804b0 sqhd:733c p:1 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.905570] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.175 [2024-09-27 15:21:23.905603] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:50399 cdw0:86e804b0 sqhd:733c p:1 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.905636] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.175 [2024-09-27 15:21:23.905668] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:50399 cdw0:86e804b0 sqhd:733c p:1 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.905701] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.175 [2024-09-27 15:21:23.905733] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:50399 cdw0:86e804b0 sqhd:733c p:1 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.908207] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.175 [2024-09-27 15:21:23.908236] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode7] in failed state. 00:16:22.175 [2024-09-27 15:21:23.908268] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.175 [2024-09-27 15:21:23.908289] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.908311] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.175 [2024-09-27 15:21:23.908331] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.175 [2024-09-27 15:21:23.908361] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.176 [2024-09-27 15:21:23.908381] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.908402] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.176 [2024-09-27 15:21:23.908422] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.910744] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.176 [2024-09-27 15:21:23.910785] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode9] in failed state. 
00:16:22.176 [2024-09-27 15:21:23.910838] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.176 [2024-09-27 15:21:23.910872] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.910908] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.176 [2024-09-27 15:21:23.910948] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.910982] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.176 [2024-09-27 15:21:23.911015] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.911049] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:16:22.176 [2024-09-27 15:21:23.911081] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:51816 cdw0:86e804b0 sqhd:1900 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.913914] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:22.176 [2024-09-27 15:21:23.913956] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode8] in failed state. 00:16:22.176 [2024-09-27 15:21:23.914543] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:22 nsid:1 lba:40960 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a8dfd80 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.914587] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.914636] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:56 nsid:1 lba:41088 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a8cfd00 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.914669] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.914714] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:24 nsid:1 lba:41216 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a8bfc80 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.914746] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.914791] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:41344 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a8afc00 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.914824] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.914869] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:26 nsid:1 lba:41472 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a89fb80 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.914901] 
nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.914945] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:60 nsid:1 lba:41600 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a88fb00 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.914978] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915024] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:28 nsid:1 lba:41728 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a87fa80 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915056] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915101] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:62 nsid:1 lba:41856 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a86fa00 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915134] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915187] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:46 nsid:1 lba:41984 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a85f980 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915221] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915266] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:42112 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a84f900 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915299] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915362] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:42240 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a83f880 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915396] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915441] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:42368 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a82f800 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915474] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915517] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:42496 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a81f780 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915550] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915594] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:1 lba:42624 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a80f700 len:0x10000 key:0x1c1100 00:16:22.176 [2024-09-27 15:21:23.915625] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915670] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:42752 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a68f500 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.915701] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915746] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:5 nsid:1 lba:42880 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a67f480 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.915778] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915821] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:6 nsid:1 lba:43008 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a66f400 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.915854] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915898] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:7 nsid:1 lba:43136 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a65f380 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.915931] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.915975] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:8 nsid:1 lba:43264 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a64f300 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.916007] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916056] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:9 nsid:1 lba:43392 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a63f280 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.916089] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916133] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:10 nsid:1 lba:43520 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a62f200 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.916165] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916208] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:43648 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a61f180 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.916241] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916285] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:12 nsid:1 lba:43776 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001a60f100 len:0x10000 key:0x1c1000 00:16:22.176 [2024-09-27 15:21:23.916317] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916372] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:43904 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001abf0000 len:0x10000 key:0x1c1400 00:16:22.176 [2024-09-27 15:21:23.916406] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916449] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:44032 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001abdff80 len:0x10000 key:0x1c1400 00:16:22.176 [2024-09-27 15:21:23.916481] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916525] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:44160 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001abcff00 len:0x10000 key:0x1c1400 00:16:22.176 [2024-09-27 15:21:23.916557] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.176 [2024-09-27 15:21:23.916601] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:16 nsid:1 lba:44288 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001abbfe80 len:0x10000 key:0x1c1400 00:16:22.176 [2024-09-27 15:21:23.916633] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.916678] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:17 nsid:1 lba:44416 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001abafe00 len:0x10000 key:0x1c1400 00:16:22.177 [2024-09-27 15:21:23.916710] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.916753] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:44544 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ab9fd80 len:0x10000 key:0x1c1400 00:16:22.177 [2024-09-27 15:21:23.916786] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.916830] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:19 nsid:1 lba:44672 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ab8fd00 len:0x10000 key:0x1c1400 00:16:22.177 [2024-09-27 15:21:23.916862] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.916907] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:20 nsid:1 lba:44800 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ab7fc80 len:0x10000 key:0x1c1400 00:16:22.177 [2024-09-27 15:21:23.916950] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.916995] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:21 nsid:1 lba:36736 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012192000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917028] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 
cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917074] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:36864 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000121b3000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917107] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917153] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:23 nsid:1 lba:36992 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000121d4000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917185] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917232] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:37120 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000121f5000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917264] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917311] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:25 nsid:1 lba:37248 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012216000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917356] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917402] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:37376 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012237000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917435] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917481] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:27 nsid:1 lba:37504 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012258000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917513] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917560] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 lba:37632 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012279000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917593] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917639] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:37760 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001229a000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917672] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917718] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:37888 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000122bb000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917750] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 
00:16:22.177 [2024-09-27 15:21:23.917797] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:32 nsid:1 lba:38016 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000122dc000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917834] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917881] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:38144 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000122fd000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917915] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.917960] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:34 nsid:1 lba:38272 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001231e000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.917993] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918038] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:38400 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001233f000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918071] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918117] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:38528 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000be92000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918149] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918194] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:38656 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b75a000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918227] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918273] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:38784 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001163a000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918305] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918423] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:38912 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001165b000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918458] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918504] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:39040 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001167c000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918536] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918581] 
nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:39168 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001169d000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918614] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918660] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:42 nsid:1 lba:39296 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000116be000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918692] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918738] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:39424 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000116df000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918770] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918821] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:39552 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000fe1f000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918854] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.177 [2024-09-27 15:21:23.918901] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:39680 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013023000 len:0x10000 key:0x1c2200 00:16:22.177 [2024-09-27 15:21:23.918933] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.918979] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:39808 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000fc30000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919012] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919057] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:47 nsid:1 lba:39936 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000fc51000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919090] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919136] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:40064 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000125b2000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919168] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919215] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:49 nsid:1 lba:40192 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012591000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919252] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919298] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:40320 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f5be000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919331] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919391] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:40448 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200013002000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919424] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919470] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:40576 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012fe1000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919502] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919548] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:40704 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012fc0000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919580] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.919626] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:54 nsid:1 lba:40832 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c45f000 len:0x10000 key:0x1c2200 00:16:22.178 [2024-09-27 15:21:23.919659] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924424] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:23 nsid:1 lba:40960 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ac4f900 len:0x10000 key:0x1c1800 00:16:22.178 [2024-09-27 15:21:23.924475] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924526] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:56 nsid:1 lba:41088 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ac3f880 len:0x10000 key:0x1c1800 00:16:22.178 [2024-09-27 15:21:23.924559] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924603] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:25 nsid:1 lba:41216 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ac2f800 len:0x10000 key:0x1c1800 00:16:22.178 [2024-09-27 15:21:23.924635] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924679] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:41344 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ac1f780 len:0x10000 key:0x1c1800 00:16:22.178 [2024-09-27 15:21:23.924711] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924755] nvme_qpair.c: 243:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:27 nsid:1 lba:41472 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ac0f700 len:0x10000 key:0x1c1800 00:16:22.178 [2024-09-27 15:21:23.924788] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924831] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:60 nsid:1 lba:41600 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001aff0000 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.924864] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924908] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:29 nsid:1 lba:41728 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001afdff80 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.924941] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.924984] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:62 nsid:1 lba:41856 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001afcff00 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925016] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925060] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:49 nsid:1 lba:41984 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001afbfe80 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925092] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925136] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:42112 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001afafe00 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925169] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925212] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:42240 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af9fd80 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925245] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925290] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:42368 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af8fd00 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925328] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925391] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:42496 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af7fc80 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925418] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925455] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:1 
lba:42624 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af6fc00 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925482] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925520] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:42752 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af5fb80 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925547] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925584] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:5 nsid:1 lba:42880 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af4fb00 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925612] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925649] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:6 nsid:1 lba:43008 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af3fa80 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925677] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925713] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:7 nsid:1 lba:43136 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af2fa00 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925741] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925777] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:8 nsid:1 lba:43264 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af1f980 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925805] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925842] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:9 nsid:1 lba:43392 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001af0f900 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925869] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925906] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:10 nsid:1 lba:43520 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001aeff880 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925933] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.925970] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:43648 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001aeef800 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.925997] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.926035] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:12 nsid:1 lba:43776 len:128 SGL KEYED DATA BLOCK 
ADDRESS 0x20001aedf780 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.926066] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.926103] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:43904 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001aecf700 len:0x10000 key:0x1c2500 00:16:22.178 [2024-09-27 15:21:23.926131] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.178 [2024-09-27 15:21:23.926168] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:44032 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001aebf680 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926196] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926232] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:44160 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001aeaf600 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926259] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926297] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:16 nsid:1 lba:44288 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ae9f580 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926324] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926415] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:17 nsid:1 lba:44416 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ae8f500 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926444] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926481] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:44544 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ae7f480 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926508] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926545] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:19 nsid:1 lba:44672 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ae6f400 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926572] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926608] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:20 nsid:1 lba:44800 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ae5f380 len:0x10000 key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926635] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926673] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:21 nsid:1 lba:44928 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001ae4f300 len:0x10000 
key:0x1c2500 00:16:22.179 [2024-09-27 15:21:23.926700] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926736] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:22 nsid:1 lba:36864 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012ba0000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.926764] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926803] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:36992 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012bc1000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.926831] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926877] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:24 nsid:1 lba:37120 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012be2000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.926905] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.926944] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:37248 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012c03000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.926971] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927010] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:26 nsid:1 lba:37376 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012c24000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927038] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927075] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:37504 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012c45000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927102] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927141] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:28 nsid:1 lba:37632 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012c66000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927170] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927208] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 lba:37760 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012c87000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927235] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927273] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:37888 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012ca8000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 
15:21:23.927301] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927339] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:32 nsid:1 lba:38016 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012cc9000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927377] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927415] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:38144 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012cea000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927443] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927480] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:34 nsid:1 lba:38272 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012d0b000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927507] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927545] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:38400 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012d2c000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927573] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927614] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:38528 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012d4d000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927641] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927680] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:38656 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f810000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927707] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927745] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:38784 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f831000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927772] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927811] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:38912 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c87f000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927837] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927876] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:39040 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c85e000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927903] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.927942] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:39168 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c83d000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.927970] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928008] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:42 nsid:1 lba:39296 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c81c000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.928035] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928072] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:39424 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c7fb000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.928100] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928137] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:39552 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c7da000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.928164] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928202] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:39680 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c7b9000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.928229] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928266] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:46 nsid:1 lba:39808 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c798000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.928294] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928331] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:47 nsid:1 lba:39936 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c777000 len:0x10000 key:0x1c2200 00:16:22.179 [2024-09-27 15:21:23.928374] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.179 [2024-09-27 15:21:23.928413] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:40064 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c756000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928440] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.928479] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:40192 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c735000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928506] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.928545] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:40320 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c714000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928572] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.928611] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:40448 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c6f3000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928638] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.928676] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:40576 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c6d2000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928704] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.928742] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:40704 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c6b1000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928770] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.928808] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:54 nsid:1 lba:40832 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c690000 len:0x10000 key:0x1c2200 00:16:22.180 [2024-09-27 15:21:23.928835] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933089] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200013801b40 was disconnected and freed. reset controller. 00:16:22.180 [2024-09-27 15:21:23.933125] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 
00:16:22.180 [2024-09-27 15:21:23.933156] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:20 nsid:1 lba:40960 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b2dfd80 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933181] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933220] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:56 nsid:1 lba:41088 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b2cfd00 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933244] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933276] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:22 nsid:1 lba:41216 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b2bfc80 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933300] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933337] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:41344 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b2afc00 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933368] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933401] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:24 nsid:1 lba:41472 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b29fb80 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933425] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933456] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:60 nsid:1 lba:41600 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b28fb00 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933480] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933511] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:26 nsid:1 lba:41728 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b27fa80 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933535] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933567] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:62 nsid:1 lba:41856 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b26fa00 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933590] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933622] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:59 nsid:1 lba:41984 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b25f980 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933645] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 
15:21:23.933677] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:42112 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b24f900 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933700] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933749] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:42240 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b23f880 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933773] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933804] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:42368 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b22f800 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933828] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933859] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:42496 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b21f780 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933883] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933915] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:1 lba:42624 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b20f700 len:0x10000 key:0x1c1200 00:16:22.180 [2024-09-27 15:21:23.933938] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.933973] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:42752 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b03f280 len:0x10000 key:0x1c1300 00:16:22.180 [2024-09-27 15:21:23.933998] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934029] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:5 nsid:1 lba:42880 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b02f200 len:0x10000 key:0x1c1300 00:16:22.180 [2024-09-27 15:21:23.934052] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934084] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:6 nsid:1 lba:43008 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b01f180 len:0x10000 key:0x1c1300 00:16:22.180 [2024-09-27 15:21:23.934108] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934139] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:7 nsid:1 lba:43136 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b00f100 len:0x10000 key:0x1c1300 00:16:22.180 [2024-09-27 15:21:23.934164] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934196] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:8 nsid:1 lba:43264 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b5f0000 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934219] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934251] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:9 nsid:1 lba:43392 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b5dff80 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934279] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934311] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:10 nsid:1 lba:43520 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b5cff00 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934335] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934403] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:43648 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b5bfe80 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934428] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934460] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:12 nsid:1 lba:43776 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b5afe00 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934484] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934516] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:43904 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b59fd80 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934539] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934571] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:44032 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b58fd00 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934596] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934627] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:44160 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b57fc80 len:0x10000 key:0x1c1c00 00:16:22.180 [2024-09-27 15:21:23.934655] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.180 [2024-09-27 15:21:23.934687] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:16 nsid:1 lba:44288 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b56fc00 len:0x10000 key:0x1c1c00 00:16:22.181 [2024-09-27 15:21:23.934711] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.934744] nvme_qpair.c: 243:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:17 nsid:1 lba:44416 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b55fb80 len:0x10000 key:0x1c1c00 00:16:22.181 [2024-09-27 15:21:23.934767] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.934798] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:44544 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001b54fb00 len:0x10000 key:0x1c1c00 00:16:22.181 [2024-09-27 15:21:23.934821] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.934854] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:36480 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f411000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.934876] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.934910] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:36608 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f432000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.934934] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.934967] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:21 nsid:1 lba:36736 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f453000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.934991] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935024] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:27 nsid:1 lba:36864 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f474000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935049] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935082] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:23 nsid:1 lba:36992 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f495000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935106] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935139] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:37120 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f4b6000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935163] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935196] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:25 nsid:1 lba:37248 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f4d7000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935220] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935254] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 
lba:37376 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f4f8000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935281] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935315] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:28 nsid:1 lba:37504 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f519000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935338] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935385] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:37632 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000f53a000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935410] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935443] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:37760 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c4e3000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935466] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935500] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:37888 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c504000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935524] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935556] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:32 nsid:1 lba:38016 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c525000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935580] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935613] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:38144 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bad5000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935637] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935674] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:34 nsid:1 lba:38272 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000bab4000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935698] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935731] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:38400 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ba93000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935755] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935788] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:38528 len:128 SGL KEYED DATA BLOCK 
ADDRESS 0x20000ca8f000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935812] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935850] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:38656 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ca6e000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935886] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935920] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:38784 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ca4d000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.935944] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.935987] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:38912 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ca2c000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936012] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936045] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:39040 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ca0b000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936068] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936102] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:39168 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c9ea000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936126] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936158] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:42 nsid:1 lba:39296 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c9c9000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936182] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936215] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:39424 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c9a8000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936240] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936272] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:39552 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c987000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936295] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936328] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:39680 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c966000 len:0x10000 
key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936365] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936399] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:46 nsid:1 lba:39808 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c945000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936423] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936457] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:47 nsid:1 lba:39936 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c924000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936480] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936514] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:40064 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c903000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936537] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936570] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:49 nsid:1 lba:40192 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c8e2000 len:0x10000 key:0x1c2200 00:16:22.181 [2024-09-27 15:21:23.936594] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.181 [2024-09-27 15:21:23.936634] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:40320 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c8c1000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.936659] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.936692] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:40448 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000c8a0000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.936716] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.936749] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:40576 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b6b5000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.936773] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.936806] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:40704 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b694000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.936830] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.936864] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:54 nsid:1 lba:40832 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b673000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 
15:21:23.936888] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.940961] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200013801880 was disconnected and freed. reset controller. 00:16:22.182 [2024-09-27 15:21:23.941008] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress. 00:16:22.182 [2024-09-27 15:21:23.941052] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:56 nsid:1 lba:32768 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010050000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941087] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941166] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:32896 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010071000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941200] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941246] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:58 nsid:1 lba:33024 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010092000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941278] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941324] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:33152 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000100b3000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941370] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941430] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:60 nsid:1 lba:33280 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000100d4000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941455] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941489] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:33408 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000100f5000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941518] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941552] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:62 nsid:1 lba:33536 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010116000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941576] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941609] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 lba:33664 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010137000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941633] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 
cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941666] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:64 nsid:1 lba:33792 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010158000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941690] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941723] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:1 lba:33920 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010179000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941747] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941780] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:1 lba:34048 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001019a000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941804] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941837] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:1 lba:34176 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000101bb000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941861] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941894] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:1 lba:34304 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000101dc000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941917] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.941951] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:1 lba:34432 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000101fd000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.941974] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942007] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:5 nsid:1 lba:34560 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001021e000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942031] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942064] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:6 nsid:1 lba:34688 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20001023f000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942087] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942120] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:7 nsid:1 lba:34816 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b718000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942155] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 
[2024-09-27 15:21:23.942188] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:8 nsid:1 lba:34944 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b739000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942211] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942242] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:9 nsid:1 lba:35072 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e016000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942265] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942295] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:10 nsid:1 lba:35200 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e037000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942318] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942384] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:11 nsid:1 lba:35328 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012e55000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942409] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942440] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:12 nsid:1 lba:35456 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012e34000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942463] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942493] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:13 nsid:1 lba:35584 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012e13000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942516] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942547] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:14 nsid:1 lba:35712 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012df2000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942570] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.182 [2024-09-27 15:21:23.942602] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:15 nsid:1 lba:35840 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012dd1000 len:0x10000 key:0x1c2200 00:16:22.182 [2024-09-27 15:21:23.942625] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942656] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:16 nsid:1 lba:35968 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200012db0000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.942679] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942710] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:17 nsid:1 lba:36096 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e3b2000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.942732] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942763] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:18 nsid:1 lba:36224 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e3d3000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.942786] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942820] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:36352 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e3f4000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.942843] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942876] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:20 nsid:1 lba:36480 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e415000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.942898] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942929] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:21 nsid:1 lba:36608 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e436000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.942952] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.942984] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:22 nsid:1 lba:36736 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e457000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943006] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943044] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:23 nsid:1 lba:36864 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e478000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943067] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943098] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:24 nsid:1 lba:36992 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e499000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943121] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943153] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:25 nsid:1 lba:37120 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e4ba000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943176] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943208] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:26 nsid:1 lba:37248 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e4db000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943230] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943262] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:27 nsid:1 lba:37376 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e4fc000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943285] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943316] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:28 nsid:1 lba:37504 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e51d000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943338] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943400] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:37632 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e53e000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943425] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943456] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:37760 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010aa0000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943481] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943513] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:37888 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000b820000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943536] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943568] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:32 nsid:1 lba:38016 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ceaf000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943590] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943621] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:38144 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ce8e000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943644] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943674] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:34 nsid:1 lba:38272 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ce6d000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943697] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943728] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:38400 len:128 
SGL KEYED DATA BLOCK ADDRESS 0x20000ce4c000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943750] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943781] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:38528 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ce2b000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943803] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943833] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:38656 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ce0a000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943855] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943886] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:38784 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000cde9000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943908] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943944] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:38912 len:128 SGL KEYED DATA BLOCK ADDRESS 0x2000131d0000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.943968] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.943998] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:39040 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e814000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944021] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944053] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:39168 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e835000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944079] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944110] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:42 nsid:1 lba:39296 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e856000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944132] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944163] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:39424 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e877000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944185] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944216] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:39552 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e898000 
len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944239] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944269] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:39680 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e8b9000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944292] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944323] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:46 nsid:1 lba:39808 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000e8da000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944353] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944384] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:47 nsid:1 lba:39936 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010cf2000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944406] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.183 [2024-09-27 15:21:23.944437] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:40064 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010cd1000 len:0x10000 key:0x1c2200 00:16:22.183 [2024-09-27 15:21:23.944459] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.184 [2024-09-27 15:21:23.944490] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:49 nsid:1 lba:40192 len:128 SGL KEYED DATA BLOCK ADDRESS 0x200010cb0000 len:0x10000 key:0x1c2200 00:16:22.184 [2024-09-27 15:21:23.944513] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.184 [2024-09-27 15:21:23.944544] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:40320 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ba72000 len:0x10000 key:0x1c2200 00:16:22.184 [2024-09-27 15:21:23.944566] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.184 [2024-09-27 15:21:23.944597] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:40448 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ba51000 len:0x10000 key:0x1c2200 00:16:22.184 [2024-09-27 15:21:23.944620] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.184 [2024-09-27 15:21:23.944651] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:40576 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000ba30000 len:0x10000 key:0x1c2200 00:16:22.184 [2024-09-27 15:21:23.944673] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0 00:16:22.184 [2024-09-27 15:21:23.944707] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:40704 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000d2cf000 len:0x10000 key:0x1c2200 00:16:22.184 
[2024-09-27 15:21:23.944731] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0
00:16:22.184 [2024-09-27 15:21:23.944762] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:54 nsid:1 lba:40832 len:128 SGL KEYED DATA BLOCK ADDRESS 0x20000d2ae000 len:0x10000 key:0x1c2200
00:16:22.184 [2024-09-27 15:21:23.944783] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:c3862000 sqhd:7250 p:0 m:0 dnr:0
00:16:22.184 [2024-09-27 15:21:23.968541] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x2000138015c0 was disconnected and freed. reset controller.
00:16:22.184 [2024-09-27 15:21:23.968563] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress.
00:16:22.184 [2024-09-27 15:21:23.968608] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode6] resetting controller
00:16:22.184 [2024-09-27 15:21:23.968676] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress.
00:16:22.184 [2024-09-27 15:21:23.968694] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress.
00:16:22.184 [2024-09-27 15:21:23.968708] bdev_nvme.c:3029:bdev_nvme_failover_ctrlr_unsafe: *NOTICE*: Unable to perform failover, already in progress.
00:16:22.184 task offset: 24576 on job bdev=Nvme10n1 fails
00:16:22.184
00:16:22.184                                                      Latency(us)
00:16:22.184 Device Information          : runtime(s)       IOPS      MiB/s     Fail/s       TO/s    Average        min         max
00:16:22.184 Job: Nvme1n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme1n1 ended in about 1.96 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme1n1                     :       1.96     152.79       9.55      32.70       0.00  341952.98    6582.09  1057694.05
00:16:22.184 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme2n1 ended in about 1.96 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme2n1                     :       1.96     146.60       9.16      32.69       0.00  350961.39   12195.39  1057694.05
00:16:22.184 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme3n1 ended in about 1.96 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme3n1                     :       1.96     145.51       9.09      32.68       0.00  350257.47   19945.74  1057694.05
00:16:22.184 Job: Nvme4n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme4n1 ended in about 1.96 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme4n1                     :       1.96     146.98       9.19      32.66       0.00  344842.08   27354.16  1057694.05
00:16:22.184 Job: Nvme5n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme5n1 ended in about 1.96 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme5n1                     :       1.96     130.59       8.16      32.65       0.00  376629.69   59267.34  1057694.05
00:16:22.184 Job: Nvme6n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme6n1 ended in about 1.91 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme6n1                     :       1.91     150.44       9.40      33.55       0.00  331569.78   36016.31  1064988.49
00:16:22.184 Job: Nvme7n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme7n1 ended in about 1.92 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme7n1                     :       1.92     150.24       9.39      33.39       0.00  329096.92   43766.65  1050399.61
00:16:22.184 Job: Nvme8n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme8n1 ended in about 1.92 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme8n1                     :       1.92     148.06       9.25      33.25       0.00  330630.63   51972.90  1050399.61
00:16:22.184 Job: Nvme9n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme9n1 ended in about 1.93 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme9n1                     :       1.93     132.45       8.28      33.11       0.00  359038.98   54024.46  1050399.61
00:16:22.184 Job: Nvme10n1 (Core Mask 0x1, workload: verify, depth: 64, IO size: 65536)
00:16:22.184 Job: Nvme10n1 ended in about 1.89 seconds with error
00:16:22.184 Verification LBA range: start 0x0 length 0x400
00:16:22.184 Nvme10n1                    :       1.89     101.63       6.35      33.88       0.00  434051.34   60179.14  1064988.49
00:16:22.184 ===================================================================================================================
00:16:22.184 Total                       :              1405.30      87.83     330.55       0.00  352389.23    6582.09  1064988.49
00:16:22.184 [2024-09-27 15:21:23.995462] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:16:22.184 [2024-09-27 15:21:23.995502] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode7] resetting controller
00:16:22.184 [2024-09-27 15:21:23.995521] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode8] resetting controller
00:16:22.184 [2024-09-27 15:21:23.995534] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode9] resetting controller
00:16:22.184 [2024-09-27 15:21:24.001315] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8)
00:16:22.184 [2024-09-27 15:21:24.001347] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74
00:16:22.184 [2024-09-27 15:21:24.001356] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019a89000
00:16:22.184 [2024-09-27 15:21:24.005674] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8)
00:16:22.184 [2024-09-27 15:21:24.005698] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74
00:16:22.184 [2024-09-27 15:21:24.005707] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019abf180
00:16:22.184 [2024-09-27 15:21:24.006590] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8)
00:16:22.184 [2024-09-27 15:21:24.006607] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74
00:16:22.184 [2024-09-27 15:21:24.006615] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019a9a040
00:16:22.184 [2024-09-27 15:21:24.006699] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8)
00:16:22.184 [2024-09-27 15:21:24.006712] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74
00:16:22.184 [2024-09-27
15:21:24.006720] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019a9a640 00:16:22.184 [2024-09-27 15:21:24.006826] nvme_rdma.c: 542:nvme_rdma_validate_cm_event: *ERROR*: Expected RDMA_CM_EVENT_ESTABLISHED but received RDMA_CM_EVENT_REJECTED (8) from CM event channel (status = 8) 00:16:22.184 [2024-09-27 15:21:24.006838] nvme_rdma.c:1088:nvme_rdma_connect_established: *ERROR*: RDMA connect error -74 00:16:22.184 [2024-09-27 15:21:24.006845] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x200019abf4c0 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@138 -- # NOT wait 1806870 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@650 -- # local es=0 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@652 -- # valid_exec_arg wait 1806870 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@638 -- # local arg=wait 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@642 -- # type -t wait 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:22.752 15:21:24 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@653 -- # wait 1806870 00:16:23.321 [2024-09-27 15:21:24.892003] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.321 [2024-09-27 15:21:24.892063] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state. 00:16:23.321 [2024-09-27 15:21:24.893618] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.321 [2024-09-27 15:21:24.893662] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode2] in failed state. 00:16:23.321 [2024-09-27 15:21:24.895077] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.321 [2024-09-27 15:21:24.895091] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode3] in failed state. 00:16:23.321 [2024-09-27 15:21:24.896289] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.321 [2024-09-27 15:21:24.896318] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode4] in failed state. 00:16:23.321 [2024-09-27 15:21:24.897738] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.321 [2024-09-27 15:21:24.897779] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode5] in failed state. 
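The shell trace above shows shutdown.sh wrapping "wait 1806870" (the bdevperf process) in the harness's NOT helper, and the exit-status handling a little further down (es=255 -> es=127 -> es=1 -> (( !es == 0 ))) resolves that call. A minimal sketch of the idiom, assuming a simplified stand-in for the real autotest_common.sh implementation rather than its exact source:

NOT() {
    local es=0
    "$@" || es=$?                 # run the wrapped command (here: wait on the bdevperf pid)
    (( es > 128 )) && es=127      # fold "killed by signal" statuses, mirroring the trace above
    (( es != 0 ))                 # NOT succeeds only when the wrapped command failed
}
# Usage as in this test (perfpid is a stand-in for the bdevperf pid, 1806870 above):
# NOT wait "$perfpid"   - the shutdown is considered successful only if bdevperf
#                         exits with an error once the target side is torn down.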
00:16:23.321 [2024-09-27 15:21:24.897877] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode1] Ctrlr is in error state 00:16:23.321 [2024-09-27 15:21:24.897909] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode1] controller reinitialization failed 00:16:23.321 [2024-09-27 15:21:24.897941] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] already in failed state 00:16:23.321 [2024-09-27 15:21:24.897979] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode2] Ctrlr is in error state 00:16:23.321 [2024-09-27 15:21:24.898010] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode2] controller reinitialization failed 00:16:23.321 [2024-09-27 15:21:24.898039] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode2] already in failed state 00:16:23.321 [2024-09-27 15:21:24.898074] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode3] Ctrlr is in error state 00:16:23.321 [2024-09-27 15:21:24.898103] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode3] controller reinitialization failed 00:16:23.321 [2024-09-27 15:21:24.898133] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode3] already in failed state 00:16:23.321 [2024-09-27 15:21:24.898166] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode4] Ctrlr is in error state 00:16:23.321 [2024-09-27 15:21:24.898196] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode4] controller reinitialization failed 00:16:23.321 [2024-09-27 15:21:24.898225] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode4] already in failed state 00:16:23.321 [2024-09-27 15:21:24.898261] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode5] Ctrlr is in error state 00:16:23.321 [2024-09-27 15:21:24.898290] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode5] controller reinitialization failed 00:16:23.321 [2024-09-27 15:21:24.898319] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode5] already in failed state 00:16:23.321 [2024-09-27 15:21:24.898367] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.321 [2024-09-27 15:21:24.898385] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.321 [2024-09-27 15:21:24.898395] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.321 [2024-09-27 15:21:24.898405] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.321 [2024-09-27 15:21:24.898414] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.321 [2024-09-27 15:21:25.005186] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.321 [2024-09-27 15:21:25.005241] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode10] in failed state. 
00:16:23.321 [2024-09-27 15:21:25.005331] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode10] Ctrlr is in error state 00:16:23.321 [2024-09-27 15:21:25.005399] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode10] controller reinitialization failed 00:16:23.321 [2024-09-27 15:21:25.005429] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode10] already in failed state 00:16:23.321 [2024-09-27 15:21:25.005493] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.321 [2024-09-27 15:21:25.011490] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.322 [2024-09-27 15:21:25.011554] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode6] in failed state. 00:16:23.322 [2024-09-27 15:21:25.013086] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.322 [2024-09-27 15:21:25.013121] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode7] in failed state. 00:16:23.322 [2024-09-27 15:21:25.014629] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.322 [2024-09-27 15:21:25.014662] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode8] in failed state. 00:16:23.322 [2024-09-27 15:21:25.015942] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:23.322 [2024-09-27 15:21:25.015974] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode9] in failed state. 00:16:23.322 [2024-09-27 15:21:25.015995] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode6] Ctrlr is in error state 00:16:23.322 [2024-09-27 15:21:25.016019] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode6] controller reinitialization failed 00:16:23.322 [2024-09-27 15:21:25.016042] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode6] already in failed state 00:16:23.322 [2024-09-27 15:21:25.016152] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 
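Once the expected bdevperf failure is confirmed, the stoptarget/nvmftestfini trace that follows removes the bdevperf state file and generated configs, unloads the initiator kernel modules, checks for the (already exited) target pid, and flushes the mlx5 test interfaces and SPDK_NVMF iptables rules. Condensed into a sketch from the commands visible below (the nvmfpid variable is a stand-in; the actual pid in this run is 1806564):

rm -f ./local-job0-0-verify.state
rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevperf.conf
rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt
sync
modprobe -v -r nvme-rdma                               # drop initiator-side transport modules
modprobe -v -r nvme-fabrics
kill -0 "$nvmfpid" 2>/dev/null \
    || echo "Process with pid $nvmfpid is not found"   # target app already gone in this run
ip addr flush dev mlx_0_1                              # remove the test IPs from both mlx5 ports
ip addr flush dev mlx_0_0
iptables-save | grep -v SPDK_NVMF | iptables-restore   # strip SPDK_NVMF rules, keep the rest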
00:16:23.322 [2024-09-27 15:21:25.016184] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode7] Ctrlr is in error state 00:16:23.322 [2024-09-27 15:21:25.016207] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode7] controller reinitialization failed 00:16:23.322 [2024-09-27 15:21:25.016230] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode7] already in failed state 00:16:23.322 [2024-09-27 15:21:25.016257] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode8] Ctrlr is in error state 00:16:23.322 [2024-09-27 15:21:25.016279] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode8] controller reinitialization failed 00:16:23.322 [2024-09-27 15:21:25.016301] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode8] already in failed state 00:16:23.322 [2024-09-27 15:21:25.016328] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode9] Ctrlr is in error state 00:16:23.322 [2024-09-27 15:21:25.016363] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode9] controller reinitialization failed 00:16:23.322 [2024-09-27 15:21:25.016393] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode9] already in failed state 00:16:23.322 [2024-09-27 15:21:25.016478] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.322 [2024-09-27 15:21:25.016506] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.322 [2024-09-27 15:21:25.016532] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@653 -- # es=255 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@662 -- # es=127 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@663 -- # case "$es" in 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@670 -- # es=1 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@140 -- # stoptarget 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@42 -- # rm -f ./local-job0-0-verify.state 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@43 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevperf.conf 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@44 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- target/shutdown.sh@46 -- # nvmftestfini 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@331 -- # nvmfcleanup 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@99 
-- # sync 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@102 -- # set +e 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@103 -- # for i in {1..20} 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:16:23.582 rmmod nvme_rdma 00:16:23.582 rmmod nvme_fabrics 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@106 -- # set -e 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@107 -- # return 0 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@332 -- # '[' -n 1806564 ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@333 -- # killprocess 1806564 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@950 -- # '[' -z 1806564 ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@954 -- # kill -0 1806564 00:16:23.582 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1806564) - No such process 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@977 -- # echo 'Process with pid 1806564 is not found' 00:16:23.582 Process with pid 1806564 is not found 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@338 -- # nvmf_fini 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@264 -- # local dev 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@267 -- # remove_target_ns 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@268 -- # delete_main_bridge 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@130 -- # return 0 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@271 -- # [[ -e 
/sys/class/net/mlx_0_1/address ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@41 -- # _dev=0 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@41 -- # dev_map=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/setup.sh@284 -- # iptr 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@538 -- # iptables-save 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- nvmf/common.sh@538 -- # iptables-restore 00:16:23.582 00:16:23.582 real 0m6.435s 00:16:23.582 user 0m19.645s 00:16:23.582 sys 0m1.544s 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc3 -- common/autotest_common.sh@10 -- # set +x 00:16:23.582 ************************************ 00:16:23.582 END TEST nvmf_shutdown_tc3 00:16:23.582 ************************************ 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@166 -- # [[ mlx5 == \e\8\1\0 ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@167 -- # run_test nvmf_shutdown_tc4 nvmf_shutdown_tc4 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- 
common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@10 -- # set +x 00:16:23.582 ************************************ 00:16:23.582 START TEST nvmf_shutdown_tc4 00:16:23.582 ************************************ 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@1125 -- # nvmf_shutdown_tc4 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@145 -- # starttarget 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@16 -- # nvmftestinit 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@292 -- # prepare_net_devs 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@254 -- # local -g is_hw=no 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@256 -- # remove_target_ns 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@125 -- # xtrace_disable 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@131 -- # pci_devs=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@131 -- # local -a pci_devs 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@132 -- # pci_net_devs=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@133 -- # pci_drivers=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@133 -- # local -A pci_drivers 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@135 -- # net_devs=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- 
nvmf/common.sh@135 -- # local -ga net_devs 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@136 -- # e810=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@136 -- # local -ga e810 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@137 -- # x722=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@137 -- # local -ga x722 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@138 -- # mlx=() 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@138 -- # local -ga mlx 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:16:23.582 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:23.583 15:21:25 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:16:23.583 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:16:23.583 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:16:23.583 Found net devices under 0000:18:00.0: mlx_0_0 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- 
nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:16:23.583 Found net devices under 0000:18:00.1: mlx_0_1 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@249 -- # get_rdma_if_list 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@75 -- # rdma_devs=() 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:16:23.583 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@89 -- # continue 2 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@87 -- # [[ 
mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@89 -- # continue 2 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@258 -- # is_hw=yes 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@61 -- # uname 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@65 -- # modprobe ib_cm 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@66 -- # modprobe ib_core 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@67 -- # modprobe ib_umad 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@69 -- # modprobe iw_cm 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@27 -- # local -gA dev_map 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@28 -- # local -g _dev 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:23.843 15:21:25 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@44 -- # ips=() 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@58 -- # key_initiator=target1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@11 -- # local val=167772161 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- 
nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:16:23.843 10.0.0.1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@11 -- # local val=167772162 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:16:23.843 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:16:23.843 10.0.0.2 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 
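For a phy NIC pair, the interface setup traced above reduces to a handful of iproute2 commands; the following is a minimal sketch using the device names and the 10.0.0.0/24 pool from this run, not the literal script text:

# assign the pair's addresses and record each one in the ifalias file,
# which get_ip_address reads back later in the run
ip addr add 10.0.0.1/24 dev mlx_0_0
echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias
ip addr add 10.0.0.2/24 dev mlx_0_1
echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
# bring both links up before the ping checks that follow
ip link set mlx_0_0 up
ip link set mlx_0_1 up

Note that because the transport is rdma, setup_interface_pair remaps the initiator key to target1 (setup.sh@58), which is why NVMF_FIRST_INITIATOR_IP later resolves to 10.0.0.2, the same ifalias as target0.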
00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@38 -- # ping_ips 1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:16:23.844 
15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:23.844 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:16:23.844 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.043 ms 00:16:23.844 00:16:23.844 --- 10.0.0.2 ping statistics --- 00:16:23.844 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:23.844 rtt min/avg/max/mdev = 0.043/0.043/0.043/0.000 ms 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:23.844 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:16:23.844 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:16:23.844 00:16:23.844 --- 10.0.0.2 ping statistics --- 00:16:23.844 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:23.844 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@98 -- # (( pair++ )) 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@266 -- # return 0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:23.844 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:16:23.845 15:21:25 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@107 -- # local dev=target0 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:23.845 15:21:25 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:23.845 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@107 -- # local dev=target1 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:16:24.105 15:21:25 
nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@19 -- # nvmfappstart -m 0x1E 00:16:24.105 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@324 -- # nvmfpid=1807639 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@325 -- # waitforlisten 1807639 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@831 -- # '[' -z 1807639 ']' 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:24.106 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:24.106 15:21:25 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:24.106 [2024-09-27 15:21:25.791196] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:16:24.106 [2024-09-27 15:21:25.791261] [ DPDK EAL parameters: nvmf -c 0x1E --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:24.106 [2024-09-27 15:21:25.875933] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:16:24.365 [2024-09-27 15:21:25.965952] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:16:24.365 [2024-09-27 15:21:25.965994] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:16:24.365 [2024-09-27 15:21:25.966004] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:16:24.365 [2024-09-27 15:21:25.966012] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:16:24.365 [2024-09-27 15:21:25.966019] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:16:24.365 [2024-09-27 15:21:25.966135] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:16:24.365 [2024-09-27 15:21:25.966237] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:16:24.365 [2024-09-27 15:21:25.966338] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:16:24.365 [2024-09-27 15:21:25.966340] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@864 -- # return 0 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@21 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:24.932 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:24.932 [2024-09-27 15:21:26.730529] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x6027a0/0x606c90) succeed. 00:16:24.932 [2024-09-27 15:21:26.741037] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x603de0/0x648330) succeed. 
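Condensed, the target bring-up traced above amounts to the sketch below; the binary path, core mask, and transport options are copied from this run, while waitforlisten and rpc_cmd are the autotest helpers, so treat this as an illustration rather than the exact script text:

# start the NVMe-oF target: -m 0x1E pins reactors to cores 1-4 (matching the
# "Reactor started on core 1..4" notices), -e 0xFFFF enables all tracepoint groups
/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1E &
nvmfpid=$!
waitforlisten "$nvmfpid"          # block until /var/tmp/spdk.sock answers RPCs
# create the RDMA transport before any subsystems or listeners are added
rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192

The loop over num_subsystems then appends per-subsystem RPCs into rpcs.txt, and the single rpc_cmd call at shutdown.sh@36 replays them, producing Malloc1-Malloc10 and the 10.0.0.2:4420 RDMA listener seen below.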
00:16:25.192 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@23 -- # num_subsystems=({1..10}) 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@25 -- # timing_enter create_subsystems 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@27 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@28 -- # for i in "${num_subsystems[@]}" 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@29 -- # cat 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- 
target/shutdown.sh@36 -- # rpc_cmd 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:25.193 15:21:26 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:25.193 Malloc1 00:16:25.193 [2024-09-27 15:21:26.966721] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:16:25.193 Malloc2 00:16:25.193 Malloc3 00:16:25.453 Malloc4 00:16:25.453 Malloc5 00:16:25.453 Malloc6 00:16:25.453 Malloc7 00:16:25.453 Malloc8 00:16:25.712 Malloc9 00:16:25.712 Malloc10 00:16:25.712 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:25.713 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@37 -- # timing_exit create_subsystems 00:16:25.713 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:25.713 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:25.713 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@149 -- # perfpid=1807874 00:16:25.713 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@150 -- # sleep 5 00:16:25.713 15:21:27 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@148 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 128 -o 45056 -O 4096 -w randwrite -t 20 -r 'trtype:rdma adrfam:IPV4 traddr:10.0.0.2 trsvcid:4420' -P 4 00:16:25.713 [2024-09-27 15:21:27.531585] subsystem.c:1641:spdk_nvmf_subsystem_listener_allowed: *WARNING*: Allowing connection to discovery subsystem on RDMA/10.0.0.2/4420, even though this listener was not added to the discovery subsystem. This behavior is deprecated and will be removed in a future release. 
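What follows is the core of shutdown_tc4: a long random-write run is launched against the listener created above and the target is then killed out from under it, so the flood of completion errors below is a direct consequence of that forced shutdown rather than an unrelated failure. A rough sketch of the sequence, using the command line, PIDs, and helpers visible in this run:

# drive queue-depth-128, 45056-byte random writes for 20s at the RDMA listener
/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf \
    -q 128 -o 45056 -O 4096 -w randwrite -t 20 \
    -r 'trtype:rdma adrfam:IPV4 traddr:10.0.0.2 trsvcid:4420' -P 4 &
perfpid=$!
sleep 5
# killprocess: shut the target down while perf still has I/O outstanding; the
# queued writes then complete with errors and keep-alives start failing
kill "$nvmfpid" && wait "$nvmfpid"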
00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@152 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; kill -9 $perfpid || true; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@155 -- # killprocess 1807639 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@950 -- # '[' -z 1807639 ']' 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@954 -- # kill -0 1807639 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@955 -- # uname 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1807639 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1807639' 00:16:30.988 killing process with pid 1807639 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@969 -- # kill 1807639 00:16:30.988 15:21:32 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@974 -- # wait 1807639 00:16:30.988 NVMe io qpair process completion error 00:16:30.988 NVMe io qpair process completion error 00:16:30.988 NVMe io qpair process completion error 00:16:30.988 NVMe io qpair process completion error 00:16:30.988 NVMe io qpair process completion error 00:16:30.988 NVMe io qpair process completion error 00:16:30.988 NVMe io qpair process completion error 00:16:31.557 15:21:33 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@156 -- # sleep 1 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting I/O failed: -6 00:16:31.818 Write completed with error (sct=0, sc=8) 00:16:31.818 starting 
I/O failed: -6
00:16:31.818 Write completed with error (sct=0, sc=8)
00:16:31.818 starting I/O failed: -6
[the two entries above repeat, interleaved, for every write still outstanding on the subsystems' I/O qpairs between 00:16:31.818 and 00:16:32.084 while the targets are torn down]
00:16:31.819 [2024-09-27 15:21:33.601817] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2016-06.io.spdk:cnode3] Submitting Keep Alive failed
00:16:31.819 [2024-09-27 15:21:33.612572] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2016-06.io.spdk:cnode10] Submitting Keep Alive failed
00:16:31.820 [2024-09-27 15:21:33.623272] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2016-06.io.spdk:cnode1] Submitting Keep Alive failed
00:16:31.821 [2024-09-27 15:21:33.635280] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2016-06.io.spdk:cnode2] Submitting Keep Alive failed
00:16:31.822 [2024-09-27 15:21:33.649789] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2016-06.io.spdk:cnode5] Submitting Keep Alive failed
00:16:31.823 [2024-09-27 15:21:33.662909] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2016-06.io.spdk:cnode4] Submitting Keep Alive failed
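[The burst above is easier to read in aggregate. As an illustration only, not part of the test suite, the failed completions and per-subsystem keep-alive failures can be tallied from a saved copy of this console output; "console.log" is a placeholder filename, and the patterns are taken verbatim from the messages above.]

    # Count failed write completions and aborted submissions in the saved log.
    grep -c 'Write completed with error (sct=0, sc=8)' console.log
    grep -c 'starting I/O failed: -6' console.log
    # List which subsystems reported a failed Keep Alive submission, with counts.
    grep -o 'nqn\.2016-06\.io\.spdk:cnode[0-9]*] Submitting Keep Alive failed' console.log | sort | uniq -c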
00:16:32.084 Write completed with error (sct=0, sc=8)
[the entry above repeats for the last writes draining at 00:16:32.084]
00:16:32.084 NVMe io qpair process completion error
[the entry above is reported four times, once per remaining I/O qpair]
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@158 -- # NOT wait 1807874
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@650 -- # local es=0
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@652 -- # valid_exec_arg wait 1807874
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@638 -- # local arg=wait
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@642 -- # type -t wait
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:16:32.344 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@653 -- # wait 1807874
00:16:32.914 [2024-09-27 15:21:34.665117] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.914 [2024-09-27 15:21:34.665183] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode3] in failed state.
00:16:32.914 [2024-09-27 15:21:34.667624] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.914 [2024-09-27 15:21:34.667670] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode10] in failed state.
00:16:32.914 [2024-09-27 15:21:34.670088] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.914 [2024-09-27 15:21:34.670131] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode6] in failed state.
00:16:32.914 [2024-09-27 15:21:34.672190] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.914 [2024-09-27 15:21:34.672232] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state.
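[For readers unfamiliar with the shell trace above: NOT and valid_exec_arg come from common/autotest_common.sh, and the traced lines show the test asserting that "wait 1807874" (waiting on the target process being torn down) returns a non-zero status. Below is a minimal bash sketch of that pattern, reconstructed from the traced lines only; it is not a verbatim copy of the helper and may differ from the real implementation in detail.]

    # Sketch only: reconstructed from the trace; the real helpers live in
    # common/autotest_common.sh and may differ in detail.
    valid_exec_arg() {
        local arg=$1
        # Accept only something bash can actually run (builtin, function, binary).
        case "$(type -t "$arg")" in
            builtin | function | file) ;;
            *) return 1 ;;
        esac
    }

    NOT() {
        local es=0
        valid_exec_arg "$@" || return 1
        "$@" || es=$?
        # Invert the result: NOT succeeds only if the wrapped command failed.
        ((es != 0))
    }

    # As in the trace: the check at target/shutdown.sh@158 passes only if the
    # waited-on PID (here 1807874) exits with a non-zero status.
    # NOT wait 1807874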
00:16:32.914 Write completed with error (sct=0, sc=8)
[the entry above repeats through 00:16:32.914-00:16:32.916 for the writes still queued on each controller as it is failed]
00:16:32.915 [2024-09-27 15:21:34.674837] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.915 [2024-09-27 15:21:34.674878] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode2] in failed state.
00:16:32.915 [2024-09-27 15:21:34.677420] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.915 [2024-09-27 15:21:34.677462] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode7] in failed state.
00:16:32.915 [2024-09-27 15:21:34.680094] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.915 [2024-09-27 15:21:34.680136] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode9] in failed state.
00:16:32.916 [2024-09-27 15:21:34.682675] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.916 [2024-09-27 15:21:34.682750] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode8] in failed state.
00:16:32.916 [2024-09-27 15:21:34.685580] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.916 [2024-09-27 15:21:34.685629] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode5] in failed state.
00:16:32.916 [2024-09-27 15:21:34.688618] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:16:32.916 [2024-09-27 15:21:34.688664] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode4] in failed state.
00:16:32.916 Write completed with error (sct=0, sc=8)
[failed-write completions continue to drain on the remaining qpairs]
completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 
00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.917 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error 
(sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Write completed with error (sct=0, sc=8) 00:16:32.918 Initializing NVMe Controllers 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode3 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode10 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode6 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode2 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode7 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode9 00:16:32.918 Controller IO queue size 128, less than required. 00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode8 00:16:32.918 Controller IO queue size 128, less than required. 
00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver.
00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode5
00:16:32.918 Controller IO queue size 128, less than required.
00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver.
00:16:32.918 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode4
00:16:32.918 Controller IO queue size 128, less than required.
00:16:32.918 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver.
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode3) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode10) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode6) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode2) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode7) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode9) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode8) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode5) NSID 1 with lcore 0
00:16:32.918 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode4) NSID 1 with lcore 0
00:16:32.918 Initialization complete. Launching workers.
00:16:32.918 ========================================================
00:16:32.918 Latency(us)
00:16:32.918 Device Information : IOPS MiB/s Average min max
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode3) NSID 1 from core 0: 1452.14 62.40 87947.05 16552.83 1230969.33
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode10) NSID 1 from core 0: 1444.59 62.07 87677.31 35347.21 1181053.73
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode6) NSID 1 from core 0: 1468.27 63.09 101849.60 124.08 2205772.66
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 1440.05 61.88 88033.80 32489.24 1209030.36
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode2) NSID 1 from core 0: 1446.77 62.17 87713.95 37683.90 1200718.51
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode7) NSID 1 from core 0: 1474.32 63.35 101489.70 114.94 2166244.91
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode9) NSID 1 from core 0: 1490.11 64.03 100538.25 122.77 2065888.66
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode8) NSID 1 from core 0: 1478.68 63.54 101429.65 119.55 2127547.51
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode5) NSID 1 from core 0: 1447.44 62.19 87653.91 129.87 1194440.36
00:16:32.918 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode4) NSID 1 from core 0: 1440.55 61.90 88232.91 86.50 1220652.65
00:16:32.918 ========================================================
00:16:32.918 Total : 14582.92 626.61 93327.53 86.50 2205772.66
00:16:32.918 
00:16:32.918 /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf: errors occurred
00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- 
common/autotest_common.sh@653 -- # es=1 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@159 -- # stoptarget 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@42 -- # rm -f ./local-job0-0-verify.state 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@43 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/bdevperf.conf 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@44 -- # rm -rf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/target/rpcs.txt 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- target/shutdown.sh@46 -- # nvmftestfini 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@331 -- # nvmfcleanup 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@99 -- # sync 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@102 -- # set +e 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@103 -- # for i in {1..20} 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:16:33.179 rmmod nvme_rdma 00:16:33.179 rmmod nvme_fabrics 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@106 -- # set -e 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@107 -- # return 0 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@332 -- # '[' -n 1807639 ']' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@333 -- # killprocess 1807639 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@950 -- # '[' -z 1807639 ']' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@954 -- # kill -0 1807639 00:16:33.179 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1807639) - No such process 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@977 -- # echo 'Process with pid 1807639 is not found' 00:16:33.179 Process with pid 1807639 is not found 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:16:33.179 
15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@338 -- # nvmf_fini 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@264 -- # local dev 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@267 -- # remove_target_ns 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@268 -- # delete_main_bridge 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@130 -- # return 0 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@41 -- # _dev=0 
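In plainer terms, the nvmftestfini/nvmf_fini teardown traced above amounts to the sequence below. This is a simplified sketch that hardcodes the interface names (mlx_0_0, mlx_0_1) and the target PID (1807639) reported earlier in this log; it is not the harness's actual helper functions.

    set +e                                   # module unload may fail if nothing is loaded
    modprobe -v -r nvme-rdma
    modprobe -v -r nvme-fabrics
    set -e
    pid=1807639                              # nvmf target PID reported by killprocess above
    if kill -0 "$pid" 2>/dev/null; then      # only signal it if it is still running
        kill "$pid"
    fi
    for dev in mlx_0_1 mlx_0_0; do           # drop the test IPs from the RDMA ports
        [ -e "/sys/class/net/$dev/address" ] && ip addr flush dev "$dev"
    done
    iptables-save | grep -v SPDK_NVMF | iptables-restore   # keep firewall rules, minus SPDK's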
00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@41 -- # dev_map=() 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/setup.sh@284 -- # iptr 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@538 -- # iptables-save 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:16:33.179 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- nvmf/common.sh@538 -- # iptables-restore 00:16:33.180 00:16:33.180 real 0m9.459s 00:16:33.180 user 0m34.868s 00:16:33.180 sys 0m1.517s 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown.nvmf_shutdown_tc4 -- common/autotest_common.sh@10 -- # set +x 00:16:33.180 ************************************ 00:16:33.180 END TEST nvmf_shutdown_tc4 00:16:33.180 ************************************ 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- target/shutdown.sh@170 -- # trap - SIGINT SIGTERM EXIT 00:16:33.180 00:16:33.180 real 0m36.611s 00:16:33.180 user 1m50.308s 00:16:33.180 sys 0m11.446s 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra.nvmf_shutdown -- common/autotest_common.sh@10 -- # set +x 00:16:33.180 ************************************ 00:16:33.180 END TEST nvmf_shutdown 00:16:33.180 ************************************ 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra -- nvmf/nvmf_target_extra.sh@68 -- # trap - SIGINT SIGTERM EXIT 00:16:33.180 00:16:33.180 real 7m42.303s 00:16:33.180 user 18m21.439s 00:16:33.180 sys 2m14.633s 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:33.180 15:21:34 nvmf_rdma.nvmf_target_extra -- common/autotest_common.sh@10 -- # set +x 00:16:33.180 ************************************ 00:16:33.180 END TEST nvmf_target_extra 00:16:33.180 ************************************ 00:16:33.180 15:21:35 nvmf_rdma -- nvmf/nvmf.sh@12 -- # run_test nvmf_host /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_host.sh --transport=rdma 00:16:33.180 15:21:35 nvmf_rdma -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:16:33.180 15:21:35 nvmf_rdma -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:33.180 15:21:35 nvmf_rdma -- common/autotest_common.sh@10 -- # set +x 00:16:33.440 ************************************ 00:16:33.440 START TEST nvmf_host 00:16:33.440 ************************************ 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_host.sh --transport=rdma 00:16:33.440 * Looking for test storage... 
00:16:33.440 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1681 -- # lcov --version 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@333 -- # local ver1 ver1_l 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@334 -- # local ver2 ver2_l 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@336 -- # IFS=.-: 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@336 -- # read -ra ver1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@337 -- # IFS=.-: 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@337 -- # read -ra ver2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@338 -- # local 'op=<' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@340 -- # ver1_l=2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@341 -- # ver2_l=1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@344 -- # case "$op" in 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@345 -- # : 1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@364 -- # (( v = 0 )) 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@365 -- # decimal 1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@353 -- # local d=1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@355 -- # echo 1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@365 -- # ver1[v]=1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@366 -- # decimal 2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@353 -- # local d=2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@355 -- # echo 2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@366 -- # ver2[v]=2 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@368 -- # return 0 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:16:33.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.440 --rc genhtml_branch_coverage=1 00:16:33.440 --rc genhtml_function_coverage=1 00:16:33.440 --rc genhtml_legend=1 00:16:33.440 --rc geninfo_all_blocks=1 00:16:33.440 --rc geninfo_unexecuted_blocks=1 00:16:33.440 00:16:33.440 ' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 
00:16:33.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.440 --rc genhtml_branch_coverage=1 00:16:33.440 --rc genhtml_function_coverage=1 00:16:33.440 --rc genhtml_legend=1 00:16:33.440 --rc geninfo_all_blocks=1 00:16:33.440 --rc geninfo_unexecuted_blocks=1 00:16:33.440 00:16:33.440 ' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:16:33.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.440 --rc genhtml_branch_coverage=1 00:16:33.440 --rc genhtml_function_coverage=1 00:16:33.440 --rc genhtml_legend=1 00:16:33.440 --rc geninfo_all_blocks=1 00:16:33.440 --rc geninfo_unexecuted_blocks=1 00:16:33.440 00:16:33.440 ' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:16:33.440 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.440 --rc genhtml_branch_coverage=1 00:16:33.440 --rc genhtml_function_coverage=1 00:16:33.440 --rc genhtml_legend=1 00:16:33.440 --rc geninfo_all_blocks=1 00:16:33.440 --rc geninfo_unexecuted_blocks=1 00:16:33.440 00:16:33.440 ' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@7 -- # uname -s 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@15 -- # shopt -s extglob 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.440 15:21:35 nvmf_rdma.nvmf_host -- paths/export.sh@5 -- # export PATH 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@50 -- # : 0 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:16:33.441 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/common.sh@54 -- # have_pci_nics=0 
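The lcov version gate that keeps appearing in these traces (lt 1.15 2 going through cmp_versions in scripts/common.sh) is just a field-by-field numeric comparison. A minimal stand-in is shown below, assuming plain numeric fields and ignoring the extra comparison operators the real helper supports.

    version_lt() {                       # true (0) when $1 is an older version than $2
        local IFS=.-:                    # split on the same separators the trace shows
        local -a a=($1) b=($2)
        local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
        for (( i = 0; i < n; i++ )); do
            local x=${a[i]:-0} y=${b[i]:-0}
            (( x < y )) && return 0
            (( x > y )) && return 1
        done
        return 1                         # equal versions are not "less than"
    }

    version_lt 1.15 2 && echo 'lcov is pre-2.0: keep the --rc lcov_branch_coverage=1 style flags'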
00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@11 -- # trap 'exit 1' SIGINT SIGTERM EXIT 00:16:33.441 15:21:35 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@13 -- # TEST_ARGS=("$@") 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@15 -- # [[ 0 -eq 0 ]] 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@16 -- # run_test nvmf_aer /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/aer.sh --transport=rdma 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:16:33.701 ************************************ 00:16:33.701 START TEST nvmf_aer 00:16:33.701 ************************************ 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/aer.sh --transport=rdma 00:16:33.701 * Looking for test storage... 00:16:33.701 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1681 -- # lcov --version 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@333 -- # local ver1 ver1_l 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@334 -- # local ver2 ver2_l 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@336 -- # IFS=.-: 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@336 -- # read -ra ver1 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@337 -- # IFS=.-: 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@337 -- # read -ra ver2 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@338 -- # local 'op=<' 00:16:33.701 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@340 -- # ver1_l=2 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@341 -- # ver2_l=1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@344 -- # case "$op" in 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@345 -- # : 1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@364 -- # (( v = 0 )) 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@365 -- # decimal 1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@353 -- # local d=1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@355 -- # echo 1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@365 -- # ver1[v]=1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@366 -- # decimal 2 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@353 -- # local d=2 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@355 -- # echo 2 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@366 -- # ver2[v]=2 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@368 -- # return 0 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:16:33.702 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.702 --rc genhtml_branch_coverage=1 00:16:33.702 --rc genhtml_function_coverage=1 00:16:33.702 --rc genhtml_legend=1 00:16:33.702 --rc geninfo_all_blocks=1 00:16:33.702 --rc geninfo_unexecuted_blocks=1 00:16:33.702 00:16:33.702 ' 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:16:33.702 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.702 --rc genhtml_branch_coverage=1 00:16:33.702 --rc genhtml_function_coverage=1 00:16:33.702 --rc genhtml_legend=1 00:16:33.702 --rc geninfo_all_blocks=1 00:16:33.702 --rc geninfo_unexecuted_blocks=1 00:16:33.702 00:16:33.702 ' 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:16:33.702 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.702 --rc genhtml_branch_coverage=1 00:16:33.702 --rc genhtml_function_coverage=1 00:16:33.702 --rc genhtml_legend=1 00:16:33.702 --rc geninfo_all_blocks=1 00:16:33.702 --rc geninfo_unexecuted_blocks=1 00:16:33.702 00:16:33.702 ' 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:16:33.702 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:33.702 --rc genhtml_branch_coverage=1 00:16:33.702 --rc genhtml_function_coverage=1 00:16:33.702 --rc genhtml_legend=1 00:16:33.702 --rc geninfo_all_blocks=1 00:16:33.702 --rc geninfo_unexecuted_blocks=1 00:16:33.702 00:16:33.702 ' 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@7 -- # uname -s 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@15 -- # shopt -s extglob 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- paths/export.sh@5 -- # export PATH 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:16:33.702 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@50 -- # : 0 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:16:33.963 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@54 -- # have_pci_nics=0 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@11 -- # nvmftestinit 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@292 -- # prepare_net_devs 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@254 -- # local -g is_hw=no 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@256 -- # remove_target_ns 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer 
-- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@125 -- # xtrace_disable 00:16:33.963 15:21:35 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@131 -- # pci_devs=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@131 -- # local -a pci_devs 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@132 -- # pci_net_devs=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@133 -- # pci_drivers=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@133 -- # local -A pci_drivers 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@135 -- # net_devs=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@135 -- # local -ga net_devs 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@136 -- # e810=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@136 -- # local -ga e810 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@137 -- # x722=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@137 -- # local -ga x722 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@138 -- # mlx=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@138 -- # local -ga mlx 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 
00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:16:40.537 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:16:40.537 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:16:40.537 Found net devices under 0000:18:00.0: mlx_0_0 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 
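(Aside, not part of the original trace: the "Found net devices under 0000:18:00.0: mlx_0_0" lines come from globbing the sysfs net/ directory of each matched PCI function. A minimal standalone sketch of that lookup, assuming a hypothetical $pci variable holding one of the addresses printed above:)

    pci=0000:18:00.0                                   # address reported in the trace
    pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*)   # same sysfs glob common.sh uses
    pci_net_devs=("${pci_net_devs[@]##*/}")            # strip the path, keep interface names
    echo "Found net devices under $pci: ${pci_net_devs[*]}"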
00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:16:40.537 Found net devices under 0000:18:00.1: mlx_0_1 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@249 -- # get_rdma_if_list 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@75 -- # rdma_devs=() 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@89 -- # continue 2 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:40.537 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@89 -- # continue 2 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@258 -- # is_hw=yes 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:16:40.538 15:21:42 
nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@61 -- # uname 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@65 -- # modprobe ib_cm 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@66 -- # modprobe ib_core 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@67 -- # modprobe ib_umad 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@69 -- # modprobe iw_cm 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@27 -- # local -gA dev_map 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@28 -- # local -g _dev 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@44 -- # ips=() 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@58 -- # key_initiator=target1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer 
-- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@11 -- # local val=167772161 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:16:40.538 10.0.0.1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@11 -- # local val=167772162 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:16:40.538 10.0.0.2 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- 
nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@38 -- # ping_ips 1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@107 -- # local dev=target0 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:40.538 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:40.798 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:16:40.798 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:16:40.798 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:16:40.798 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:16:40.798 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:40.798 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:40.798 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:16:40.798 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:16:40.798 00:16:40.799 --- 10.0.0.2 ping statistics --- 00:16:40.799 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:40.799 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@107 -- # local dev=target0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:40.799 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:16:40.799 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:16:40.799 00:16:40.799 --- 10.0.0.2 ping statistics --- 00:16:40.799 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:40.799 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@98 -- # (( pair++ )) 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@266 -- # return 0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@107 -- # local dev=target0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@107 -- # local dev=target1 00:16:40.799 
15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@107 -- # local dev=target0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@107 -- # local dev=target1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:40.799 15:21:42 
nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:16:40.799 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@12 -- # nvmfappstart -m 0xF 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@324 -- # nvmfpid=1811942 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@325 -- # waitforlisten 1811942 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@831 -- # '[' -z 1811942 ']' 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:40.800 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:40.800 15:21:42 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:16:40.800 [2024-09-27 15:21:42.587391] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
00:16:40.800 [2024-09-27 15:21:42.587456] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:41.058 [2024-09-27 15:21:42.674536] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:16:41.058 [2024-09-27 15:21:42.761278] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:16:41.058 [2024-09-27 15:21:42.761318] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:16:41.058 [2024-09-27 15:21:42.761328] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:16:41.058 [2024-09-27 15:21:42.761337] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:16:41.059 [2024-09-27 15:21:42.761350] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:16:41.059 [2024-09-27 15:21:42.761462] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:16:41.059 [2024-09-27 15:21:42.761565] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:16:41.059 [2024-09-27 15:21:42.761667] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:41.059 [2024-09-27 15:21:42.761668] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:16:41.638 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:41.638 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@864 -- # return 0 00:16:41.638 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:16:41.638 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:41.638 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@14 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 [2024-09-27 15:21:43.512365] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1d984a0/0x1d9c990) succeed. 00:16:41.950 [2024-09-27 15:21:43.522889] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1d99ae0/0x1dde030) succeed. 
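(Aside, not part of the original trace: the rpc_cmd nvmf_create_transport call above and the bdev/subsystem calls that follow assemble the RDMA target the AER test connects to. Reproduced outside the harness, the same sequence would look roughly like the sketch below; it assumes scripts/rpc.py talking to the default /var/tmp/spdk.sock rather than the suite's rpc_cmd wrapper, with flags copied from the logged invocations.)

    # Sketch only - method names and flags taken from the rpc_cmd calls in this log.
    ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
    ./scripts/rpc.py bdev_malloc_create 64 512 --name Malloc0
    ./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 2
    ./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
    ./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420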
00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@16 -- # rpc_cmd bdev_malloc_create 64 512 --name Malloc0 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 Malloc0 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@17 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -m 2 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@18 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@19 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 [2024-09-27 15:21:43.700406] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@21 -- # rpc_cmd nvmf_get_subsystems 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:41.950 [ 00:16:41.950 { 00:16:41.950 "nqn": "nqn.2014-08.org.nvmexpress.discovery", 00:16:41.950 "subtype": "Discovery", 00:16:41.950 "listen_addresses": [], 00:16:41.950 "allow_any_host": true, 00:16:41.950 "hosts": [] 00:16:41.950 }, 00:16:41.950 { 00:16:41.950 "nqn": "nqn.2016-06.io.spdk:cnode1", 00:16:41.950 "subtype": "NVMe", 00:16:41.950 "listen_addresses": [ 00:16:41.950 { 00:16:41.950 "trtype": "RDMA", 00:16:41.950 "adrfam": "IPv4", 00:16:41.950 "traddr": "10.0.0.2", 00:16:41.950 "trsvcid": "4420" 00:16:41.950 } 00:16:41.950 ], 00:16:41.950 "allow_any_host": true, 00:16:41.950 "hosts": [], 00:16:41.950 "serial_number": "SPDK00000000000001", 00:16:41.950 "model_number": "SPDK bdev Controller", 00:16:41.950 "max_namespaces": 2, 00:16:41.950 "min_cntlid": 1, 00:16:41.950 "max_cntlid": 65519, 00:16:41.950 "namespaces": [ 00:16:41.950 { 00:16:41.950 "nsid": 1, 00:16:41.950 "bdev_name": "Malloc0", 00:16:41.950 "name": "Malloc0", 00:16:41.950 "nguid": "C9ECFDA207D34353B5A9ADD7F6253069", 00:16:41.950 "uuid": "c9ecfda2-07d3-4353-b5a9-add7f6253069" 00:16:41.950 } 00:16:41.950 ] 00:16:41.950 } 00:16:41.950 ] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@23 -- # AER_TOUCH_FILE=/tmp/aer_touch_file 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@24 -- # rm -f /tmp/aer_touch_file 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@33 -- # aerpid=1812120 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvme/aer/aer -r ' trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:cnode1' -n 2 -t /tmp/aer_touch_file 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@36 -- # waitforfile /tmp/aer_touch_file 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1265 -- # local i=0 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1266 -- # '[' '!' -e /tmp/aer_touch_file ']' 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1267 -- # '[' 0 -lt 200 ']' 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1268 -- # i=1 00:16:41.950 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1269 -- # sleep 0.1 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1266 -- # '[' '!' -e /tmp/aer_touch_file ']' 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1267 -- # '[' 1 -lt 200 ']' 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1268 -- # i=2 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1269 -- # sleep 0.1 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1266 -- # '[' '!' -e /tmp/aer_touch_file ']' 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1272 -- # '[' '!' 
-e /tmp/aer_touch_file ']' 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1276 -- # return 0 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@39 -- # rpc_cmd bdev_malloc_create 64 4096 --name Malloc1 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.247 Malloc1 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@40 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 -n 2 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@41 -- # rpc_cmd nvmf_get_subsystems 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:42.247 15:21:43 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.247 [ 00:16:42.247 { 00:16:42.247 "nqn": "nqn.2014-08.org.nvmexpress.discovery", 00:16:42.247 "subtype": "Discovery", 00:16:42.247 "listen_addresses": [], 00:16:42.247 "allow_any_host": true, 00:16:42.247 "hosts": [] 00:16:42.247 }, 00:16:42.247 { 00:16:42.247 "nqn": "nqn.2016-06.io.spdk:cnode1", 00:16:42.247 "subtype": "NVMe", 00:16:42.247 "listen_addresses": [ 00:16:42.247 { 00:16:42.247 "trtype": "RDMA", 00:16:42.247 "adrfam": "IPv4", 00:16:42.247 "traddr": "10.0.0.2", 00:16:42.247 "trsvcid": "4420" 00:16:42.247 } 00:16:42.247 ], 00:16:42.247 "allow_any_host": true, 00:16:42.247 "hosts": [], 00:16:42.247 "serial_number": "SPDK00000000000001", 00:16:42.247 "model_number": "SPDK bdev Controller", 00:16:42.247 "max_namespaces": 2, 00:16:42.247 "min_cntlid": 1, 00:16:42.247 "max_cntlid": 65519, 00:16:42.247 "namespaces": [ 00:16:42.247 { 00:16:42.247 "nsid": 1, 00:16:42.247 "bdev_name": "Malloc0", 00:16:42.247 "name": "Malloc0", 00:16:42.247 "nguid": "C9ECFDA207D34353B5A9ADD7F6253069", 00:16:42.247 "uuid": "c9ecfda2-07d3-4353-b5a9-add7f6253069" 00:16:42.247 }, 00:16:42.247 { 00:16:42.247 "nsid": 2, 00:16:42.247 "bdev_name": "Malloc1", 00:16:42.247 "name": "Malloc1", 00:16:42.247 "nguid": "CC29DD0714C24191B0361839802275C4", 00:16:42.247 "uuid": "cc29dd07-14c2-4191-b036-1839802275c4" 00:16:42.247 } 00:16:42.247 ] 00:16:42.247 } 00:16:42.247 ] 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@43 -- # wait 1812120 00:16:42.247 Asynchronous Event Request test 00:16:42.247 Attaching to 10.0.0.2 00:16:42.247 Attached to 10.0.0.2 00:16:42.247 Registering asynchronous event callbacks... 00:16:42.247 Starting namespace attribute notice tests for all controllers... 00:16:42.247 10.0.0.2: aer_cb for log page 4, aen_event_type: 0x02, aen_event_info: 0x00 00:16:42.247 aer_cb - Changed Namespace 00:16:42.247 Cleaning up... 
00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@45 -- # rpc_cmd bdev_malloc_delete Malloc0 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@46 -- # rpc_cmd bdev_malloc_delete Malloc1 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:42.247 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@47 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@49 -- # trap - SIGINT SIGTERM EXIT 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- host/aer.sh@51 -- # nvmftestfini 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@331 -- # nvmfcleanup 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@99 -- # sync 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@102 -- # set +e 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@103 -- # for i in {1..20} 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:16:42.506 rmmod nvme_rdma 00:16:42.506 rmmod nvme_fabrics 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@106 -- # set -e 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@107 -- # return 0 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@332 -- # '[' -n 1811942 ']' 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@333 -- # killprocess 1811942 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@950 -- # '[' -z 1811942 ']' 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@954 -- # kill -0 1811942 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@955 -- # uname 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1811942 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1811942' 00:16:42.506 killing process 
with pid 1811942 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@969 -- # kill 1811942 00:16:42.506 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@974 -- # wait 1811942 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@338 -- # nvmf_fini 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@264 -- # local dev 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@267 -- # remove_target_ns 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@268 -- # delete_main_bridge 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@130 -- # return 0 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@41 -- # _dev=0 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@41 -- # dev_map=() 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/setup.sh@284 -- # iptr 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@538 -- # iptables-save 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- nvmf/common.sh@538 -- # iptables-restore 00:16:42.766 00:16:42.766 real 0m9.189s 00:16:42.766 user 
0m8.843s 00:16:42.766 sys 0m5.966s 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_aer -- common/autotest_common.sh@10 -- # set +x 00:16:42.766 ************************************ 00:16:42.766 END TEST nvmf_aer 00:16:42.766 ************************************ 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@17 -- # run_test nvmf_async_init /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/async_init.sh --transport=rdma 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:16:42.766 ************************************ 00:16:42.766 START TEST nvmf_async_init 00:16:42.766 ************************************ 00:16:42.766 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/async_init.sh --transport=rdma 00:16:43.027 * Looking for test storage... 00:16:43.027 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1681 -- # lcov --version 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@333 -- # local ver1 ver1_l 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@334 -- # local ver2 ver2_l 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@336 -- # IFS=.-: 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@336 -- # read -ra ver1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@337 -- # IFS=.-: 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@337 -- # read -ra ver2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@338 -- # local 'op=<' 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@340 -- # ver1_l=2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@341 -- # ver2_l=1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@344 -- # case "$op" in 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@345 -- # : 1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@364 -- # (( v = 0 )) 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@365 -- # decimal 1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@353 -- # local d=1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@355 -- # echo 1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@365 -- # ver1[v]=1 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@366 -- # decimal 2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@353 -- # local d=2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@355 -- # echo 2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@366 -- # ver2[v]=2 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@368 -- # return 0 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:16:43.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:43.027 --rc genhtml_branch_coverage=1 00:16:43.027 --rc genhtml_function_coverage=1 00:16:43.027 --rc genhtml_legend=1 00:16:43.027 --rc geninfo_all_blocks=1 00:16:43.027 --rc geninfo_unexecuted_blocks=1 00:16:43.027 00:16:43.027 ' 00:16:43.027 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:16:43.027 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:43.027 --rc genhtml_branch_coverage=1 00:16:43.027 --rc genhtml_function_coverage=1 00:16:43.027 --rc genhtml_legend=1 00:16:43.027 --rc geninfo_all_blocks=1 00:16:43.027 --rc geninfo_unexecuted_blocks=1 00:16:43.027 00:16:43.027 ' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:16:43.028 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:43.028 --rc genhtml_branch_coverage=1 00:16:43.028 --rc genhtml_function_coverage=1 00:16:43.028 --rc genhtml_legend=1 00:16:43.028 --rc geninfo_all_blocks=1 00:16:43.028 --rc geninfo_unexecuted_blocks=1 00:16:43.028 00:16:43.028 ' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:16:43.028 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:43.028 --rc genhtml_branch_coverage=1 00:16:43.028 --rc genhtml_function_coverage=1 00:16:43.028 --rc genhtml_legend=1 00:16:43.028 --rc geninfo_all_blocks=1 00:16:43.028 --rc geninfo_unexecuted_blocks=1 00:16:43.028 00:16:43.028 ' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@11 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@7 -- # uname -s 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 
00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@15 -- # shopt -s extglob 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- 
paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- paths/export.sh@5 -- # export PATH 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@50 -- # : 0 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:16:43.028 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@54 -- # have_pci_nics=0 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@13 -- # null_bdev_size=1024 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@14 -- # null_block_size=512 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@15 -- # null_bdev=null0 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@16 -- # nvme_bdev=nvme0 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- 
host/async_init.sh@20 -- # uuidgen 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@20 -- # tr -d - 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@20 -- # nguid=1098dfd604574f3382951fcfa0876696 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@22 -- # nvmftestinit 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@292 -- # prepare_net_devs 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@254 -- # local -g is_hw=no 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@256 -- # remove_target_ns 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@125 -- # xtrace_disable 00:16:43.028 15:21:44 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@131 -- # pci_devs=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@131 -- # local -a pci_devs 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@132 -- # pci_net_devs=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@133 -- # pci_drivers=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@133 -- # local -A pci_drivers 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@135 -- # net_devs=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@135 -- # local -ga net_devs 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@136 -- # e810=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@136 -- # local -ga e810 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@137 -- # x722=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@137 -- # local -ga x722 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@138 -- # mlx=() 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@138 -- # local -ga mlx 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@144 -- # 
x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:16:51.159 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:16:51.159 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:16:51.159 15:21:51 
nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:51.159 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:16:51.160 Found net devices under 0000:18:00.0: mlx_0_0 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:16:51.160 Found net devices under 0000:18:00.1: mlx_0_1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@249 -- # get_rdma_if_list 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@75 -- # rdma_devs=() 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:16:51.160 
15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@89 -- # continue 2 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@89 -- # continue 2 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@258 -- # is_hw=yes 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@61 -- # uname 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@65 -- # modprobe ib_cm 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@66 -- # modprobe ib_core 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@67 -- # modprobe ib_umad 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@69 -- # modprobe iw_cm 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@27 -- # local -gA dev_map 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@28 -- # local -g _dev 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:16:51.160 15:21:51 
nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@44 -- # ips=() 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@58 -- # key_initiator=target1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@11 -- # local val=167772161 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:16:51.160 10.0.0.1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:16:51.160 15:21:51 
nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@11 -- # local val=167772162 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:16:51.160 10.0.0.2 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:16:51.160 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@38 -- # ping_ips 1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@179 -- # get_ip_address 
target0 '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@107 -- # local dev=target0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:51.161 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:16:51.161 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.035 ms 00:16:51.161 00:16:51.161 --- 10.0.0.2 ping statistics --- 00:16:51.161 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:51.161 rtt min/avg/max/mdev = 0.035/0.035/0.035/0.000 ms 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@107 -- # local dev=target0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:16:51.161 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:16:51.161 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.024 ms 00:16:51.161 00:16:51.161 --- 10.0.0.2 ping statistics --- 00:16:51.161 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:16:51.161 rtt min/avg/max/mdev = 0.024/0.024/0.024/0.000 ms 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@98 -- # (( pair++ )) 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@266 -- # return 0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@107 -- # local dev=target0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@166 -- # [[ -n 
'' ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@107 -- # local dev=target1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:16:51.161 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # get_net_dev target0 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@107 -- # local dev=target0 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:16:51.162 15:21:51 
nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # get_net_dev target1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@107 -- # local dev=target1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@23 -- # nvmfappstart -m 0x1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@724 -- # xtrace_disable 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@324 -- # nvmfpid=1815082 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x1 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@325 -- # waitforlisten 1815082 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@831 -- # '[' -z 1815082 ']' 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:51.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
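nvmfappstart above backgrounds the target binary and then blocks in waitforlisten until the RPC socket answers. A hedged sketch of that start-and-wait pattern, assuming the same binary and the default /var/tmp/spdk.sock socket (the polling loop and the rootdir variable are illustrative, not the helper's exact code):

"$rootdir/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x1 &
nvmfpid=$!
# poll the RPC socket the way waitforlisten does, bailing out if the app dies
until "$rootdir/scripts/rpc.py" -s /var/tmp/spdk.sock rpc_get_methods &> /dev/null; do
    kill -0 "$nvmfpid" 2> /dev/null || { echo "nvmf_tgt exited early" >&2; exit 1; }
    sleep 0.2
done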
00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:51.162 15:21:51 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 [2024-09-27 15:21:51.897801] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:16:51.162 [2024-09-27 15:21:51.897866] [ DPDK EAL parameters: nvmf -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:51.162 [2024-09-27 15:21:51.983705] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:51.162 [2024-09-27 15:21:52.073981] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:16:51.162 [2024-09-27 15:21:52.074022] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:16:51.162 [2024-09-27 15:21:52.074032] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:16:51.162 [2024-09-27 15:21:52.074041] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:16:51.162 [2024-09-27 15:21:52.074048] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:16:51.162 [2024-09-27 15:21:52.074072] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@864 -- # return 0 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@730 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@26 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 [2024-09-27 15:21:52.814138] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x93b2a0/0x93f790) succeed. 00:16:51.162 [2024-09-27 15:21:52.822888] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x93c7a0/0x980e30) succeed. 
00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@27 -- # rpc_cmd bdev_null_create null0 1024 512 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 null0 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@28 -- # rpc_cmd bdev_wait_for_examine 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@29 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@30 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 null0 -g 1098dfd604574f3382951fcfa0876696 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@31 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4420 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 [2024-09-27 15:21:52.920085] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@37 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -a 10.0.0.2 -f ipv4 -s 4420 -n nqn.2016-06.io.spdk:cnode0 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 nvme0n1 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@41 -- # rpc_cmd bdev_get_bdevs -b nvme0n1 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.162 15:21:52 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.162 [ 00:16:51.162 { 
00:16:51.423 "name": "nvme0n1", 00:16:51.423 "aliases": [ 00:16:51.423 "1098dfd6-0457-4f33-8295-1fcfa0876696" 00:16:51.423 ], 00:16:51.423 "product_name": "NVMe disk", 00:16:51.423 "block_size": 512, 00:16:51.423 "num_blocks": 2097152, 00:16:51.423 "uuid": "1098dfd6-0457-4f33-8295-1fcfa0876696", 00:16:51.423 "numa_id": 0, 00:16:51.423 "assigned_rate_limits": { 00:16:51.423 "rw_ios_per_sec": 0, 00:16:51.423 "rw_mbytes_per_sec": 0, 00:16:51.423 "r_mbytes_per_sec": 0, 00:16:51.423 "w_mbytes_per_sec": 0 00:16:51.423 }, 00:16:51.423 "claimed": false, 00:16:51.423 "zoned": false, 00:16:51.423 "supported_io_types": { 00:16:51.423 "read": true, 00:16:51.423 "write": true, 00:16:51.423 "unmap": false, 00:16:51.423 "flush": true, 00:16:51.423 "reset": true, 00:16:51.423 "nvme_admin": true, 00:16:51.423 "nvme_io": true, 00:16:51.423 "nvme_io_md": false, 00:16:51.423 "write_zeroes": true, 00:16:51.423 "zcopy": false, 00:16:51.423 "get_zone_info": false, 00:16:51.423 "zone_management": false, 00:16:51.423 "zone_append": false, 00:16:51.423 "compare": true, 00:16:51.423 "compare_and_write": true, 00:16:51.423 "abort": true, 00:16:51.423 "seek_hole": false, 00:16:51.423 "seek_data": false, 00:16:51.423 "copy": true, 00:16:51.423 "nvme_iov_md": false 00:16:51.423 }, 00:16:51.423 "memory_domains": [ 00:16:51.423 { 00:16:51.423 "dma_device_id": "SPDK_RDMA_DMA_DEVICE", 00:16:51.423 "dma_device_type": 0 00:16:51.423 } 00:16:51.423 ], 00:16:51.423 "driver_specific": { 00:16:51.423 "nvme": [ 00:16:51.423 { 00:16:51.423 "trid": { 00:16:51.423 "trtype": "RDMA", 00:16:51.423 "adrfam": "IPv4", 00:16:51.423 "traddr": "10.0.0.2", 00:16:51.423 "trsvcid": "4420", 00:16:51.423 "subnqn": "nqn.2016-06.io.spdk:cnode0" 00:16:51.423 }, 00:16:51.423 "ctrlr_data": { 00:16:51.423 "cntlid": 1, 00:16:51.423 "vendor_id": "0x8086", 00:16:51.423 "model_number": "SPDK bdev Controller", 00:16:51.423 "serial_number": "00000000000000000000", 00:16:51.423 "firmware_revision": "25.01", 00:16:51.423 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:16:51.423 "oacs": { 00:16:51.423 "security": 0, 00:16:51.423 "format": 0, 00:16:51.423 "firmware": 0, 00:16:51.423 "ns_manage": 0 00:16:51.423 }, 00:16:51.423 "multi_ctrlr": true, 00:16:51.423 "ana_reporting": false 00:16:51.423 }, 00:16:51.423 "vs": { 00:16:51.423 "nvme_version": "1.3" 00:16:51.423 }, 00:16:51.423 "ns_data": { 00:16:51.423 "id": 1, 00:16:51.423 "can_share": true 00:16:51.423 } 00:16:51.423 } 00:16:51.423 ], 00:16:51.423 "mp_policy": "active_passive" 00:16:51.423 } 00:16:51.423 } 00:16:51.423 ] 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@44 -- # rpc_cmd bdev_nvme_reset_controller nvme0 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.423 [2024-09-27 15:21:53.029677] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode0] resetting controller 00:16:51.423 [2024-09-27 15:21:53.047601] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:16:51.423 [2024-09-27 15:21:53.077376] bdev_nvme.c:2183:_bdev_nvme_reset_ctrlr_complete: *NOTICE*: Resetting controller successful. 
00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@47 -- # rpc_cmd bdev_get_bdevs -b nvme0n1 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.423 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.423 [ 00:16:51.423 { 00:16:51.423 "name": "nvme0n1", 00:16:51.423 "aliases": [ 00:16:51.423 "1098dfd6-0457-4f33-8295-1fcfa0876696" 00:16:51.423 ], 00:16:51.423 "product_name": "NVMe disk", 00:16:51.423 "block_size": 512, 00:16:51.423 "num_blocks": 2097152, 00:16:51.423 "uuid": "1098dfd6-0457-4f33-8295-1fcfa0876696", 00:16:51.423 "numa_id": 0, 00:16:51.423 "assigned_rate_limits": { 00:16:51.423 "rw_ios_per_sec": 0, 00:16:51.423 "rw_mbytes_per_sec": 0, 00:16:51.423 "r_mbytes_per_sec": 0, 00:16:51.423 "w_mbytes_per_sec": 0 00:16:51.423 }, 00:16:51.423 "claimed": false, 00:16:51.423 "zoned": false, 00:16:51.423 "supported_io_types": { 00:16:51.423 "read": true, 00:16:51.423 "write": true, 00:16:51.423 "unmap": false, 00:16:51.423 "flush": true, 00:16:51.423 "reset": true, 00:16:51.423 "nvme_admin": true, 00:16:51.423 "nvme_io": true, 00:16:51.423 "nvme_io_md": false, 00:16:51.423 "write_zeroes": true, 00:16:51.423 "zcopy": false, 00:16:51.423 "get_zone_info": false, 00:16:51.423 "zone_management": false, 00:16:51.423 "zone_append": false, 00:16:51.423 "compare": true, 00:16:51.423 "compare_and_write": true, 00:16:51.423 "abort": true, 00:16:51.423 "seek_hole": false, 00:16:51.423 "seek_data": false, 00:16:51.423 "copy": true, 00:16:51.423 "nvme_iov_md": false 00:16:51.423 }, 00:16:51.423 "memory_domains": [ 00:16:51.423 { 00:16:51.423 "dma_device_id": "SPDK_RDMA_DMA_DEVICE", 00:16:51.423 "dma_device_type": 0 00:16:51.423 } 00:16:51.423 ], 00:16:51.423 "driver_specific": { 00:16:51.423 "nvme": [ 00:16:51.423 { 00:16:51.423 "trid": { 00:16:51.423 "trtype": "RDMA", 00:16:51.423 "adrfam": "IPv4", 00:16:51.423 "traddr": "10.0.0.2", 00:16:51.423 "trsvcid": "4420", 00:16:51.423 "subnqn": "nqn.2016-06.io.spdk:cnode0" 00:16:51.423 }, 00:16:51.423 "ctrlr_data": { 00:16:51.423 "cntlid": 2, 00:16:51.423 "vendor_id": "0x8086", 00:16:51.423 "model_number": "SPDK bdev Controller", 00:16:51.423 "serial_number": "00000000000000000000", 00:16:51.423 "firmware_revision": "25.01", 00:16:51.423 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:16:51.423 "oacs": { 00:16:51.423 "security": 0, 00:16:51.423 "format": 0, 00:16:51.423 "firmware": 0, 00:16:51.424 "ns_manage": 0 00:16:51.424 }, 00:16:51.424 "multi_ctrlr": true, 00:16:51.424 "ana_reporting": false 00:16:51.424 }, 00:16:51.424 "vs": { 00:16:51.424 "nvme_version": "1.3" 00:16:51.424 }, 00:16:51.424 "ns_data": { 00:16:51.424 "id": 1, 00:16:51.424 "can_share": true 00:16:51.424 } 00:16:51.424 } 00:16:51.424 ], 00:16:51.424 "mp_policy": "active_passive" 00:16:51.424 } 00:16:51.424 } 00:16:51.424 ] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@50 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@53 -- # mktemp 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@53 -- # key_path=/tmp/tmp.q1GYJNnotQ 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@54 -- # echo -n NVMeTLSkey-1:01:MDAxMTIyMzM0NDU1NjY3Nzg4OTlhYWJiY2NkZGVlZmZwJEiQ: 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@55 -- # chmod 0600 /tmp/tmp.q1GYJNnotQ 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@56 -- # rpc_cmd keyring_file_add_key key0 /tmp/tmp.q1GYJNnotQ 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@57 -- # rpc_cmd nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode0 --disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@58 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 -t rdma -a 10.0.0.2 -s 4421 --secure-channel 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.424 [2024-09-27 15:21:53.176815] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4421 *** 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@60 -- # rpc_cmd nvmf_subsystem_add_host nqn.2016-06.io.spdk:cnode0 nqn.2016-06.io.spdk:host1 --psk key0 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@66 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -a 10.0.0.2 -f ipv4 -s 4421 -n nqn.2016-06.io.spdk:cnode0 -q nqn.2016-06.io.spdk:host1 --psk key0 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.424 [2024-09-27 15:21:53.200878] bdev_nvme_rpc.c: 517:rpc_bdev_nvme_attach_controller: *NOTICE*: TLS support is considered experimental 00:16:51.424 nvme0n1 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@70 -- # rpc_cmd bdev_get_bdevs -b nvme0n1 00:16:51.424 15:21:53 
nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.424 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.685 [ 00:16:51.685 { 00:16:51.685 "name": "nvme0n1", 00:16:51.685 "aliases": [ 00:16:51.685 "1098dfd6-0457-4f33-8295-1fcfa0876696" 00:16:51.685 ], 00:16:51.685 "product_name": "NVMe disk", 00:16:51.685 "block_size": 512, 00:16:51.685 "num_blocks": 2097152, 00:16:51.685 "uuid": "1098dfd6-0457-4f33-8295-1fcfa0876696", 00:16:51.685 "numa_id": 0, 00:16:51.685 "assigned_rate_limits": { 00:16:51.685 "rw_ios_per_sec": 0, 00:16:51.685 "rw_mbytes_per_sec": 0, 00:16:51.685 "r_mbytes_per_sec": 0, 00:16:51.685 "w_mbytes_per_sec": 0 00:16:51.685 }, 00:16:51.685 "claimed": false, 00:16:51.685 "zoned": false, 00:16:51.685 "supported_io_types": { 00:16:51.685 "read": true, 00:16:51.685 "write": true, 00:16:51.685 "unmap": false, 00:16:51.685 "flush": true, 00:16:51.685 "reset": true, 00:16:51.685 "nvme_admin": true, 00:16:51.685 "nvme_io": true, 00:16:51.685 "nvme_io_md": false, 00:16:51.685 "write_zeroes": true, 00:16:51.685 "zcopy": false, 00:16:51.685 "get_zone_info": false, 00:16:51.685 "zone_management": false, 00:16:51.685 "zone_append": false, 00:16:51.685 "compare": true, 00:16:51.685 "compare_and_write": true, 00:16:51.685 "abort": true, 00:16:51.685 "seek_hole": false, 00:16:51.685 "seek_data": false, 00:16:51.685 "copy": true, 00:16:51.685 "nvme_iov_md": false 00:16:51.685 }, 00:16:51.685 "memory_domains": [ 00:16:51.685 { 00:16:51.685 "dma_device_id": "SPDK_RDMA_DMA_DEVICE", 00:16:51.685 "dma_device_type": 0 00:16:51.685 } 00:16:51.685 ], 00:16:51.685 "driver_specific": { 00:16:51.685 "nvme": [ 00:16:51.685 { 00:16:51.685 "trid": { 00:16:51.685 "trtype": "RDMA", 00:16:51.685 "adrfam": "IPv4", 00:16:51.685 "traddr": "10.0.0.2", 00:16:51.685 "trsvcid": "4421", 00:16:51.685 "subnqn": "nqn.2016-06.io.spdk:cnode0" 00:16:51.685 }, 00:16:51.685 "ctrlr_data": { 00:16:51.685 "cntlid": 3, 00:16:51.685 "vendor_id": "0x8086", 00:16:51.685 "model_number": "SPDK bdev Controller", 00:16:51.685 "serial_number": "00000000000000000000", 00:16:51.685 "firmware_revision": "25.01", 00:16:51.685 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:16:51.685 "oacs": { 00:16:51.685 "security": 0, 00:16:51.685 "format": 0, 00:16:51.685 "firmware": 0, 00:16:51.685 "ns_manage": 0 00:16:51.685 }, 00:16:51.685 "multi_ctrlr": true, 00:16:51.685 "ana_reporting": false 00:16:51.685 }, 00:16:51.685 "vs": { 00:16:51.685 "nvme_version": "1.3" 00:16:51.685 }, 00:16:51.685 "ns_data": { 00:16:51.685 "id": 1, 00:16:51.685 "can_share": true 00:16:51.685 } 00:16:51.685 } 00:16:51.685 ], 00:16:51.685 "mp_policy": "active_passive" 00:16:51.685 } 00:16:51.685 } 00:16:51.685 ] 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@73 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@561 -- # xtrace_disable 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@76 -- # rm -f /tmp/tmp.q1GYJNnotQ 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@78 -- # trap - SIGINT SIGTERM EXIT 
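[annotation] The trace above is async_init's TLS path: a PSK is written to a temp file, registered with keyring_file_add_key, the subsystem gets a --secure-channel listener on port 4421, and the host re-attaches with --psk before detaching and removing the key. A minimal standalone sketch of that RPC sequence, assuming a running SPDK target and scripts/rpc.py from an SPDK checkout on PATH (the key value is the interchange-format test key shown in the log; key0, cnode0 and host1 follow the names used in the trace; this is not the test script itself):

  #!/usr/bin/env bash
  set -e
  RPC=scripts/rpc.py                       # assumption: rpc.py from an SPDK checkout

  key_path=$(mktemp)                       # host/async_init.sh@53 equivalent
  echo -n NVMeTLSkey-1:01:MDAxMTIyMzM0NDU1NjY3Nzg4OTlhYWJiY2NkZGVlZmZwJEiQ: > "$key_path"
  chmod 0600 "$key_path"                   # PSK files must not be world-readable

  $RPC keyring_file_add_key key0 "$key_path"
  $RPC nvmf_subsystem_allow_any_host nqn.2016-06.io.spdk:cnode0 --disable
  $RPC nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 \
       -t rdma -a 10.0.0.2 -s 4421 --secure-channel        # TLS-only listener
  $RPC nvmf_subsystem_add_host nqn.2016-06.io.spdk:cnode0 \
       nqn.2016-06.io.spdk:host1 --psk key0                 # allow host1 with this key

  # Host side: attach through the secure listener, inspect, then clean up.
  $RPC bdev_nvme_attach_controller -b nvme0 -t rdma -a 10.0.0.2 -f ipv4 \
       -s 4421 -n nqn.2016-06.io.spdk:cnode0 -q nqn.2016-06.io.spdk:host1 --psk key0
  $RPC bdev_get_bdevs -b nvme0n1           # trace shows cntlid 3 on trsvcid 4421
  $RPC bdev_nvme_detach_controller nvme0
  rm -f "$key_path"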
00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- host/async_init.sh@79 -- # nvmftestfini 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@331 -- # nvmfcleanup 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@99 -- # sync 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@102 -- # set +e 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@103 -- # for i in {1..20} 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:16:51.685 rmmod nvme_rdma 00:16:51.685 rmmod nvme_fabrics 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@106 -- # set -e 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@107 -- # return 0 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@332 -- # '[' -n 1815082 ']' 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@333 -- # killprocess 1815082 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@950 -- # '[' -z 1815082 ']' 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@954 -- # kill -0 1815082 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@955 -- # uname 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1815082 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1815082' 00:16:51.685 killing process with pid 1815082 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@969 -- # kill 1815082 00:16:51.685 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@974 -- # wait 1815082 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@338 -- # nvmf_fini 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@264 -- # local dev 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@267 -- # remove_target_ns 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@268 -- # delete_main_bridge 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@130 -- # [[ -e 
/sys/class/net/nvmf_br/address ]] 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@130 -- # return 0 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@41 -- # _dev=0 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@41 -- # dev_map=() 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/setup.sh@284 -- # iptr 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@538 -- # iptables-save 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- nvmf/common.sh@538 -- # iptables-restore 00:16:51.945 00:16:51.945 real 0m9.109s 00:16:51.945 user 0m4.128s 00:16:51.945 sys 0m5.752s 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host.nvmf_async_init -- common/autotest_common.sh@10 -- # set +x 00:16:51.945 ************************************ 00:16:51.945 END TEST nvmf_async_init 00:16:51.945 ************************************ 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@20 -- # run_test nvmf_identify /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/identify.sh --transport=rdma 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:51.945 15:21:53 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:16:52.205 ************************************ 00:16:52.205 START TEST nvmf_identify 00:16:52.205 
************************************ 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/identify.sh --transport=rdma 00:16:52.205 * Looking for test storage... 00:16:52.205 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1681 -- # lcov --version 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@333 -- # local ver1 ver1_l 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@334 -- # local ver2 ver2_l 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@336 -- # IFS=.-: 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@336 -- # read -ra ver1 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@337 -- # IFS=.-: 00:16:52.205 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@337 -- # read -ra ver2 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@338 -- # local 'op=<' 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@340 -- # ver1_l=2 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@341 -- # ver2_l=1 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@344 -- # case "$op" in 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@345 -- # : 1 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@364 -- # (( v = 0 )) 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@365 -- # decimal 1 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@353 -- # local d=1 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@355 -- # echo 1 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@365 -- # ver1[v]=1 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@366 -- # decimal 2 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@353 -- # local d=2 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@355 -- # echo 2 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@366 -- # ver2[v]=2 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@368 -- # return 0 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:16:52.206 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:52.206 --rc genhtml_branch_coverage=1 00:16:52.206 --rc genhtml_function_coverage=1 00:16:52.206 --rc genhtml_legend=1 00:16:52.206 --rc geninfo_all_blocks=1 00:16:52.206 --rc geninfo_unexecuted_blocks=1 00:16:52.206 00:16:52.206 ' 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:16:52.206 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:52.206 --rc genhtml_branch_coverage=1 00:16:52.206 --rc genhtml_function_coverage=1 00:16:52.206 --rc genhtml_legend=1 00:16:52.206 --rc geninfo_all_blocks=1 00:16:52.206 --rc geninfo_unexecuted_blocks=1 00:16:52.206 00:16:52.206 ' 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:16:52.206 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:52.206 --rc genhtml_branch_coverage=1 00:16:52.206 --rc genhtml_function_coverage=1 00:16:52.206 --rc genhtml_legend=1 00:16:52.206 --rc geninfo_all_blocks=1 00:16:52.206 --rc geninfo_unexecuted_blocks=1 00:16:52.206 00:16:52.206 ' 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:16:52.206 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:52.206 --rc genhtml_branch_coverage=1 00:16:52.206 --rc genhtml_function_coverage=1 00:16:52.206 --rc genhtml_legend=1 00:16:52.206 --rc geninfo_all_blocks=1 00:16:52.206 --rc geninfo_unexecuted_blocks=1 00:16:52.206 00:16:52.206 ' 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@7 -- # uname -s 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:16:52.206 15:21:53 
nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:16:52.206 15:21:53 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@15 -- # shopt -s extglob 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- paths/export.sh@5 -- # export PATH 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@50 -- # : 0 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:16:52.206 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@54 -- # have_pci_nics=0 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@11 -- # MALLOC_BDEV_SIZE=64 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@14 -- # nvmftestinit 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:16:52.206 15:21:54 
nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@292 -- # prepare_net_devs 00:16:52.206 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@254 -- # local -g is_hw=no 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@256 -- # remove_target_ns 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@22 -- # _remove_target_ns 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@125 -- # xtrace_disable 00:16:52.207 15:21:54 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@131 -- # pci_devs=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@131 -- # local -a pci_devs 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@132 -- # pci_net_devs=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@133 -- # pci_drivers=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@133 -- # local -A pci_drivers 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@135 -- # net_devs=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@135 -- # local -ga net_devs 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@136 -- # e810=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@136 -- # local -ga e810 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@137 -- # x722=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@137 -- # local -ga x722 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@138 -- # mlx=() 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@138 -- # local -ga mlx 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@154 -- # 
mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:17:00.336 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:17:00.337 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:17:00.337 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- 
nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:17:00.337 Found net devices under 0000:18:00.0: mlx_0_0 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:17:00.337 Found net devices under 0000:18:00.1: mlx_0_1 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@249 -- # get_rdma_if_list 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@75 -- # rdma_devs=() 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@89 -- # continue 2 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@87 -- # [[ mlx_0_1 == 
\m\l\x\_\0\_\1 ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@89 -- # continue 2 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@258 -- # is_hw=yes 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@61 -- # uname 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@65 -- # modprobe ib_cm 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@66 -- # modprobe ib_core 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@67 -- # modprobe ib_umad 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@69 -- # modprobe iw_cm 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@27 -- # local -gA dev_map 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@28 -- # local -g _dev 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@44 -- # ips=() 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:17:00.337 15:22:00 
nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@58 -- # key_initiator=target1 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@11 -- # local val=167772161 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:17:00.337 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:17:00.338 10.0.0.1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@11 -- # local val=167772162 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 
00:17:00.338 10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@38 -- # ping_ips 1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@107 -- # local dev=target0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:00.338 15:22:00 
nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:00.338 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:00.338 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:17:00.338 00:17:00.338 --- 10.0.0.2 ping statistics --- 00:17:00.338 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:00.338 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@107 -- # local dev=target0 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@92 -- # 
ping -c 1 10.0.0.2 00:17:00.338 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:00.338 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:17:00.338 00:17:00.338 --- 10.0.0.2 ping statistics --- 00:17:00.338 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:00.338 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@98 -- # (( pair++ )) 00:17:00.338 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@266 -- # return 0 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@107 -- # local dev=target0 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- 
nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@107 -- # local dev=target1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:17:00.339 15:22:00 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@107 -- # local dev=target0 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@166 -- # [[ 
-n '' ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@107 -- # local dev=target1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@16 -- # timing_enter start_nvmf_tgt 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@724 -- # xtrace_disable 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@19 -- # nvmfpid=1818377 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@21 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@23 -- # waitforlisten 1818377 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@831 -- # '[' -z 1818377 ']' 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:00.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
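[annotation] At this point identify.sh has launched the target (build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF, pid 1818377 in this run) and is blocked in waitforlisten until the RPC socket answers. A rough stand-in for that step, assuming a built SPDK tree at the workspace path used by this job and using the spdk_get_version RPC as the readiness probe (the real waitforlisten helper in autotest_common.sh is more elaborate):

  # Sketch, not the suite's helper: start nvmf_tgt and wait for /var/tmp/spdk.sock.
  SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk   # path taken from this job's workspace
  "$SPDK"/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
  nvmfpid=$!

  for _ in $(seq 1 100); do
      # spdk_get_version succeeds once the app is up and listening on the RPC socket
      if "$SPDK"/scripts/rpc.py -s /var/tmp/spdk.sock spdk_get_version >/dev/null 2>&1; then
          echo "nvmf_tgt ($nvmfpid) is ready"
          break
      fi
      sleep 0.5
  done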
00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:00.339 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.339 [2024-09-27 15:22:01.124398] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:17:00.339 [2024-09-27 15:22:01.124460] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:00.339 [2024-09-27 15:22:01.212430] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:17:00.339 [2024-09-27 15:22:01.298033] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:17:00.339 [2024-09-27 15:22:01.298080] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:17:00.339 [2024-09-27 15:22:01.298090] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:17:00.339 [2024-09-27 15:22:01.298098] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:17:00.339 [2024-09-27 15:22:01.298106] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:17:00.339 [2024-09-27 15:22:01.298198] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:17:00.339 [2024-09-27 15:22:01.298301] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:17:00.339 [2024-09-27 15:22:01.298406] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:00.339 [2024-09-27 15:22:01.298406] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:17:00.340 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:00.340 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@864 -- # return 0 00:17:00.340 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@24 -- # rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:17:00.340 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.340 15:22:01 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.340 [2024-09-27 15:22:02.003878] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x16ba4a0/0x16be990) succeed. 00:17:00.340 [2024-09-27 15:22:02.015197] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x16bbae0/0x1700030) succeed. 
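The two create_ib_device notices are the target answering the nvmf_create_transport call issued at host/identify.sh@24: an RDMA transport with 1024 shared buffers and an 8192-byte I/O unit size, picking up both mlx5 ports. A sketch of the same step driven directly through SPDK's JSON-RPC client, assuming the stock scripts/rpc.py and the /var/tmp/spdk.sock socket the target is listening on (in this harness the rpc_cmd wrapper is what actually issues it):

  # sketch of: rpc_cmd nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  ./scripts/rpc.py -s /var/tmp/spdk.sock nvmf_create_transport \
      -t rdma \
      --num-shared-buffers 1024 \
      -u 8192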
00:17:00.340 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.340 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@25 -- # timing_exit start_nvmf_tgt 00:17:00.340 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@730 -- # xtrace_disable 00:17:00.340 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@27 -- # rpc_cmd bdev_malloc_create 64 512 -b Malloc0 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 Malloc0 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@28 -- # rpc_cmd nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@31 -- # rpc_cmd nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 --nguid ABCDEF0123456789ABCDEF0123456789 --eui64 ABCDEF0123456789 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@34 -- # rpc_cmd nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 [2024-09-27 15:22:02.234170] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@35 -- # rpc_cmd nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@37 -- # rpc_cmd nvmf_get_subsystems 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.603 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.603 [ 00:17:00.603 { 00:17:00.603 "nqn": "nqn.2014-08.org.nvmexpress.discovery", 00:17:00.603 "subtype": "Discovery", 00:17:00.603 "listen_addresses": [ 00:17:00.603 { 00:17:00.603 "trtype": "RDMA", 00:17:00.603 
"adrfam": "IPv4", 00:17:00.603 "traddr": "10.0.0.2", 00:17:00.603 "trsvcid": "4420" 00:17:00.603 } 00:17:00.603 ], 00:17:00.603 "allow_any_host": true, 00:17:00.603 "hosts": [] 00:17:00.603 }, 00:17:00.603 { 00:17:00.603 "nqn": "nqn.2016-06.io.spdk:cnode1", 00:17:00.603 "subtype": "NVMe", 00:17:00.603 "listen_addresses": [ 00:17:00.603 { 00:17:00.603 "trtype": "RDMA", 00:17:00.603 "adrfam": "IPv4", 00:17:00.603 "traddr": "10.0.0.2", 00:17:00.603 "trsvcid": "4420" 00:17:00.603 } 00:17:00.603 ], 00:17:00.603 "allow_any_host": true, 00:17:00.603 "hosts": [], 00:17:00.603 "serial_number": "SPDK00000000000001", 00:17:00.603 "model_number": "SPDK bdev Controller", 00:17:00.603 "max_namespaces": 32, 00:17:00.603 "min_cntlid": 1, 00:17:00.603 "max_cntlid": 65519, 00:17:00.603 "namespaces": [ 00:17:00.603 { 00:17:00.603 "nsid": 1, 00:17:00.603 "bdev_name": "Malloc0", 00:17:00.603 "name": "Malloc0", 00:17:00.603 "nguid": "ABCDEF0123456789ABCDEF0123456789", 00:17:00.604 "eui64": "ABCDEF0123456789", 00:17:00.604 "uuid": "f61756c0-8c56-4812-a540-a6f138fd37d8" 00:17:00.604 } 00:17:00.604 ] 00:17:00.604 } 00:17:00.604 ] 00:17:00.604 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.604 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@39 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_identify -r ' trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2014-08.org.nvmexpress.discovery' -L all 00:17:00.604 [2024-09-27 15:22:02.297901] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:17:00.604 [2024-09-27 15:22:02.297942] [ DPDK EAL parameters: identify --no-shconf -c 0x1 -n 1 -m 0 --no-pci --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1818574 ] 00:17:00.604 [2024-09-27 15:22:02.345194] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to connect adminq (no timeout) 00:17:00.604 [2024-09-27 15:22:02.345272] nvme_rdma.c:2214:nvme_rdma_ctrlr_construct: *DEBUG*: successfully initialized the nvmf ctrlr 00:17:00.604 [2024-09-27 15:22:02.345291] nvme_rdma.c:1215:nvme_rdma_ctrlr_connect_qpair: *DEBUG*: adrfam 1 ai_family 2 00:17:00.604 [2024-09-27 15:22:02.345296] nvme_rdma.c:1219:nvme_rdma_ctrlr_connect_qpair: *DEBUG*: trsvcid is 4420 00:17:00.604 [2024-09-27 15:22:02.345326] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to wait for connect adminq (no timeout) 00:17:00.604 [2024-09-27 15:22:02.356736] nvme_rdma.c: 431:nvme_rdma_qpair_process_cm_event: *DEBUG*: Requested queue depth 32. Target receive queue depth 32. 
00:17:00.604 [2024-09-27 15:22:02.367016] nvme_rdma.c:1101:nvme_rdma_connect_established: *DEBUG*: rc =0 00:17:00.604 [2024-09-27 15:22:02.367027] nvme_rdma.c:1106:nvme_rdma_connect_established: *DEBUG*: RDMA requests created 00:17:00.604 [2024-09-27 15:22:02.367035] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367043] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367049] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367056] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367062] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367069] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367075] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367081] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367088] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367094] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367101] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367107] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf7b8 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367114] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf7e0 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367120] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf808 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367126] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf830 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367133] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf858 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367139] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf880 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367145] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf8a8 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367154] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf8d0 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367161] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf8f8 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367167] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf920 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367174] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf948 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367180] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf970 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 
15:22:02.367186] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf998 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367193] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf9c0 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367199] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf9e8 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367205] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa10 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367212] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa38 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367218] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa60 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367224] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa88 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367231] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfab0 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367237] nvme_rdma.c:1120:nvme_rdma_connect_established: *DEBUG*: RDMA responses created 00:17:00.604 [2024-09-27 15:22:02.367243] nvme_rdma.c:1123:nvme_rdma_connect_established: *DEBUG*: rc =0 00:17:00.604 [2024-09-27 15:22:02.367248] nvme_rdma.c:1128:nvme_rdma_connect_established: *DEBUG*: RDMA responses submitted 00:17:00.604 [2024-09-27 15:22:02.367266] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.367280] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x2000003cf180 len:0x400 key:0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372348] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.604 [2024-09-27 15:22:02.372357] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:1 sqhd:0001 p:0 m:0 dnr:0 00:17:00.604 [2024-09-27 15:22:02.372366] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372373] nvme_fabric.c: 621:nvme_fabric_qpair_connect_poll: *DEBUG*: CNTLID 0x0001 00:17:00.604 [2024-09-27 15:22:02.372381] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to read vs (no timeout) 00:17:00.604 [2024-09-27 15:22:02.372388] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to read vs wait for vs (no timeout) 00:17:00.604 [2024-09-27 15:22:02.372402] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372411] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.604 [2024-09-27 15:22:02.372441] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.604 [2024-09-27 15:22:02.372447] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:10300 sqhd:0002 p:0 m:0 dnr:0 00:17:00.604 [2024-09-27 15:22:02.372454] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to read cap (no timeout) 00:17:00.604 [2024-09-27 15:22:02.372461] 
nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372467] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to read cap wait for cap (no timeout) 00:17:00.604 [2024-09-27 15:22:02.372477] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372486] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.604 [2024-09-27 15:22:02.372501] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.604 [2024-09-27 15:22:02.372507] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:1e01007f sqhd:0003 p:0 m:0 dnr:0 00:17:00.604 [2024-09-27 15:22:02.372515] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to check en (no timeout) 00:17:00.604 [2024-09-27 15:22:02.372521] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372528] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to check en wait for cc (timeout 15000 ms) 00:17:00.604 [2024-09-27 15:22:02.372536] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372544] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.604 [2024-09-27 15:22:02.372560] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.604 [2024-09-27 15:22:02.372566] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0004 p:0 m:0 dnr:0 00:17:00.604 [2024-09-27 15:22:02.372572] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to disable and wait for CSTS.RDY = 0 (timeout 15000 ms) 00:17:00.604 [2024-09-27 15:22:02.372579] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372587] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.604 [2024-09-27 15:22:02.372595] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.604 [2024-09-27 15:22:02.372616] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.604 [2024-09-27 15:22:02.372622] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0005 p:0 m:0 dnr:0 00:17:00.604 [2024-09-27 15:22:02.372629] nvme_ctrlr.c:3893:nvme_ctrlr_process_init_wait_for_ready_0: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] CC.EN = 0 && CSTS.RDY = 0 00:17:00.604 [2024-09-27 15:22:02.372635] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to controller is disabled (timeout 15000 ms) 00:17:00.605 [2024-09-27 15:22:02.372641] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372648] 
nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to enable controller by writing CC.EN = 1 (timeout 15000 ms) 00:17:00.605 [2024-09-27 15:22:02.372754] nvme_ctrlr.c:4091:nvme_ctrlr_process_init: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] Setting CC.EN = 1 00:17:00.605 [2024-09-27 15:22:02.372760] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to enable controller by writing CC.EN = 1 reg (timeout 15000 ms) 00:17:00.605 [2024-09-27 15:22:02.372770] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372778] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY SET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.605 [2024-09-27 15:22:02.372794] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.372799] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0006 p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.372808] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to wait for CSTS.RDY = 1 (timeout 15000 ms) 00:17:00.605 [2024-09-27 15:22:02.372814] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372822] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372830] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.605 [2024-09-27 15:22:02.372852] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.372857] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:1 sqhd:0007 p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.372864] nvme_ctrlr.c:3928:nvme_ctrlr_process_init_enable_wait_for_ready_1: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] CC.EN = 1 && CSTS.RDY = 1 - controller is ready 00:17:00.605 [2024-09-27 15:22:02.372870] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to reset admin queue (timeout 30000 ms) 00:17:00.605 [2024-09-27 15:22:02.372876] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372883] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to identify controller (no timeout) 00:17:00.605 [2024-09-27 15:22:02.372892] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to wait for identify controller (timeout 30000 ms) 00:17:00.605 [2024-09-27 15:22:02.372903] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372911] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:0 nsid:0 cdw10:00000001 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0x1000 key:0x1bf200 00:17:00.605 [2024-09-27 15:22:02.372949] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.372955] 
nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0008 p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.372964] nvme_ctrlr.c:2077:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] transport max_xfer_size 4294967295 00:17:00.605 [2024-09-27 15:22:02.372971] nvme_ctrlr.c:2081:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] MDTS max_xfer_size 131072 00:17:00.605 [2024-09-27 15:22:02.372976] nvme_ctrlr.c:2084:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] CNTLID 0x0001 00:17:00.605 [2024-09-27 15:22:02.372983] nvme_ctrlr.c:2108:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] transport max_sges 16 00:17:00.605 [2024-09-27 15:22:02.372989] nvme_ctrlr.c:2123:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] fuses compare and write: 1 00:17:00.605 [2024-09-27 15:22:02.372995] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to configure AER (timeout 30000 ms) 00:17:00.605 [2024-09-27 15:22:02.373001] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373009] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to wait for configure aer (timeout 30000 ms) 00:17:00.605 [2024-09-27 15:22:02.373019] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373027] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ASYNC EVENT CONFIGURATION cid:0 cdw10:0000000b SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.605 [2024-09-27 15:22:02.373045] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.373051] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0009 p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.373061] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0480 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373068] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.605 [2024-09-27 15:22:02.373076] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d05c0 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373083] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.605 [2024-09-27 15:22:02.373090] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373097] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.605 [2024-09-27 15:22:02.373104] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0840 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373111] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.605 [2024-09-27 15:22:02.373117] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to set keep alive timeout 
(timeout 30000 ms) 00:17:00.605 [2024-09-27 15:22:02.373123] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373134] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to wait for set keep alive timeout (timeout 30000 ms) 00:17:00.605 [2024-09-27 15:22:02.373141] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373149] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES KEEP ALIVE TIMER cid:0 cdw10:0000000f SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.605 [2024-09-27 15:22:02.373165] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.373171] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:2710 sqhd:000a p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.373177] nvme_ctrlr.c:3046:nvme_ctrlr_set_keep_alive_timeout_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] Sending keep alive every 5000000 us 00:17:00.605 [2024-09-27 15:22:02.373184] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] setting state to ready (no timeout) 00:17:00.605 [2024-09-27 15:22:02.373190] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373199] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373207] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:0 nsid:0 cdw10:00000001 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0x1000 key:0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373229] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.373235] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:000b p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.373243] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373253] nvme_ctrlr.c:4189:nvme_ctrlr_process_init: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] Ctrlr already in ready state 00:17:00.605 [2024-09-27 15:22:02.373277] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373286] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:0 nsid:0 cdw10:00ff0070 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003cf000 len:0x400 key:0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373293] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373302] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: KEEP ALIVE (18) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.605 [2024-09-27 15:22:02.373316] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.605 [2024-09-27 15:22:02.373322] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:000c p:0 m:0 dnr:0 00:17:00.605 [2024-09-27 15:22:02.373335] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: 
local addr 0x2000003d0ac0 length 0x40 lkey 0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373350] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:02ff0070 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0xc00 key:0x1bf200 00:17:00.605 [2024-09-27 15:22:02.373357] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7b8 length 0x10 lkey 0x1bf200 00:17:00.606 [2024-09-27 15:22:02.373364] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.606 [2024-09-27 15:22:02.373369] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:0 sqhd:000d p:0 m:0 dnr:0 00:17:00.606 [2024-09-27 15:22:02.373375] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7e0 length 0x10 lkey 0x1bf200 00:17:00.606 [2024-09-27 15:22:02.373382] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.606 [2024-09-27 15:22:02.373388] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:6 cdw0:0 sqhd:000e p:0 m:0 dnr:0 00:17:00.606 [2024-09-27 15:22:02.373398] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.606 [2024-09-27 15:22:02.373405] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00010070 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003cf000 len:0x8 key:0x1bf200 00:17:00.606 [2024-09-27 15:22:02.373412] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf808 length 0x10 lkey 0x1bf200 00:17:00.606 [2024-09-27 15:22:02.373432] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.606 [2024-09-27 15:22:02.373437] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:17:00.606 [2024-09-27 15:22:02.373448] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf830 length 0x10 lkey 0x1bf200 00:17:00.606 ===================================================== 00:17:00.606 NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2014-08.org.nvmexpress.discovery 00:17:00.606 ===================================================== 00:17:00.606 Controller Capabilities/Features 00:17:00.606 ================================ 00:17:00.606 Vendor ID: 0000 00:17:00.606 Subsystem Vendor ID: 0000 00:17:00.606 Serial Number: .................... 00:17:00.606 Model Number: ........................................ 
00:17:00.606 Firmware Version: 25.01 00:17:00.606 Recommended Arb Burst: 0 00:17:00.606 IEEE OUI Identifier: 00 00 00 00:17:00.606 Multi-path I/O 00:17:00.606 May have multiple subsystem ports: No 00:17:00.606 May have multiple controllers: No 00:17:00.606 Associated with SR-IOV VF: No 00:17:00.606 Max Data Transfer Size: 131072 00:17:00.606 Max Number of Namespaces: 0 00:17:00.606 Max Number of I/O Queues: 1024 00:17:00.606 NVMe Specification Version (VS): 1.3 00:17:00.606 NVMe Specification Version (Identify): 1.3 00:17:00.606 Maximum Queue Entries: 128 00:17:00.606 Contiguous Queues Required: Yes 00:17:00.606 Arbitration Mechanisms Supported 00:17:00.606 Weighted Round Robin: Not Supported 00:17:00.606 Vendor Specific: Not Supported 00:17:00.606 Reset Timeout: 15000 ms 00:17:00.606 Doorbell Stride: 4 bytes 00:17:00.606 NVM Subsystem Reset: Not Supported 00:17:00.606 Command Sets Supported 00:17:00.606 NVM Command Set: Supported 00:17:00.606 Boot Partition: Not Supported 00:17:00.606 Memory Page Size Minimum: 4096 bytes 00:17:00.606 Memory Page Size Maximum: 4096 bytes 00:17:00.606 Persistent Memory Region: Not Supported 00:17:00.606 Optional Asynchronous Events Supported 00:17:00.606 Namespace Attribute Notices: Not Supported 00:17:00.606 Firmware Activation Notices: Not Supported 00:17:00.606 ANA Change Notices: Not Supported 00:17:00.606 PLE Aggregate Log Change Notices: Not Supported 00:17:00.606 LBA Status Info Alert Notices: Not Supported 00:17:00.606 EGE Aggregate Log Change Notices: Not Supported 00:17:00.606 Normal NVM Subsystem Shutdown event: Not Supported 00:17:00.606 Zone Descriptor Change Notices: Not Supported 00:17:00.606 Discovery Log Change Notices: Supported 00:17:00.606 Controller Attributes 00:17:00.606 128-bit Host Identifier: Not Supported 00:17:00.606 Non-Operational Permissive Mode: Not Supported 00:17:00.606 NVM Sets: Not Supported 00:17:00.606 Read Recovery Levels: Not Supported 00:17:00.606 Endurance Groups: Not Supported 00:17:00.606 Predictable Latency Mode: Not Supported 00:17:00.606 Traffic Based Keep ALive: Not Supported 00:17:00.606 Namespace Granularity: Not Supported 00:17:00.606 SQ Associations: Not Supported 00:17:00.606 UUID List: Not Supported 00:17:00.606 Multi-Domain Subsystem: Not Supported 00:17:00.606 Fixed Capacity Management: Not Supported 00:17:00.606 Variable Capacity Management: Not Supported 00:17:00.606 Delete Endurance Group: Not Supported 00:17:00.606 Delete NVM Set: Not Supported 00:17:00.606 Extended LBA Formats Supported: Not Supported 00:17:00.606 Flexible Data Placement Supported: Not Supported 00:17:00.606 00:17:00.606 Controller Memory Buffer Support 00:17:00.606 ================================ 00:17:00.606 Supported: No 00:17:00.606 00:17:00.606 Persistent Memory Region Support 00:17:00.606 ================================ 00:17:00.606 Supported: No 00:17:00.606 00:17:00.606 Admin Command Set Attributes 00:17:00.606 ============================ 00:17:00.606 Security Send/Receive: Not Supported 00:17:00.606 Format NVM: Not Supported 00:17:00.606 Firmware Activate/Download: Not Supported 00:17:00.606 Namespace Management: Not Supported 00:17:00.606 Device Self-Test: Not Supported 00:17:00.606 Directives: Not Supported 00:17:00.606 NVMe-MI: Not Supported 00:17:00.606 Virtualization Management: Not Supported 00:17:00.606 Doorbell Buffer Config: Not Supported 00:17:00.606 Get LBA Status Capability: Not Supported 00:17:00.606 Command & Feature Lockdown Capability: Not Supported 00:17:00.606 Abort Command Limit: 1 00:17:00.606 Async 
Event Request Limit: 4 00:17:00.606 Number of Firmware Slots: N/A 00:17:00.606 Firmware Slot 1 Read-Only: N/A 00:17:00.606 Firmware Activation Without Reset: N/A 00:17:00.606 Multiple Update Detection Support: N/A 00:17:00.606 Firmware Update Granularity: No Information Provided 00:17:00.606 Per-Namespace SMART Log: No 00:17:00.606 Asymmetric Namespace Access Log Page: Not Supported 00:17:00.606 Subsystem NQN: nqn.2014-08.org.nvmexpress.discovery 00:17:00.606 Command Effects Log Page: Not Supported 00:17:00.606 Get Log Page Extended Data: Supported 00:17:00.606 Telemetry Log Pages: Not Supported 00:17:00.606 Persistent Event Log Pages: Not Supported 00:17:00.606 Supported Log Pages Log Page: May Support 00:17:00.606 Commands Supported & Effects Log Page: Not Supported 00:17:00.606 Feature Identifiers & Effects Log Page:May Support 00:17:00.606 NVMe-MI Commands & Effects Log Page: May Support 00:17:00.606 Data Area 4 for Telemetry Log: Not Supported 00:17:00.606 Error Log Page Entries Supported: 128 00:17:00.606 Keep Alive: Not Supported 00:17:00.606 00:17:00.606 NVM Command Set Attributes 00:17:00.606 ========================== 00:17:00.606 Submission Queue Entry Size 00:17:00.606 Max: 1 00:17:00.606 Min: 1 00:17:00.606 Completion Queue Entry Size 00:17:00.606 Max: 1 00:17:00.606 Min: 1 00:17:00.606 Number of Namespaces: 0 00:17:00.606 Compare Command: Not Supported 00:17:00.606 Write Uncorrectable Command: Not Supported 00:17:00.606 Dataset Management Command: Not Supported 00:17:00.606 Write Zeroes Command: Not Supported 00:17:00.606 Set Features Save Field: Not Supported 00:17:00.606 Reservations: Not Supported 00:17:00.606 Timestamp: Not Supported 00:17:00.606 Copy: Not Supported 00:17:00.606 Volatile Write Cache: Not Present 00:17:00.606 Atomic Write Unit (Normal): 1 00:17:00.606 Atomic Write Unit (PFail): 1 00:17:00.606 Atomic Compare & Write Unit: 1 00:17:00.606 Fused Compare & Write: Supported 00:17:00.606 Scatter-Gather List 00:17:00.606 SGL Command Set: Supported 00:17:00.606 SGL Keyed: Supported 00:17:00.606 SGL Bit Bucket Descriptor: Not Supported 00:17:00.606 SGL Metadata Pointer: Not Supported 00:17:00.606 Oversized SGL: Not Supported 00:17:00.606 SGL Metadata Address: Not Supported 00:17:00.606 SGL Offset: Supported 00:17:00.606 Transport SGL Data Block: Not Supported 00:17:00.606 Replay Protected Memory Block: Not Supported 00:17:00.606 00:17:00.606 Firmware Slot Information 00:17:00.606 ========================= 00:17:00.607 Active slot: 0 00:17:00.607 00:17:00.607 00:17:00.607 Error Log 00:17:00.607 ========= 00:17:00.607 00:17:00.607 Active Namespaces 00:17:00.607 ================= 00:17:00.607 Discovery Log Page 00:17:00.607 ================== 00:17:00.607 Generation Counter: 2 00:17:00.607 Number of Records: 2 00:17:00.607 Record Format: 0 00:17:00.607 00:17:00.607 Discovery Log Entry 0 00:17:00.607 ---------------------- 00:17:00.607 Transport Type: 1 (RDMA) 00:17:00.607 Address Family: 1 (IPv4) 00:17:00.607 Subsystem Type: 3 (Current Discovery Subsystem) 00:17:00.607 Entry Flags: 00:17:00.607 Duplicate Returned Information: 1 00:17:00.607 Explicit Persistent Connection Support for Discovery: 1 00:17:00.607 Transport Requirements: 00:17:00.607 Secure Channel: Not Required 00:17:00.607 Port ID: 0 (0x0000) 00:17:00.607 Controller ID: 65535 (0xffff) 00:17:00.607 Admin Max SQ Size: 128 00:17:00.607 Transport Service Identifier: 4420 00:17:00.607 NVM Subsystem Qualified Name: nqn.2014-08.org.nvmexpress.discovery 00:17:00.607 Transport Address: 10.0.0.2 00:17:00.607 
Transport Specific Address Subtype - RDMA 00:17:00.607 RDMA QP Service Type: 1 (Reliable Connected) 00:17:00.607 RDMA Provider Type: 1 (No provider specified) 00:17:00.607 RDMA CM Service: 1 (RDMA_CM) 00:17:00.607 Discovery Log Entry 1 00:17:00.607 ---------------------- 00:17:00.607 Transport Type: 1 (RDMA) 00:17:00.607 Address Family: 1 (IPv4) 00:17:00.607 Subsystem Type: 2 (NVM Subsystem) 00:17:00.607 Entry Flags: 00:17:00.607 Duplicate Returned Information: 0 00:17:00.607 Explicit Persistent Connection Support for Discovery: 0 00:17:00.607 Transport Requirements: 00:17:00.607 Secure Channel: Not Required 00:17:00.607 Port ID: 0 (0x0000) 00:17:00.607 Controller ID: 65535 (0xffff) 00:17:00.607 Admin Max SQ Size: [2024-09-27 15:22:02.373519] nvme_ctrlr.c:4386:nvme_ctrlr_destruct_async: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] Prepare to destruct SSD 00:17:00.607 [2024-09-27 15:22:02.373530] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 41204 doesn't match qid 00:17:00.607 [2024-09-27 15:22:02.373544] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32547 cdw0:5 sqhd:c9b0 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373551] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 41204 doesn't match qid 00:17:00.607 [2024-09-27 15:22:02.373560] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32547 cdw0:5 sqhd:c9b0 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373566] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 41204 doesn't match qid 00:17:00.607 [2024-09-27 15:22:02.373574] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32547 cdw0:5 sqhd:c9b0 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373581] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 41204 doesn't match qid 00:17:00.607 [2024-09-27 15:22:02.373589] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32547 cdw0:5 sqhd:c9b0 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373598] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0840 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373607] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:4 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373628] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373636] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:4 cdw0:460001 sqhd:0010 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373644] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373653] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY SET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373660] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf858 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373675] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373681] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373688] nvme_ctrlr.c:1147:nvme_ctrlr_shutdown_set_cc_done: *DEBUG*: 
[nqn.2014-08.org.nvmexpress.discovery] RTD3E = 0 us 00:17:00.607 [2024-09-27 15:22:02.373696] nvme_ctrlr.c:1150:nvme_ctrlr_shutdown_set_cc_done: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] shutdown timeout = 10000 ms 00:17:00.607 [2024-09-27 15:22:02.373703] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf880 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373712] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373721] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373737] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373743] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0012 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373751] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8a8 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373762] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373770] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373787] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373794] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0013 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373801] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8d0 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373811] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373820] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373840] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373847] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0014 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373854] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8f8 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373863] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373871] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373886] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373892] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0015 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373899] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf920 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373910] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 
00:17:00.607 [2024-09-27 15:22:02.373919] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373933] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373939] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0016 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373946] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf948 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373955] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.373963] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.373979] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.373985] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0017 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.373991] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf970 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.374000] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.374008] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.374030] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.374036] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0018 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.374042] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf998 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.374051] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.374059] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.607 [2024-09-27 15:22:02.374079] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.607 [2024-09-27 15:22:02.374085] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0019 p:0 m:0 dnr:0 00:17:00.607 [2024-09-27 15:22:02.374091] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9c0 length 0x10 lkey 0x1bf200 00:17:00.607 [2024-09-27 15:22:02.374100] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374108] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374124] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374130] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001a p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374137] 
nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9e8 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374146] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374153] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374174] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374180] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001b p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374186] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa10 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374197] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374205] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374223] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374229] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001c p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374235] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa38 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374244] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374252] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374268] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374274] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001d p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374280] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa60 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374289] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374297] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374317] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374323] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001e p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374329] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa88 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374338] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374353] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374369] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374375] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001f p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374382] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfab0 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374391] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374398] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374415] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374420] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0000 p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374427] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374436] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374444] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374462] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374467] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0001 p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374475] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374484] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374492] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374510] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374516] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0002 p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374523] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374532] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374539] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374563] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374569] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0003 p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374576] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374584] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 
0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374592] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374612] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.608 [2024-09-27 15:22:02.374618] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0004 p:0 m:0 dnr:0 00:17:00.608 [2024-09-27 15:22:02.374625] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374634] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.608 [2024-09-27 15:22:02.374642] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.608 [2024-09-27 15:22:02.374662] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374667] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0005 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.374674] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374683] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374691] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374711] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374716] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0006 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.374723] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374732] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374740] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374756] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374761] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0007 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.374769] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374779] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374786] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374804] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374810] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0008 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 
15:22:02.374817] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374826] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374833] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374853] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374859] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0009 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.374866] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374875] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374882] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374902] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374908] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000a p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.374915] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374924] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374931] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374951] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.374957] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000b p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.374964] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7b8 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374973] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.374980] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.374995] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375000] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000c p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375007] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7e0 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375016] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375024] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375040] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375047] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000d p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375054] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf808 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375063] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375071] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375087] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375092] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000e p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375099] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf830 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375108] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375116] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375136] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375141] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000f p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375148] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf858 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375157] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375165] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375181] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375187] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0010 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375193] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf880 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375202] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375210] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375230] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375236] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0011 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375242] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8a8 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375251] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 
0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375259] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375277] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375283] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0012 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375289] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8d0 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375298] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375306] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375326] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375333] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0013 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375344] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8f8 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375353] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375361] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375376] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.609 [2024-09-27 15:22:02.375381] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0014 p:0 m:0 dnr:0 00:17:00.609 [2024-09-27 15:22:02.375388] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf920 length 0x10 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375397] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.609 [2024-09-27 15:22:02.375405] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.609 [2024-09-27 15:22:02.375419] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375425] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0015 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375431] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf948 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375440] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375448] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375470] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375476] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0016 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 
15:22:02.375482] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf970 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375491] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375499] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375519] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375525] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0017 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375531] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf998 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375540] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375548] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375564] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375570] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0018 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375576] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9c0 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375585] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375593] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375613] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375619] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0019 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375625] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9e8 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375634] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375642] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375662] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375668] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001a p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375674] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa10 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375683] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375691] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375707] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375713] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001b p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375719] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa38 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375728] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375736] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375758] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375764] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001c p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375770] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa60 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375779] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375787] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375803] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375809] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001d p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375815] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa88 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375824] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375832] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375848] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375854] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001e p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375860] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfab0 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375869] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375877] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375891] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375897] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001f p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375903] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375912] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 
0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375920] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375942] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375948] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0000 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375954] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375963] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.375971] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.375987] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.375993] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0001 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.375999] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.376008] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.376016] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.376034] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.376040] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0002 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.376046] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.376055] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.376063] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.376081] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.610 [2024-09-27 15:22:02.376087] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0003 p:0 m:0 dnr:0 00:17:00.610 [2024-09-27 15:22:02.376093] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.376102] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.610 [2024-09-27 15:22:02.376110] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.610 [2024-09-27 15:22:02.376128] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.611 [2024-09-27 15:22:02.376134] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0004 p:0 m:0 dnr:0 00:17:00.611 [2024-09-27 
15:22:02.376140] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376149] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376159] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.611 [2024-09-27 15:22:02.376177] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.611 [2024-09-27 15:22:02.376183] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0005 p:0 m:0 dnr:0 00:17:00.611 [2024-09-27 15:22:02.376189] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376198] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376206] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.611 [2024-09-27 15:22:02.376224] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.611 [2024-09-27 15:22:02.376230] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0006 p:0 m:0 dnr:0 00:17:00.611 [2024-09-27 15:22:02.376236] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376245] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376253] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.611 [2024-09-27 15:22:02.376275] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.611 [2024-09-27 15:22:02.376281] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0007 p:0 m:0 dnr:0 00:17:00.611 [2024-09-27 15:22:02.376287] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376296] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.376304] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.611 [2024-09-27 15:22:02.376320] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.611 [2024-09-27 15:22:02.376326] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0008 p:0 m:0 dnr:0 00:17:00.611 [2024-09-27 15:22:02.376332] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.380348] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.611 [2024-09-27 15:22:02.380358] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.611 [2024-09-27 15:22:02.380375] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion
00:17:00.611 [2024-09-27 15:22:02.380380] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:9 sqhd:0009 p:0 m:0 dnr:0
00:17:00.611 [2024-09-27 15:22:02.380387] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200
00:17:00.611 [2024-09-27 15:22:02.380394] nvme_ctrlr.c:1269:nvme_ctrlr_shutdown_poll_async: *DEBUG*: [nqn.2014-08.org.nvmexpress.discovery] shutdown complete in 6 milliseconds
00:17:00.611 128
00:17:00.611 Transport Service Identifier: 4420
00:17:00.611 NVM Subsystem Qualified Name: nqn.2016-06.io.spdk:cnode1
00:17:00.611 Transport Address: 10.0.0.2
00:17:00.611 Transport Specific Address Subtype - RDMA
00:17:00.611 RDMA QP Service Type: 1 (Reliable Connected)
00:17:00.611 RDMA Provider Type: 1 (No provider specified)
00:17:00.611 RDMA CM Service: 1 (RDMA_CM)
00:17:00.611 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@45 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_identify -r ' trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:cnode1' -L all
00:17:00.875 [2024-09-27 15:22:02.452619] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization...
00:17:00.875 [2024-09-27 15:22:02.452662] [ DPDK EAL parameters: identify --no-shconf -c 0x1 -n 1 -m 0 --no-pci --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1818589 ]
00:17:00.875 [2024-09-27 15:22:02.499373] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to connect adminq (no timeout)
00:17:00.875 [2024-09-27 15:22:02.499441] nvme_rdma.c:2214:nvme_rdma_ctrlr_construct: *DEBUG*: successfully initialized the nvmf ctrlr
00:17:00.875 [2024-09-27 15:22:02.499460] nvme_rdma.c:1215:nvme_rdma_ctrlr_connect_qpair: *DEBUG*: adrfam 1 ai_family 2
00:17:00.875 [2024-09-27 15:22:02.499465] nvme_rdma.c:1219:nvme_rdma_ctrlr_connect_qpair: *DEBUG*: trsvcid is 4420
00:17:00.875 [2024-09-27 15:22:02.499489] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for connect adminq (no timeout)
00:17:00.875 [2024-09-27 15:22:02.514731] nvme_rdma.c: 431:nvme_rdma_qpair_process_cm_event: *DEBUG*: Requested queue depth 32. Target receive queue depth 32.
00:17:00.875 [2024-09-27 15:22:02.528649] nvme_rdma.c:1101:nvme_rdma_connect_established: *DEBUG*: rc =0 00:17:00.875 [2024-09-27 15:22:02.528659] nvme_rdma.c:1106:nvme_rdma_connect_established: *DEBUG*: RDMA requests created 00:17:00.875 [2024-09-27 15:22:02.528667] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528674] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528680] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528687] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528693] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528699] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528706] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528712] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528718] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528724] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528730] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528737] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf7b8 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528743] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf7e0 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528749] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf808 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528755] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf830 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528762] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf858 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528768] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf880 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528774] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf8a8 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528783] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf8d0 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528790] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf8f8 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528796] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf920 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528802] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf948 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528808] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf970 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 
15:22:02.528815] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf998 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528821] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf9c0 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528827] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cf9e8 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528833] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa10 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528840] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa38 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528846] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa60 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528852] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfa88 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528858] nvme_rdma.c: 889:nvme_rdma_create_rsps: *DEBUG*: local addr 0x2000003cfab0 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528864] nvme_rdma.c:1120:nvme_rdma_connect_established: *DEBUG*: RDMA responses created 00:17:00.875 [2024-09-27 15:22:02.528870] nvme_rdma.c:1123:nvme_rdma_connect_established: *DEBUG*: rc =0 00:17:00.875 [2024-09-27 15:22:02.528874] nvme_rdma.c:1128:nvme_rdma_connect_established: *DEBUG*: RDMA responses submitted 00:17:00.875 [2024-09-27 15:22:02.528891] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.528904] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x2000003cf180 len:0x400 key:0x1bf200 00:17:00.875 [2024-09-27 15:22:02.534347] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.875 [2024-09-27 15:22:02.534356] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:1 sqhd:0001 p:0 m:0 dnr:0 00:17:00.875 [2024-09-27 15:22:02.534363] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.534370] nvme_fabric.c: 621:nvme_fabric_qpair_connect_poll: *DEBUG*: CNTLID 0x0001 00:17:00.875 [2024-09-27 15:22:02.534377] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to read vs (no timeout) 00:17:00.875 [2024-09-27 15:22:02.534384] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to read vs wait for vs (no timeout) 00:17:00.875 [2024-09-27 15:22:02.534396] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.875 [2024-09-27 15:22:02.534404] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.534426] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534432] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:10300 sqhd:0002 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534438] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to read cap (no timeout) 00:17:00.876 [2024-09-27 15:22:02.534444] nvme_rdma.c:2389:nvme_rdma_request_ready: 
*DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534451] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to read cap wait for cap (no timeout) 00:17:00.876 [2024-09-27 15:22:02.534462] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534470] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.534492] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534497] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:1e01007f sqhd:0003 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534505] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to check en (no timeout) 00:17:00.876 [2024-09-27 15:22:02.534511] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534518] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to check en wait for cc (timeout 15000 ms) 00:17:00.876 [2024-09-27 15:22:02.534526] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534534] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.534552] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534557] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0004 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534564] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to disable and wait for CSTS.RDY = 0 (timeout 15000 ms) 00:17:00.876 [2024-09-27 15:22:02.534570] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534579] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534586] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.534604] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534610] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0005 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534616] nvme_ctrlr.c:3893:nvme_ctrlr_process_init_wait_for_ready_0: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] CC.EN = 0 && CSTS.RDY = 0 00:17:00.876 [2024-09-27 15:22:02.534622] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to controller is disabled (timeout 15000 ms) 00:17:00.876 [2024-09-27 15:22:02.534628] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534635] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to enable controller by 
writing CC.EN = 1 (timeout 15000 ms) 00:17:00.876 [2024-09-27 15:22:02.534742] nvme_ctrlr.c:4091:nvme_ctrlr_process_init: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] Setting CC.EN = 1 00:17:00.876 [2024-09-27 15:22:02.534747] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to enable controller by writing CC.EN = 1 reg (timeout 15000 ms) 00:17:00.876 [2024-09-27 15:22:02.534755] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534763] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY SET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.534779] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534785] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0006 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534792] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for CSTS.RDY = 1 (timeout 15000 ms) 00:17:00.876 [2024-09-27 15:22:02.534799] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534808] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534815] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:0 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.534832] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534837] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:1 sqhd:0007 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534844] nvme_ctrlr.c:3928:nvme_ctrlr_process_init_enable_wait_for_ready_1: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] CC.EN = 1 && CSTS.RDY = 1 - controller is ready 00:17:00.876 [2024-09-27 15:22:02.534850] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to reset admin queue (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.534856] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534863] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to identify controller (no timeout) 00:17:00.876 [2024-09-27 15:22:02.534871] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for identify controller (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.534881] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534889] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:0 nsid:0 cdw10:00000001 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0x1000 key:0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534931] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.534937] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0008 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.534946] nvme_ctrlr.c:2077:nvme_ctrlr_identify_done: 
*DEBUG*: [nqn.2016-06.io.spdk:cnode1] transport max_xfer_size 4294967295 00:17:00.876 [2024-09-27 15:22:02.534952] nvme_ctrlr.c:2081:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] MDTS max_xfer_size 131072 00:17:00.876 [2024-09-27 15:22:02.534958] nvme_ctrlr.c:2084:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] CNTLID 0x0001 00:17:00.876 [2024-09-27 15:22:02.534963] nvme_ctrlr.c:2108:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] transport max_sges 16 00:17:00.876 [2024-09-27 15:22:02.534969] nvme_ctrlr.c:2123:nvme_ctrlr_identify_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] fuses compare and write: 1 00:17:00.876 [2024-09-27 15:22:02.534975] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to configure AER (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.534981] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.534988] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for configure aer (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.534998] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535006] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ASYNC EVENT CONFIGURATION cid:0 cdw10:0000000b SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.535025] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.535030] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0009 p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.535039] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0480 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535046] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.876 [2024-09-27 15:22:02.535055] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d05c0 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535062] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.876 [2024-09-27 15:22:02.535069] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535076] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.876 [2024-09-27 15:22:02.535083] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0840 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535090] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.876 [2024-09-27 15:22:02.535096] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set keep alive timeout (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.535102] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535113] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] 
setting state to wait for set keep alive timeout (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.535120] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535128] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES KEEP ALIVE TIMER cid:0 cdw10:0000000f SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.876 [2024-09-27 15:22:02.535143] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.876 [2024-09-27 15:22:02.535148] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:2710 sqhd:000a p:0 m:0 dnr:0 00:17:00.876 [2024-09-27 15:22:02.535155] nvme_ctrlr.c:3046:nvme_ctrlr_set_keep_alive_timeout_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] Sending keep alive every 5000000 us 00:17:00.876 [2024-09-27 15:22:02.535161] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to identify controller iocs specific (timeout 30000 ms) 00:17:00.876 [2024-09-27 15:22:02.535167] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.876 [2024-09-27 15:22:02.535176] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set number of queues (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535183] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for set number of queues (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535191] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535198] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NUMBER OF QUEUES cid:0 cdw10:00000007 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.877 [2024-09-27 15:22:02.535218] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535224] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:7e007e sqhd:000b p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535276] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to identify active ns (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535282] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535291] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for identify active ns (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535299] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535307] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:0 nsid:0 cdw10:00000002 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003cc000 len:0x1000 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535329] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535335] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:000c p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535353] nvme_ctrlr.c:4722:spdk_nvme_ctrlr_get_ns: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] Namespace 1 was added 00:17:00.877 
[2024-09-27 15:22:02.535368] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to identify ns (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535374] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7b8 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535382] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for identify ns (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535390] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535398] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:0 nsid:1 cdw10:00000000 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0x1000 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535432] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535437] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:000d p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535449] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to identify namespace id descriptors (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535456] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7e0 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535464] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to wait for identify namespace id descriptors (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535472] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535480] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:0 nsid:1 cdw10:00000003 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0x1000 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535500] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535506] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:000e p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535517] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to identify ns iocs specific (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535523] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf808 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535530] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set supported log pages (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535541] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set supported features (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535548] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set host behavior support feature (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535555] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set doorbell buffer config (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535561] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to set host 
ID (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535567] nvme_ctrlr.c:3134:nvme_ctrlr_set_host_id: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] NVMe-oF transport - not sending Set Features - Host ID 00:17:00.877 [2024-09-27 15:22:02.535573] nvme_ctrlr.c:1557:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to transport ready (timeout 30000 ms) 00:17:00.877 [2024-09-27 15:22:02.535582] nvme_ctrlr.c:1563:_nvme_ctrlr_set_state: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] setting state to ready (no timeout) 00:17:00.877 [2024-09-27 15:22:02.535597] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535605] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES ARBITRATION cid:0 cdw10:00000001 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.877 [2024-09-27 15:22:02.535613] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535620] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: KEEP ALIVE (18) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:17:00.877 [2024-09-27 15:22:02.535631] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535637] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535643] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf830 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535650] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535655] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535661] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf858 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535671] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535679] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES POWER MANAGEMENT cid:5 cdw10:00000002 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.877 [2024-09-27 15:22:02.535696] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535702] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535709] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf880 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535718] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535725] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES TEMPERATURE THRESHOLD cid:5 cdw10:00000004 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.877 [2024-09-27 15:22:02.535743] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535748] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535755] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: 
local addr 0x2000003cf8a8 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535764] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535772] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES NUMBER OF QUEUES cid:5 cdw10:00000007 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.877 [2024-09-27 15:22:02.535792] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535798] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:7e007e sqhd:0013 p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535804] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8d0 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535818] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0980 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535826] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:ffffffff cdw10:07ff0001 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003c9000 len:0x2000 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535836] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0340 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535844] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:0 nsid:ffffffff cdw10:007f0002 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003cf000 len:0x200 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535853] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0ac0 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535860] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:ffffffff cdw10:007f0003 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003ce000 len:0x200 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535869] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0c00 length 0x40 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535876] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:ffffffff cdw10:03ff0005 cdw11:00000000 SGL KEYED DATA BLOCK ADDRESS 0x2000003c7000 len:0x1000 key:0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535885] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535891] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:5 cdw0:0 sqhd:0014 p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535903] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8f8 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535910] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.877 [2024-09-27 15:22:02.535915] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:0 cdw0:0 sqhd:0015 p:0 m:0 dnr:0 00:17:00.877 [2024-09-27 15:22:02.535926] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf920 length 0x10 lkey 0x1bf200 00:17:00.877 [2024-09-27 15:22:02.535933] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.878 [2024-09-27 15:22:02.535939] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:6 cdw0:0 sqhd:0016 p:0 m:0 dnr:0 
00:17:00.878 [2024-09-27 15:22:02.535946] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf948 length 0x10 lkey 0x1bf200 00:17:00.878 [2024-09-27 15:22:02.535952] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.878 [2024-09-27 15:22:02.535958] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:7 cdw0:0 sqhd:0017 p:0 m:0 dnr:0 00:17:00.878 [2024-09-27 15:22:02.535967] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf970 length 0x10 lkey 0x1bf200 00:17:00.878 ===================================================== 00:17:00.878 NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:17:00.878 ===================================================== 00:17:00.878 Controller Capabilities/Features 00:17:00.878 ================================ 00:17:00.878 Vendor ID: 8086 00:17:00.878 Subsystem Vendor ID: 8086 00:17:00.878 Serial Number: SPDK00000000000001 00:17:00.878 Model Number: SPDK bdev Controller 00:17:00.878 Firmware Version: 25.01 00:17:00.878 Recommended Arb Burst: 6 00:17:00.878 IEEE OUI Identifier: e4 d2 5c 00:17:00.878 Multi-path I/O 00:17:00.878 May have multiple subsystem ports: Yes 00:17:00.878 May have multiple controllers: Yes 00:17:00.878 Associated with SR-IOV VF: No 00:17:00.878 Max Data Transfer Size: 131072 00:17:00.878 Max Number of Namespaces: 32 00:17:00.878 Max Number of I/O Queues: 127 00:17:00.878 NVMe Specification Version (VS): 1.3 00:17:00.878 NVMe Specification Version (Identify): 1.3 00:17:00.878 Maximum Queue Entries: 128 00:17:00.878 Contiguous Queues Required: Yes 00:17:00.878 Arbitration Mechanisms Supported 00:17:00.878 Weighted Round Robin: Not Supported 00:17:00.878 Vendor Specific: Not Supported 00:17:00.878 Reset Timeout: 15000 ms 00:17:00.878 Doorbell Stride: 4 bytes 00:17:00.878 NVM Subsystem Reset: Not Supported 00:17:00.878 Command Sets Supported 00:17:00.878 NVM Command Set: Supported 00:17:00.878 Boot Partition: Not Supported 00:17:00.878 Memory Page Size Minimum: 4096 bytes 00:17:00.878 Memory Page Size Maximum: 4096 bytes 00:17:00.878 Persistent Memory Region: Not Supported 00:17:00.878 Optional Asynchronous Events Supported 00:17:00.878 Namespace Attribute Notices: Supported 00:17:00.878 Firmware Activation Notices: Not Supported 00:17:00.878 ANA Change Notices: Not Supported 00:17:00.878 PLE Aggregate Log Change Notices: Not Supported 00:17:00.878 LBA Status Info Alert Notices: Not Supported 00:17:00.878 EGE Aggregate Log Change Notices: Not Supported 00:17:00.878 Normal NVM Subsystem Shutdown event: Not Supported 00:17:00.878 Zone Descriptor Change Notices: Not Supported 00:17:00.878 Discovery Log Change Notices: Not Supported 00:17:00.878 Controller Attributes 00:17:00.878 128-bit Host Identifier: Supported 00:17:00.878 Non-Operational Permissive Mode: Not Supported 00:17:00.878 NVM Sets: Not Supported 00:17:00.878 Read Recovery Levels: Not Supported 00:17:00.878 Endurance Groups: Not Supported 00:17:00.878 Predictable Latency Mode: Not Supported 00:17:00.878 Traffic Based Keep ALive: Not Supported 00:17:00.878 Namespace Granularity: Not Supported 00:17:00.878 SQ Associations: Not Supported 00:17:00.878 UUID List: Not Supported 00:17:00.878 Multi-Domain Subsystem: Not Supported 00:17:00.878 Fixed Capacity Management: Not Supported 00:17:00.878 Variable Capacity Management: Not Supported 00:17:00.878 Delete Endurance Group: Not Supported 00:17:00.878 Delete NVM Set: Not Supported 00:17:00.878 Extended LBA 
Formats Supported: Not Supported 00:17:00.878 Flexible Data Placement Supported: Not Supported 00:17:00.878 00:17:00.878 Controller Memory Buffer Support 00:17:00.878 ================================ 00:17:00.878 Supported: No 00:17:00.878 00:17:00.878 Persistent Memory Region Support 00:17:00.878 ================================ 00:17:00.878 Supported: No 00:17:00.878 00:17:00.878 Admin Command Set Attributes 00:17:00.878 ============================ 00:17:00.878 Security Send/Receive: Not Supported 00:17:00.878 Format NVM: Not Supported 00:17:00.878 Firmware Activate/Download: Not Supported 00:17:00.878 Namespace Management: Not Supported 00:17:00.878 Device Self-Test: Not Supported 00:17:00.878 Directives: Not Supported 00:17:00.878 NVMe-MI: Not Supported 00:17:00.878 Virtualization Management: Not Supported 00:17:00.878 Doorbell Buffer Config: Not Supported 00:17:00.878 Get LBA Status Capability: Not Supported 00:17:00.878 Command & Feature Lockdown Capability: Not Supported 00:17:00.878 Abort Command Limit: 4 00:17:00.878 Async Event Request Limit: 4 00:17:00.878 Number of Firmware Slots: N/A 00:17:00.878 Firmware Slot 1 Read-Only: N/A 00:17:00.878 Firmware Activation Without Reset: N/A 00:17:00.878 Multiple Update Detection Support: N/A 00:17:00.878 Firmware Update Granularity: No Information Provided 00:17:00.878 Per-Namespace SMART Log: No 00:17:00.878 Asymmetric Namespace Access Log Page: Not Supported 00:17:00.878 Subsystem NQN: nqn.2016-06.io.spdk:cnode1 00:17:00.878 Command Effects Log Page: Supported 00:17:00.878 Get Log Page Extended Data: Supported 00:17:00.878 Telemetry Log Pages: Not Supported 00:17:00.878 Persistent Event Log Pages: Not Supported 00:17:00.878 Supported Log Pages Log Page: May Support 00:17:00.878 Commands Supported & Effects Log Page: Not Supported 00:17:00.878 Feature Identifiers & Effects Log Page:May Support 00:17:00.878 NVMe-MI Commands & Effects Log Page: May Support 00:17:00.878 Data Area 4 for Telemetry Log: Not Supported 00:17:00.878 Error Log Page Entries Supported: 128 00:17:00.878 Keep Alive: Supported 00:17:00.878 Keep Alive Granularity: 10000 ms 00:17:00.878 00:17:00.878 NVM Command Set Attributes 00:17:00.878 ========================== 00:17:00.878 Submission Queue Entry Size 00:17:00.878 Max: 64 00:17:00.878 Min: 64 00:17:00.878 Completion Queue Entry Size 00:17:00.878 Max: 16 00:17:00.878 Min: 16 00:17:00.878 Number of Namespaces: 32 00:17:00.878 Compare Command: Supported 00:17:00.878 Write Uncorrectable Command: Not Supported 00:17:00.878 Dataset Management Command: Supported 00:17:00.878 Write Zeroes Command: Supported 00:17:00.878 Set Features Save Field: Not Supported 00:17:00.878 Reservations: Supported 00:17:00.878 Timestamp: Not Supported 00:17:00.878 Copy: Supported 00:17:00.878 Volatile Write Cache: Present 00:17:00.878 Atomic Write Unit (Normal): 1 00:17:00.878 Atomic Write Unit (PFail): 1 00:17:00.878 Atomic Compare & Write Unit: 1 00:17:00.878 Fused Compare & Write: Supported 00:17:00.878 Scatter-Gather List 00:17:00.878 SGL Command Set: Supported 00:17:00.878 SGL Keyed: Supported 00:17:00.878 SGL Bit Bucket Descriptor: Not Supported 00:17:00.878 SGL Metadata Pointer: Not Supported 00:17:00.878 Oversized SGL: Not Supported 00:17:00.878 SGL Metadata Address: Not Supported 00:17:00.878 SGL Offset: Supported 00:17:00.878 Transport SGL Data Block: Not Supported 00:17:00.878 Replay Protected Memory Block: Not Supported 00:17:00.878 00:17:00.878 Firmware Slot Information 00:17:00.878 ========================= 00:17:00.878 Active 
slot: 1 00:17:00.878 Slot 1 Firmware Revision: 25.01 00:17:00.878 00:17:00.878 00:17:00.878 Commands Supported and Effects 00:17:00.878 ============================== 00:17:00.878 Admin Commands 00:17:00.878 -------------- 00:17:00.878 Get Log Page (02h): Supported 00:17:00.878 Identify (06h): Supported 00:17:00.878 Abort (08h): Supported 00:17:00.878 Set Features (09h): Supported 00:17:00.878 Get Features (0Ah): Supported 00:17:00.878 Asynchronous Event Request (0Ch): Supported 00:17:00.878 Keep Alive (18h): Supported 00:17:00.878 I/O Commands 00:17:00.878 ------------ 00:17:00.878 Flush (00h): Supported LBA-Change 00:17:00.878 Write (01h): Supported LBA-Change 00:17:00.878 Read (02h): Supported 00:17:00.878 Compare (05h): Supported 00:17:00.878 Write Zeroes (08h): Supported LBA-Change 00:17:00.878 Dataset Management (09h): Supported LBA-Change 00:17:00.878 Copy (19h): Supported LBA-Change 00:17:00.878 00:17:00.878 Error Log 00:17:00.878 ========= 00:17:00.878 00:17:00.878 Arbitration 00:17:00.878 =========== 00:17:00.878 Arbitration Burst: 1 00:17:00.879 00:17:00.879 Power Management 00:17:00.879 ================ 00:17:00.879 Number of Power States: 1 00:17:00.879 Current Power State: Power State #0 00:17:00.879 Power State #0: 00:17:00.879 Max Power: 0.00 W 00:17:00.879 Non-Operational State: Operational 00:17:00.879 Entry Latency: Not Reported 00:17:00.879 Exit Latency: Not Reported 00:17:00.879 Relative Read Throughput: 0 00:17:00.879 Relative Read Latency: 0 00:17:00.879 Relative Write Throughput: 0 00:17:00.879 Relative Write Latency: 0 00:17:00.879 Idle Power: Not Reported 00:17:00.879 Active Power: Not Reported 00:17:00.879 Non-Operational Permissive Mode: Not Supported 00:17:00.879 00:17:00.879 Health Information 00:17:00.879 ================== 00:17:00.879 Critical Warnings: 00:17:00.879 Available Spare Space: OK 00:17:00.879 Temperature: OK 00:17:00.879 Device Reliability: OK 00:17:00.879 Read Only: No 00:17:00.879 Volatile Memory Backup: OK 00:17:00.879 Current Temperature: 0 Kelvin (-273 Celsius) 00:17:00.879 Temperature Threshold: 0 Kelvin (-273 Celsius) 00:17:00.879 Available Spare: 0% 00:17:00.879 Available Spare Threshold: 0% 00:17:00.879 Life Percentage Used:[2024-09-27 15:22:02.536047] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0c00 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536056] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES ERROR_RECOVERY cid:7 cdw10:00000005 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536074] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536080] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:7 cdw0:0 sqhd:0018 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536086] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf998 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536114] nvme_ctrlr.c:4386:nvme_ctrlr_destruct_async: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] Prepare to destruct SSD 00:17:00.879 [2024-09-27 15:22:02.536124] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 13230 doesn't match qid 00:17:00.879 [2024-09-27 15:22:02.536139] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32696 cdw0:5 sqhd:09b0 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536145] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 13230 doesn't match qid 
00:17:00.879 [2024-09-27 15:22:02.536154] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32696 cdw0:5 sqhd:09b0 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536162] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 13230 doesn't match qid 00:17:00.879 [2024-09-27 15:22:02.536170] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32696 cdw0:5 sqhd:09b0 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536176] nvme_qpair.c: 471:spdk_nvme_print_completion: *ERROR*: sqid 13230 doesn't match qid 00:17:00.879 [2024-09-27 15:22:02.536184] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32696 cdw0:5 sqhd:09b0 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536193] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0840 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536201] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:4 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536218] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536224] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:4 cdw0:460001 sqhd:0019 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536232] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536240] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY SET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536246] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9c0 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536265] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536271] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:0 sqhd:001a p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536278] nvme_ctrlr.c:1147:nvme_ctrlr_shutdown_set_cc_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] RTD3E = 0 us 00:17:00.879 [2024-09-27 15:22:02.536283] nvme_ctrlr.c:1150:nvme_ctrlr_shutdown_set_cc_done: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] shutdown timeout = 10000 ms 00:17:00.879 [2024-09-27 15:22:02.536290] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9e8 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536300] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536308] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536325] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536331] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001b p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536337] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa10 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536354] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 
15:22:02.536362] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536378] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536384] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001c p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536392] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa38 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536402] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536409] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536429] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536435] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001d p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536444] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa60 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536453] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536460] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536475] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536481] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001e p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536488] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa88 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536498] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536507] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536522] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536527] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001f p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536534] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfab0 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536543] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536553] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536567] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536574] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0000 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536581] 
nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536590] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536598] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536617] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536623] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0001 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536630] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536640] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536648] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536668] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.879 [2024-09-27 15:22:02.536674] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0002 p:0 m:0 dnr:0 00:17:00.879 [2024-09-27 15:22:02.536680] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536689] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.879 [2024-09-27 15:22:02.536697] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.879 [2024-09-27 15:22:02.536715] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.536723] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0003 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.536730] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536739] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536747] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.536763] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.536769] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0004 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.536775] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536784] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536792] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.536814] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.536820] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0005 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.536826] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536835] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536843] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.536863] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.536868] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0006 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.536875] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536884] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536891] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.536910] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.536915] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0007 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.536922] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536931] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536938] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.536955] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.536960] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0008 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.536967] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf740 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536976] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.536984] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.536998] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537005] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0009 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537012] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf768 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537020] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 
0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537028] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537042] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537048] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000a p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537054] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf790 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537063] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537071] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537093] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537099] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000b p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537105] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7b8 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537114] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537122] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537144] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537150] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000c p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537156] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf7e0 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537165] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537173] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537193] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537198] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000d p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537205] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf808 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537214] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537221] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537242] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537247] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000e p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 
15:22:02.537254] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf830 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537262] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537270] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537290] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537295] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:000f p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537302] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf858 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537311] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537319] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537337] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.880 [2024-09-27 15:22:02.537347] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0010 p:0 m:0 dnr:0 00:17:00.880 [2024-09-27 15:22:02.537353] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf880 length 0x10 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537362] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.880 [2024-09-27 15:22:02.537370] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.880 [2024-09-27 15:22:02.537388] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537394] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0011 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537400] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8a8 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537409] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537417] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537435] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537450] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0012 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537457] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8d0 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537466] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537474] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537492] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537498] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0013 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537504] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf8f8 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537513] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537521] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537541] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537547] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0014 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537553] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf920 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537562] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537570] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537589] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537595] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0015 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537602] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf948 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537610] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537618] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537636] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537642] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0016 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537648] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf970 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537657] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537665] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537687] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537692] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0017 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537699] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf998 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537708] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 
0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537715] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537731] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537737] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0018 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537743] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9c0 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537752] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537760] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537776] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537782] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0019 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537788] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf9e8 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537797] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537805] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537823] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537828] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001a p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537835] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa10 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537844] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537853] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537875] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537880] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001b p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537887] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa38 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537896] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537904] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537923] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537929] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001c p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 
15:22:02.537935] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa60 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537944] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537952] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.537968] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.537974] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001d p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.537980] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfa88 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537989] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.537997] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.538013] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.538018] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001e p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.538025] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cfab0 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.538034] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.538041] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.538063] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.538069] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:001f p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.538075] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf600 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.538084] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.538092] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.538112] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.538117] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0000 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.538124] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf628 length 0x10 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.538133] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.881 [2024-09-27 15:22:02.538142] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.881 [2024-09-27 15:22:02.538160] 
nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.881 [2024-09-27 15:22:02.538166] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0001 p:0 m:0 dnr:0 00:17:00.881 [2024-09-27 15:22:02.538172] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf650 length 0x10 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538181] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538188] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.882 [2024-09-27 15:22:02.538210] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.882 [2024-09-27 15:22:02.538216] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0002 p:0 m:0 dnr:0 00:17:00.882 [2024-09-27 15:22:02.538222] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf678 length 0x10 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538231] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538239] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.882 [2024-09-27 15:22:02.538255] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.882 [2024-09-27 15:22:02.538260] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0003 p:0 m:0 dnr:0 00:17:00.882 [2024-09-27 15:22:02.538267] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6a0 length 0x10 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538276] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538284] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.882 [2024-09-27 15:22:02.538302] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.882 [2024-09-27 15:22:02.538307] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0004 p:0 m:0 dnr:0 00:17:00.882 [2024-09-27 15:22:02.538314] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6c8 length 0x10 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538322] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 0x40 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.538330] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.882 [2024-09-27 15:22:02.542347] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.882 [2024-09-27 15:22:02.542354] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:1 sqhd:0005 p:0 m:0 dnr:0 00:17:00.882 [2024-09-27 15:22:02.542361] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf6f0 length 0x10 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.542370] nvme_rdma.c:2293:nvme_rdma_qpair_submit_request: *DEBUG*: local addr 0x2000003d0700 length 
0x40 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.542378] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC PROPERTY GET qid:0 cid:3 SGL KEYED DATA BLOCK ADDRESS 0x0 len:0x0 key:0x0 00:17:00.882 [2024-09-27 15:22:02.542396] nvme_rdma.c:2496:nvme_rdma_process_recv_completion: *DEBUG*: CQ recv completion 00:17:00.882 [2024-09-27 15:22:02.542402] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: SUCCESS (00/00) qid:0 cid:3 cdw0:9 sqhd:0006 p:0 m:0 dnr:0 00:17:00.882 [2024-09-27 15:22:02.542408] nvme_rdma.c:2389:nvme_rdma_request_ready: *DEBUG*: local addr 0x2000003cf718 length 0x10 lkey 0x1bf200 00:17:00.882 [2024-09-27 15:22:02.542417] nvme_ctrlr.c:1269:nvme_ctrlr_shutdown_poll_async: *DEBUG*: [nqn.2016-06.io.spdk:cnode1] shutdown complete in 6 milliseconds 00:17:00.882 0% 00:17:00.882 Data Units Read: 0 00:17:00.882 Data Units Written: 0 00:17:00.882 Host Read Commands: 0 00:17:00.882 Host Write Commands: 0 00:17:00.882 Controller Busy Time: 0 minutes 00:17:00.882 Power Cycles: 0 00:17:00.882 Power On Hours: 0 hours 00:17:00.882 Unsafe Shutdowns: 0 00:17:00.882 Unrecoverable Media Errors: 0 00:17:00.882 Lifetime Error Log Entries: 0 00:17:00.882 Warning Temperature Time: 0 minutes 00:17:00.882 Critical Temperature Time: 0 minutes 00:17:00.882 00:17:00.882 Number of Queues 00:17:00.882 ================ 00:17:00.882 Number of I/O Submission Queues: 127 00:17:00.882 Number of I/O Completion Queues: 127 00:17:00.882 00:17:00.882 Active Namespaces 00:17:00.882 ================= 00:17:00.882 Namespace ID:1 00:17:00.882 Error Recovery Timeout: Unlimited 00:17:00.882 Command Set Identifier: NVM (00h) 00:17:00.882 Deallocate: Supported 00:17:00.882 Deallocated/Unwritten Error: Not Supported 00:17:00.882 Deallocated Read Value: Unknown 00:17:00.882 Deallocate in Write Zeroes: Not Supported 00:17:00.882 Deallocated Guard Field: 0xFFFF 00:17:00.882 Flush: Supported 00:17:00.882 Reservation: Supported 00:17:00.882 Namespace Sharing Capabilities: Multiple Controllers 00:17:00.882 Size (in LBAs): 131072 (0GiB) 00:17:00.882 Capacity (in LBAs): 131072 (0GiB) 00:17:00.882 Utilization (in LBAs): 131072 (0GiB) 00:17:00.882 NGUID: ABCDEF0123456789ABCDEF0123456789 00:17:00.882 EUI64: ABCDEF0123456789 00:17:00.882 UUID: f61756c0-8c56-4812-a540-a6f138fd37d8 00:17:00.882 Thin Provisioning: Not Supported 00:17:00.882 Per-NS Atomic Units: Yes 00:17:00.882 Atomic Boundary Size (Normal): 0 00:17:00.882 Atomic Boundary Size (PFail): 0 00:17:00.882 Atomic Boundary Offset: 0 00:17:00.882 Maximum Single Source Range Length: 65535 00:17:00.882 Maximum Copy Length: 65535 00:17:00.882 Maximum Source Range Count: 1 00:17:00.882 NGUID/EUI64 Never Reused: No 00:17:00.882 Namespace Write Protected: No 00:17:00.882 Number of LBA Formats: 1 00:17:00.882 Current LBA Format: LBA Format #00 00:17:00.882 LBA Format #00: Data Size: 512 Metadata Size: 0 00:17:00.882 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@51 -- # sync 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@52 -- # rpc_cmd nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@561 -- # xtrace_disable 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@54 -- # trap - SIGINT SIGTERM EXIT 
00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- host/identify.sh@56 -- # nvmftestfini 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@331 -- # nvmfcleanup 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@99 -- # sync 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@102 -- # set +e 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@103 -- # for i in {1..20} 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:17:00.882 rmmod nvme_rdma 00:17:00.882 rmmod nvme_fabrics 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@106 -- # set -e 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@107 -- # return 0 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@332 -- # '[' -n 1818377 ']' 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@333 -- # killprocess 1818377 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@950 -- # '[' -z 1818377 ']' 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@954 -- # kill -0 1818377 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@955 -- # uname 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1818377 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1818377' 00:17:00.882 killing process with pid 1818377 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@969 -- # kill 1818377 00:17:00.882 15:22:02 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@974 -- # wait 1818377 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@338 -- # nvmf_fini 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@264 -- # local dev 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@267 -- # remove_target_ns 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@22 -- # _remove_target_ns 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@268 -- # delete_main_bridge 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:17:01.452 15:22:03 
nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@130 -- # return 0 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@41 -- # _dev=0 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@41 -- # dev_map=() 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/setup.sh@284 -- # iptr 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@538 -- # iptables-save 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- nvmf/common.sh@538 -- # iptables-restore 00:17:01.452 00:17:01.452 real 0m9.254s 00:17:01.452 user 0m8.857s 00:17:01.452 sys 0m5.934s 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_identify -- common/autotest_common.sh@10 -- # set +x 00:17:01.452 ************************************ 00:17:01.452 END TEST nvmf_identify 00:17:01.452 ************************************ 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@21 -- # run_test nvmf_perf /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/perf.sh --transport=rdma 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:17:01.452 ************************************ 00:17:01.452 START TEST nvmf_perf 00:17:01.452 ************************************ 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1125 
-- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/perf.sh --transport=rdma 00:17:01.452 * Looking for test storage... 00:17:01.452 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1681 -- # lcov --version 00:17:01.452 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@333 -- # local ver1 ver1_l 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@334 -- # local ver2 ver2_l 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@336 -- # IFS=.-: 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@336 -- # read -ra ver1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@337 -- # IFS=.-: 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@337 -- # read -ra ver2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@338 -- # local 'op=<' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@340 -- # ver1_l=2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@341 -- # ver2_l=1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@344 -- # case "$op" in 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@345 -- # : 1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@364 -- # (( v = 0 )) 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@365 -- # decimal 1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@353 -- # local d=1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@355 -- # echo 1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@365 -- # ver1[v]=1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@366 -- # decimal 2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@353 -- # local d=2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@355 -- # echo 2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@366 -- # ver2[v]=2 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@368 -- # return 0 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:17:01.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:01.713 --rc genhtml_branch_coverage=1 00:17:01.713 --rc genhtml_function_coverage=1 00:17:01.713 --rc genhtml_legend=1 00:17:01.713 --rc geninfo_all_blocks=1 00:17:01.713 --rc geninfo_unexecuted_blocks=1 00:17:01.713 00:17:01.713 ' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:17:01.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:01.713 --rc genhtml_branch_coverage=1 00:17:01.713 --rc genhtml_function_coverage=1 00:17:01.713 --rc genhtml_legend=1 00:17:01.713 --rc geninfo_all_blocks=1 00:17:01.713 --rc geninfo_unexecuted_blocks=1 00:17:01.713 00:17:01.713 ' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:17:01.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:01.713 --rc genhtml_branch_coverage=1 00:17:01.713 --rc genhtml_function_coverage=1 00:17:01.713 --rc genhtml_legend=1 00:17:01.713 --rc geninfo_all_blocks=1 00:17:01.713 --rc geninfo_unexecuted_blocks=1 00:17:01.713 00:17:01.713 ' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:17:01.713 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:01.713 --rc genhtml_branch_coverage=1 00:17:01.713 --rc genhtml_function_coverage=1 00:17:01.713 --rc genhtml_legend=1 00:17:01.713 --rc geninfo_all_blocks=1 00:17:01.713 --rc geninfo_unexecuted_blocks=1 00:17:01.713 00:17:01.713 ' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@10 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@7 -- # uname -s 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:17:01.713 15:22:03 
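The lcov walk above is scripts/common.sh comparing version "1.15" against "2" field by field to decide whether the old lcov 1.x coverage flags are needed. A simplified reconstruction of that comparison (the real cmp_versions handles more operators; this only covers the less-than case exercised here):

# sketch (not from the log): field-wise "version A < version B" in plain bash
version_lt() {
    local -a ver1 ver2
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$2"
    local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < max; v++ )); do
        (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
        (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
    done
    return 1   # equal versions are not "less than"
}
version_lt "$(lcov --version | awk '{print $NF}')" 2 && echo "lcov 1.x: enable the --rc lcov_* flags"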
nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@15 -- # shopt -s extglob 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- paths/export.sh@4 -- # 
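The host identity used throughout the run is derived once here: nvme gen-hostnqn produces the UUID-based host NQN, and the host ID is the UUID portion of that string. One way to reproduce the pair seen in the trace (the exact extraction the script uses may differ):

# sketch (not from the log): deriving the hostnqn/hostid pair used for every connect in this run
NVME_HOSTNQN=$(nvme gen-hostnqn)   # nqn.2014-08.org.nvmexpress:uuid:<uuid>
NVME_HOSTID=${NVME_HOSTNQN##*:}    # keep only the trailing <uuid>
NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")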
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- paths/export.sh@5 -- # export PATH 00:17:01.713 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@50 -- # : 0 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:17:01.714 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@54 -- # have_pci_nics=0 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@12 -- # MALLOC_BDEV_SIZE=64 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@13 -- # MALLOC_BLOCK_SIZE=512 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@15 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@17 -- # nvmftestinit 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@290 -- # trap 
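The "integer expression expected" complaint above is not fatal: nvmf/common.sh line 31 runs a numeric test ('[' '' -eq 1 ']') against a variable that is empty in this configuration, test prints the error and returns false, and build_nvmf_app_args simply takes the other branch. A null-safe form of the same pattern, with a placeholder variable name since the log does not show which variable is empty:

# sketch (not from the log): why line 31 complains, and a quieter equivalent ("some_flag" is a placeholder)
some_flag=""
[ "$some_flag" -eq 1 ]        # "[: : integer expression expected", non-zero status, script continues
[ "${some_flag:-0}" -eq 1 ]   # defaulting the empty value to 0 keeps the comparison numeric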
nvmftestfini SIGINT SIGTERM EXIT 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@292 -- # prepare_net_devs 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@254 -- # local -g is_hw=no 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@256 -- # remove_target_ns 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@22 -- # _remove_target_ns 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@125 -- # xtrace_disable 00:17:01.714 15:22:03 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@10 -- # set +x 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@131 -- # pci_devs=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@131 -- # local -a pci_devs 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@132 -- # pci_net_devs=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@133 -- # pci_drivers=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@133 -- # local -A pci_drivers 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@135 -- # net_devs=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@135 -- # local -ga net_devs 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@136 -- # e810=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@136 -- # local -ga e810 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@137 -- # x722=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@137 -- # local -ga x722 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@138 -- # mlx=() 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@138 -- # local -ga mlx 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:17:08.293 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- 
nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:17:08.294 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:17:08.294 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@239 -- # 
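From the device-ID tables the harness has found two mlx5 functions (0x15b3 - 0x1015) at 0000:18:00.0 and 0000:18:00.1, and it now resolves each one to its kernel netdev by globbing sysfs, which is where the "Found net devices under ..." lines come from. The lookup itself is just:

# sketch (not from the log): PCI function -> netdev resolution as traced above (addresses from the log)
for pci in 0000:18:00.0 0000:18:00.1; do
    for netdir in /sys/bus/pci/devices/$pci/net/*; do
        [ -e "$netdir" ] || continue
        echo "Found net devices under $pci: ${netdir##*/}"   # mlx_0_0 and mlx_0_1 here
    done
done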
pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:17:08.294 Found net devices under 0000:18:00.0: mlx_0_0 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:17:08.294 Found net devices under 0000:18:00.1: mlx_0_1 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@249 -- # get_rdma_if_list 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@75 -- # rdma_devs=() 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@89 -- # continue 2 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@89 -- # continue 2 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:17:08.294 15:22:10 
nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@258 -- # is_hw=yes 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:17:08.294 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@61 -- # uname 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@65 -- # modprobe ib_cm 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@66 -- # modprobe ib_core 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@67 -- # modprobe ib_umad 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@69 -- # modprobe iw_cm 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@27 -- # local -gA dev_map 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@28 -- # local -g _dev 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:08.554 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@44 -- # ips=() 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@58 -- # key_initiator=target1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 
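With the RDMA-capable ports identified, load_ib_rdma_modules pulls in the kernel IB/RDMA stack before any addressing is attempted; the modprobe lines above correspond to:

# sketch (not from the log): the module set loaded for an mlx5 RDMA test bed
for m in ib_cm ib_core ib_umad ib_uverbs iw_cm rdma_cm rdma_ucm; do
    modprobe "$m"
done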
00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@11 -- # local val=167772161 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:17:08.555 10.0.0.1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@11 -- # local val=167772162 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:17:08.555 10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@76 -- # 
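setup_interface_pair then puts one /24 address on each physical port and mirrors it into the port's ifalias file, which is how later helpers such as get_ip_address read the address back with a plain cat. The equivalent by hand, using the values from this run:

# sketch (not from the log): per-port addressing as performed above
ip addr add 10.0.0.1/24 dev mlx_0_0
echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias
ip addr add 10.0.0.2/24 dev mlx_0_1
echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
ip link set mlx_0_0 up
ip link set mlx_0_1 up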
set_up mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@38 -- # ping_ips 1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@107 -- # local dev=target0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:17:08.555 
15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:08.555 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:08.555 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.031 ms 00:17:08.555 00:17:08.555 --- 10.0.0.2 ping statistics --- 00:17:08.555 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:08.555 rtt min/avg/max/mdev = 0.031/0.031/0.031/0.000 ms 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@107 -- # local dev=target0 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:08.555 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:17:08.555 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.033 ms 00:17:08.555 00:17:08.555 --- 10.0.0.2 ping statistics --- 00:17:08.555 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:08.555 rtt min/avg/max/mdev = 0.033/0.033/0.033/0.000 ms 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@98 -- # (( pair++ )) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:08.555 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@266 -- # return 0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@107 -- # local dev=target0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@107 -- # local 
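Both reachability probes above land on 10.0.0.2 because, with the rdma transport, the harness maps the "initiator" key to the target-side helpers (key_initiator=target1 earlier in the trace); each probe is a single ICMP echo of the ifalias address before any NVMe-oF traffic is attempted. Reduced to its essentials:

# sketch (not from the log): the reachability check behind the ping output above
ping -c 1 10.0.0.2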
dev=target1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@107 -- # local dev=target0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@107 -- # local dev=target1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- 
nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:17:08.556 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@18 -- # nvmfappstart -m 0xF 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@724 -- # xtrace_disable 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@10 -- # set +x 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@324 -- # nvmfpid=1821523 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@325 -- # waitforlisten 1821523 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@831 -- # '[' -z 1821523 ']' 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:08.815 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:08.815 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@10 -- # set +x 00:17:08.815 [2024-09-27 15:22:10.484952] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
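The address bookkeeping above ends with the legacy environment the test scripts still consume, after which the initiator-side nvme-rdma module is loaded. The values derived for this run are:

# sketch (not from the log text itself, but the values the trace above exports)
NVMF_FIRST_INITIATOR_IP=10.0.0.2
NVMF_SECOND_INITIATOR_IP=10.0.0.1
NVMF_FIRST_TARGET_IP=10.0.0.2
NVMF_SECOND_TARGET_IP=10.0.0.1
NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024'
modprobe nvme-rdma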
00:17:08.815 [2024-09-27 15:22:10.485007] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:08.815 [2024-09-27 15:22:10.554718] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:17:08.815 [2024-09-27 15:22:10.642614] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:17:08.815 [2024-09-27 15:22:10.642661] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:17:08.815 [2024-09-27 15:22:10.642671] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:17:08.815 [2024-09-27 15:22:10.642680] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:17:08.815 [2024-09-27 15:22:10.642687] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:17:08.815 [2024-09-27 15:22:10.642751] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:17:08.815 [2024-09-27 15:22:10.642861] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:17:08.815 [2024-09-27 15:22:10.642962] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:08.815 [2024-09-27 15:22:10.642964] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@864 -- # return 0 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@730 -- # xtrace_disable 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@10 -- # set +x 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/gen_nvme.sh 00:17:09.074 15:22:10 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:17:09.643 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@30 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py framework_get_config bdev 00:17:09.643 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@30 -- # jq -r '.[].params | select(.name=="Nvme0").traddr' 00:17:09.902 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@30 -- # local_nvme_trid=0000:5e:00.0 00:17:09.902 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 00:17:10.161 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@31 -- # bdevs=' Malloc0' 00:17:10.161 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@33 -- # '[' -n 0000:5e:00.0 ']' 00:17:10.161 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@34 -- # bdevs=' Malloc0 Nvme0n1' 00:17:10.161 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@37 -- # '[' rdma == rdma ']' 00:17:10.161 15:22:11 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@40 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 
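nvmfappstart launched the target above with a four-core mask and blocked in waitforlisten until the RPC socket answered; the EAL banner and the four "Reactor started" notices are nvmf_tgt coming up on cores 0-3. A minimal stand-in for that start/wait pair (run from the SPDK tree; waitforlisten is the harness helper, the polling loop below only approximates it):

# sketch (not from the log): start the target and wait for its RPC socket
./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
nvmfpid=$!
until ./scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null; do
    sleep 0.5   # target not yet listening on /var/tmp/spdk.sock
done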
nvmf_create_transport -t rdma --num-shared-buffers 1024 -c 0 00:17:10.161 [2024-09-27 15:22:11.932538] rdma.c:2734:nvmf_rdma_create: *WARNING*: In capsule data size is set to 256, this is minimum size required to support msdbd=16 00:17:10.161 [2024-09-27 15:22:11.953637] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x703e30/0x5d9aa0) succeed. 00:17:10.161 [2024-09-27 15:22:11.964259] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x705290/0x659580) succeed. 00:17:10.421 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@44 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:17:10.680 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@45 -- # for bdev in $bdevs 00:17:10.680 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:17:10.680 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@45 -- # for bdev in $bdevs 00:17:10.680 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@46 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Nvme0n1 00:17:10.939 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:17:11.198 [2024-09-27 15:22:12.882105] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:17:11.198 15:22:12 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@49 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:17:11.457 15:22:13 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@52 -- # '[' -n 0000:5e:00.0 ']' 00:17:11.457 15:22:13 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@53 -- # perf_app -i 0 -q 32 -o 4096 -w randrw -M 50 -t 1 -r 'trtype:PCIe traddr:0000:5e:00.0' 00:17:11.457 15:22:13 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@21 -- # '[' 0 -eq 1 ']' 00:17:11.457 15:22:13 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -i 0 -q 32 -o 4096 -w randrw -M 50 -t 1 -r 'trtype:PCIe traddr:0000:5e:00.0' 00:17:12.836 Initializing NVMe Controllers 00:17:12.836 Attached to NVMe Controller at 0000:5e:00.0 [144d:a80a] 00:17:12.836 Associating PCIE (0000:5e:00.0) NSID 1 with lcore 0 00:17:12.836 Initialization complete. Launching workers. 
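At this point perf.sh has built the whole target over RPC: a 64 MiB malloc bdev plus the local NVMe drive (attached earlier through gen_nvme.sh at 0000:5e:00.0) become namespaces of one subsystem, the RDMA transport is created (the requested in-capsule size of 0 gets bumped to the 256-byte minimum, hence the WARNING), and data plus discovery listeners are opened on 10.0.0.2:4420. The same sequence issued by hand, with every subcommand and argument taken from the trace:

# sketch assembled from the rpc.py calls traced above
rpc=./scripts/rpc.py
$rpc bdev_malloc_create 64 512                                     # -> Malloc0
$rpc nvmf_create_transport -t rdma --num-shared-buffers 1024 -c 0  # in-capsule size raised to 256 by the target
$rpc nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
$rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0
$rpc nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Nvme0n1
$rpc nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
$rpc nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420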
00:17:12.836 ======================================================== 00:17:12.836 Latency(us) 00:17:12.836 Device Information : IOPS MiB/s Average min max 00:17:12.836 PCIE (0000:5e:00.0) NSID 1 from core 0: 94954.03 370.91 336.50 45.69 4975.06 00:17:12.836 ======================================================== 00:17:12.836 Total : 94954.03 370.91 336.50 45.69 4975.06 00:17:12.836 00:17:12.836 15:22:14 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@56 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 1 -o 4096 -w randrw -M 50 -t 1 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' 00:17:16.146 Initializing NVMe Controllers 00:17:16.146 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:17:16.146 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0 00:17:16.146 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 with lcore 0 00:17:16.146 Initialization complete. Launching workers. 00:17:16.146 ======================================================== 00:17:16.146 Latency(us) 00:17:16.146 Device Information : IOPS MiB/s Average min max 00:17:16.146 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 6788.54 26.52 147.07 49.11 4091.88 00:17:16.146 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 from core 0: 5076.71 19.83 195.97 71.89 4108.98 00:17:16.146 ======================================================== 00:17:16.146 Total : 11865.25 46.35 167.99 49.11 4108.98 00:17:16.146 00:17:16.146 15:22:17 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 32 -o 4096 -w randrw -M 50 -t 1 -HI -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' 00:17:19.436 Initializing NVMe Controllers 00:17:19.436 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:17:19.436 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0 00:17:19.436 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 with lcore 0 00:17:19.436 Initialization complete. Launching workers. 00:17:19.436 ======================================================== 00:17:19.436 Latency(us) 00:17:19.436 Device Information : IOPS MiB/s Average min max 00:17:19.436 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 18086.98 70.65 1769.00 502.80 5605.07 00:17:19.436 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 from core 0: 4032.00 15.75 7979.35 5974.14 9882.58 00:17:19.436 ======================================================== 00:17:19.436 Total : 22118.98 86.40 2901.06 502.80 9882.58 00:17:19.436 00:17:19.436 15:22:21 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@59 -- # [[ mlx5 == \e\8\1\0 ]] 00:17:19.436 15:22:21 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 128 -o 262144 -O 16384 -w randrw -M 50 -t 2 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' 00:17:23.637 Initializing NVMe Controllers 00:17:23.637 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:17:23.637 Controller IO queue size 128, less than required. 00:17:23.637 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:17:23.637 Controller IO queue size 128, less than required. 
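The baseline run came from spdk_nvme_perf talking straight to the local PCIe controller (144d:a80a at 0000:5e:00.0); the fabric runs above point the same tool at the RDMA listener instead, which is the only change needed to measure the NVMe-oF path. Reproducing the queue-depth-1 fabric run by hand, flags as traced:

# sketch (not from the log): the first fabric-side perf invocation
./build/bin/spdk_nvme_perf -q 1 -o 4096 -w randrw -M 50 -t 1 \
    -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420'
# -q queue depth, -o IO size (bytes), -w access pattern, -M read percentage of the mix, -t seconds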
00:17:23.637 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:17:23.637 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0 00:17:23.637 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 with lcore 0 00:17:23.637 Initialization complete. Launching workers. 00:17:23.637 ======================================================== 00:17:23.637 Latency(us) 00:17:23.637 Device Information : IOPS MiB/s Average min max 00:17:23.637 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 3903.58 975.90 32845.96 15488.25 70302.24 00:17:23.637 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 from core 0: 3986.58 996.64 31814.77 15433.14 51623.31 00:17:23.637 ======================================================== 00:17:23.637 Total : 7890.16 1972.54 32324.94 15433.14 70302.24 00:17:23.637 00:17:23.896 15:22:25 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 128 -o 36964 -O 4096 -w randrw -M 50 -t 5 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' -c 0xf -P 4 00:17:24.155 No valid NVMe controllers or AIO or URING devices found 00:17:24.155 Initializing NVMe Controllers 00:17:24.155 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:17:24.155 Controller IO queue size 128, less than required. 00:17:24.155 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:17:24.155 WARNING: IO size 36964 (-o) is not a multiple of nsid 1 sector size 512. Removing this ns from test 00:17:24.155 Controller IO queue size 128, less than required. 00:17:24.155 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:17:24.155 WARNING: IO size 36964 (-o) is not a multiple of nsid 2 sector size 512. Removing this ns from test 00:17:24.155 WARNING: Some requested NVMe devices were skipped 00:17:24.155 15:22:25 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@65 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_perf -q 128 -o 262144 -w randrw -M 50 -t 2 -r 'trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420' --transport-stat 00:17:28.422 Initializing NVMe Controllers 00:17:28.422 Attached to NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:cnode1 00:17:28.422 Controller IO queue size 128, less than required. 00:17:28.422 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:17:28.422 Controller IO queue size 128, less than required. 00:17:28.422 Consider using lower queue depth or smaller IO size, because IO requests may be queued at the NVMe driver. 00:17:28.422 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 with lcore 0 00:17:28.422 Associating RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 with lcore 0 00:17:28.422 Initialization complete. Launching workers. 
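The run above with -o 36964 is expected to end in "No valid NVMe controllers": 36964 bytes is not a multiple of either namespace's 512-byte sector, so both namespaces are dropped from the test, which is exactly what the two WARNING lines report. The condition behind them is a plain alignment check:

# sketch (not from the log): the IO-size alignment condition the warnings above describe
io_size=36964 sector_size=512
(( io_size % sector_size == 0 )) || echo "IO size $io_size is not a multiple of sector size $sector_size; skipping namespace"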
00:17:28.422 00:17:28.422 ==================== 00:17:28.422 lcore 0, ns RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 statistics: 00:17:28.422 RDMA transport: 00:17:28.422 dev name: mlx5_1 00:17:28.422 polls: 394828 00:17:28.422 idle_polls: 391624 00:17:28.422 completions: 43118 00:17:28.422 queued_requests: 1 00:17:28.422 total_send_wrs: 21559 00:17:28.422 send_doorbell_updates: 2931 00:17:28.422 total_recv_wrs: 21686 00:17:28.422 recv_doorbell_updates: 2936 00:17:28.422 --------------------------------- 00:17:28.422 00:17:28.422 ==================== 00:17:28.422 lcore 0, ns RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 statistics: 00:17:28.422 RDMA transport: 00:17:28.422 dev name: mlx5_1 00:17:28.422 polls: 398481 00:17:28.422 idle_polls: 398213 00:17:28.422 completions: 19778 00:17:28.422 queued_requests: 1 00:17:28.422 total_send_wrs: 9889 00:17:28.422 send_doorbell_updates: 253 00:17:28.422 total_recv_wrs: 10016 00:17:28.422 recv_doorbell_updates: 254 00:17:28.422 --------------------------------- 00:17:28.422 ======================================================== 00:17:28.422 Latency(us) 00:17:28.422 Device Information : IOPS MiB/s Average min max 00:17:28.422 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 1 from core 0: 5389.50 1347.37 23785.72 11468.00 58567.17 00:17:28.422 RDMA (addr:10.0.0.2 subnqn:nqn.2016-06.io.spdk:cnode1) NSID 2 from core 0: 2472.00 618.00 51728.05 29763.38 75907.68 00:17:28.422 ======================================================== 00:17:28.422 Total : 7861.50 1965.37 32572.02 11468.00 75907.68 00:17:28.422 00:17:28.422 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@66 -- # sync 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@67 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@69 -- # '[' 0 -eq 1 ']' 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@112 -- # trap - SIGINT SIGTERM EXIT 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- host/perf.sh@114 -- # nvmftestfini 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@331 -- # nvmfcleanup 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@99 -- # sync 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@102 -- # set +e 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@103 -- # for i in {1..20} 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:17:28.682 rmmod nvme_rdma 00:17:28.682 rmmod nvme_fabrics 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@106 -- # set -e 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@107 -- # return 0 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@332 -- # '[' -n 1821523 ']' 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@333 -- # killprocess 1821523 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@950 -- # '[' -z 1821523 ']' 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- 
common/autotest_common.sh@954 -- # kill -0 1821523 00:17:28.682 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@955 -- # uname 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1821523 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1821523' 00:17:28.942 killing process with pid 1821523 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@969 -- # kill 1821523 00:17:28.942 15:22:30 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@974 -- # wait 1821523 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@338 -- # nvmf_fini 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@264 -- # local dev 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@267 -- # remove_target_ns 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@22 -- # _remove_target_ns 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@268 -- # delete_main_bridge 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@130 -- # return 0 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 
00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@41 -- # _dev=0 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@41 -- # dev_map=() 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/setup.sh@284 -- # iptr 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@538 -- # iptables-save 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- nvmf/common.sh@538 -- # iptables-restore 00:17:31.481 00:17:31.481 real 0m29.603s 00:17:31.481 user 1m31.137s 00:17:31.481 sys 0m6.751s 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_perf -- common/autotest_common.sh@10 -- # set +x 00:17:31.481 ************************************ 00:17:31.481 END TEST nvmf_perf 00:17:31.481 ************************************ 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@22 -- # run_test nvmf_fio_host /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/fio.sh --transport=rdma 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:17:31.481 ************************************ 00:17:31.481 START TEST nvmf_fio_host 00:17:31.481 ************************************ 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/fio.sh --transport=rdma 00:17:31.481 * Looking for test storage... 
00:17:31.481 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1681 -- # lcov --version 00:17:31.481 15:22:32 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@333 -- # local ver1 ver1_l 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@334 -- # local ver2 ver2_l 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@336 -- # IFS=.-: 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@336 -- # read -ra ver1 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@337 -- # IFS=.-: 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@337 -- # read -ra ver2 00:17:31.481 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@338 -- # local 'op=<' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@340 -- # ver1_l=2 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@341 -- # ver2_l=1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@344 -- # case "$op" in 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@345 -- # : 1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@364 -- # (( v = 0 )) 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@365 -- # decimal 1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@353 -- # local d=1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@355 -- # echo 1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@365 -- # ver1[v]=1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@366 -- # decimal 2 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@353 -- # local d=2 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@355 -- # echo 2 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@366 -- # ver2[v]=2 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@368 -- # return 0 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:17:31.482 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:31.482 --rc genhtml_branch_coverage=1 00:17:31.482 --rc genhtml_function_coverage=1 00:17:31.482 --rc genhtml_legend=1 00:17:31.482 --rc geninfo_all_blocks=1 00:17:31.482 --rc geninfo_unexecuted_blocks=1 00:17:31.482 00:17:31.482 ' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:17:31.482 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:31.482 --rc genhtml_branch_coverage=1 00:17:31.482 --rc genhtml_function_coverage=1 00:17:31.482 --rc genhtml_legend=1 00:17:31.482 --rc geninfo_all_blocks=1 00:17:31.482 --rc geninfo_unexecuted_blocks=1 00:17:31.482 00:17:31.482 ' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:17:31.482 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:31.482 --rc genhtml_branch_coverage=1 00:17:31.482 --rc genhtml_function_coverage=1 00:17:31.482 --rc genhtml_legend=1 00:17:31.482 --rc geninfo_all_blocks=1 00:17:31.482 --rc geninfo_unexecuted_blocks=1 00:17:31.482 00:17:31.482 ' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:17:31.482 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:31.482 --rc genhtml_branch_coverage=1 00:17:31.482 --rc genhtml_function_coverage=1 00:17:31.482 --rc genhtml_legend=1 00:17:31.482 --rc geninfo_all_blocks=1 00:17:31.482 --rc geninfo_unexecuted_blocks=1 00:17:31.482 00:17:31.482 ' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@15 -- # shopt -s extglob 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:17:31.482 15:22:33 
nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@5 -- # export PATH 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@10 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@7 -- # uname -s 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@10 -- # 
NVMF_SECOND_PORT=4421 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@15 -- # shopt -s extglob 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.482 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@5 -- # export PATH 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@50 -- # : 0 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:17:31.483 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@54 -- # have_pci_nics=0 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@12 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@14 -- # nvmftestinit 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@290 -- # 
trap nvmftestfini SIGINT SIGTERM EXIT 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@292 -- # prepare_net_devs 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@254 -- # local -g is_hw=no 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@256 -- # remove_target_ns 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@22 -- # _remove_target_ns 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@125 -- # xtrace_disable 00:17:31.483 15:22:33 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@10 -- # set +x 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@131 -- # pci_devs=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@131 -- # local -a pci_devs 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@132 -- # pci_net_devs=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@133 -- # pci_drivers=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@133 -- # local -A pci_drivers 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@135 -- # net_devs=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@135 -- # local -ga net_devs 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@136 -- # e810=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@136 -- # local -ga e810 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@137 -- # x722=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@137 -- # local -ga x722 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@138 -- # mlx=() 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@138 -- # local -ga mlx 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- 
nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:17:38.097 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:17:38.097 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:38.097 15:22:39 
nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:17:38.097 Found net devices under 0000:18:00.0: mlx_0_0 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:38.097 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:17:38.097 Found net devices under 0000:18:00.1: mlx_0_1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@249 -- # get_rdma_if_list 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@75 -- # rdma_devs=() 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@89 -- # continue 2 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- 
nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@89 -- # continue 2 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@258 -- # is_hw=yes 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@61 -- # uname 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@65 -- # modprobe ib_cm 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@66 -- # modprobe ib_core 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@67 -- # modprobe ib_umad 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@69 -- # modprobe iw_cm 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@27 -- # local -gA dev_map 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@28 -- # local -g _dev 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@44 -- # ips=() 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@49 -- # ips=("$ip" 
$((++ip))) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@58 -- # key_initiator=target1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@11 -- # local val=167772161 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:17:38.098 10.0.0.1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@11 -- # local val=167772162 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@210 -- # tee 
/sys/class/net/mlx_0_1/ifalias 00:17:38.098 10.0.0.2 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@38 -- # ping_ips 1 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:17:38.098 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@107 -- # local dev=target0 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # eval ' cat 
/sys/class/net/mlx_0_1/ifalias' 00:17:38.099 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:38.358 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:38.358 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:17:38.358 00:17:38.358 --- 10.0.0.2 ping statistics --- 00:17:38.358 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:38.358 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@107 -- # local dev=target0 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:38.358 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:38.359 15:22:39 
nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:38.359 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:38.359 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.024 ms 00:17:38.359 00:17:38.359 --- 10.0.0.2 ping statistics --- 00:17:38.359 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:38.359 rtt min/avg/max/mdev = 0.024/0.024/0.024/0.000 ms 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@98 -- # (( pair++ )) 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@266 -- # return 0 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@107 -- # local dev=target0 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:38.359 15:22:39 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:38.359 
15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@107 -- # local dev=target1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@107 -- # local dev=target0 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:38.359 15:22:40 
nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@107 -- # local dev=target1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@16 -- # [[ y != y ]] 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@21 -- # timing_enter start_nvmf_tgt 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@724 -- # xtrace_disable 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@10 -- # set +x 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@24 -- # nvmfpid=1827465 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@23 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@26 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@28 -- # waitforlisten 1827465 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@831 -- # '[' -z 1827465 ']' 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:38.359 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:38.359 15:22:40 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@10 -- # set +x 00:17:38.359 [2024-09-27 15:22:40.158637] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:17:38.359 [2024-09-27 15:22:40.158705] [ DPDK EAL parameters: nvmf -c 0xF --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:38.620 [2024-09-27 15:22:40.243190] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:17:38.620 [2024-09-27 15:22:40.336933] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:17:38.620 [2024-09-27 15:22:40.336978] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:17:38.620 [2024-09-27 15:22:40.336988] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:17:38.620 [2024-09-27 15:22:40.336997] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:17:38.620 [2024-09-27 15:22:40.337004] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:17:38.620 [2024-09-27 15:22:40.337072] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:17:38.620 [2024-09-27 15:22:40.337172] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:17:38.620 [2024-09-27 15:22:40.337277] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:17:38.620 [2024-09-27 15:22:40.337278] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:17:39.189 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:39.189 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@864 -- # return 0 00:17:39.189 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@29 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:17:39.449 [2024-09-27 15:22:41.208321] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1d734a0/0x1d77990) succeed. 00:17:39.449 [2024-09-27 15:22:41.218958] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1d74ae0/0x1db9030) succeed. 
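Condensed from the surrounding records, the target bring-up that host/fio.sh drives over rpc.py amounts to roughly the sketch below. It is assembled only from commands visible in this log (the nvmf_tgt start at fio.sh@23, the transport creation at fio.sh@29, and the Malloc1 subsystem steps in the records that follow); the 10.0.0.2 listener address is the one read back from mlx_0_1's ifalias earlier, and the SPDK path is the workspace path as logged.

  SPDK=/var/jenkins/workspace/nvmf-phy-autotest/spdk
  RPC=$SPDK/scripts/rpc.py
  # start the target and wait for /var/tmp/spdk.sock to accept RPCs (fio.sh@23, waitforlisten)
  $SPDK/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xF &
  # RDMA transport with the options shown in the log (fio.sh@29)
  $RPC nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192
  # 64 MB malloc bdev with 512-byte blocks, exported through cnode1 on 10.0.0.2:4420 (fio.sh@32..36)
  $RPC bdev_malloc_create 64 512 -b Malloc1
  $RPC nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
  $RPC nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1
  $RPC nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420
  $RPC nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420

The fio runs that follow then reach this listener through the SPDK plugin, passing --filename='trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' with build/fio/spdk_nvme in LD_PRELOAD, as the next records show.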
00:17:39.708 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@30 -- # timing_exit start_nvmf_tgt 00:17:39.708 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@730 -- # xtrace_disable 00:17:39.708 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@10 -- # set +x 00:17:39.708 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@32 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 -b Malloc1 00:17:39.968 Malloc1 00:17:39.968 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@33 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:17:40.227 15:22:41 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@34 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc1 00:17:40.227 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@35 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:17:40.487 [2024-09-27 15:22:42.216015] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:17:40.487 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener discovery -t rdma -a 10.0.0.2 -s 4420 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@38 -- # PLUGIN_DIR=/var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@41 -- # fio_nvme /var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme/example_config.fio '--filename=trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' --bs=4096 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1360 -- # fio_plugin /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme /var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme/example_config.fio '--filename=trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' --bs=4096 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1339 -- # local sanitizers 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1341 -- # shift 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1343 -- # local asan_lib= 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # grep libasan 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- 
common/autotest_common.sh@1345 -- # asan_lib= 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # asan_lib= 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme' 00:17:40.747 15:22:42 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio /var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme/example_config.fio '--filename=trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' --bs=4096 00:17:41.007 test: (g=0): rw=randrw, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk, iodepth=128 00:17:41.007 fio-3.35 00:17:41.007 Starting 1 thread 00:17:43.542 00:17:43.542 test: (groupid=0, jobs=1): err= 0: pid=1827985: Fri Sep 27 15:22:45 2024 00:17:43.542 read: IOPS=17.6k, BW=68.6MiB/s (72.0MB/s)(138MiB/2004msec) 00:17:43.542 slat (nsec): min=1379, max=34918, avg=1495.04, stdev=454.62 00:17:43.542 clat (usec): min=2154, max=6542, avg=3616.42, stdev=81.94 00:17:43.542 lat (usec): min=2170, max=6543, avg=3617.92, stdev=81.84 00:17:43.542 clat percentiles (usec): 00:17:43.542 | 1.00th=[ 3556], 5.00th=[ 3589], 10.00th=[ 3589], 20.00th=[ 3589], 00:17:43.542 | 30.00th=[ 3621], 40.00th=[ 3621], 50.00th=[ 3621], 60.00th=[ 3621], 00:17:43.542 | 70.00th=[ 3621], 80.00th=[ 3621], 90.00th=[ 3621], 95.00th=[ 3654], 00:17:43.543 | 99.00th=[ 3720], 99.50th=[ 3916], 99.90th=[ 4293], 99.95th=[ 5604], 00:17:43.543 | 99.99th=[ 6456] 00:17:43.543 bw ( KiB/s): min=68856, max=70904, per=100.00%, avg=70290.00, stdev=961.27, samples=4 00:17:43.543 iops : min=17214, max=17726, avg=17572.50, stdev=240.32, samples=4 00:17:43.543 write: IOPS=17.6k, BW=68.7MiB/s (72.0MB/s)(138MiB/2004msec); 0 zone resets 00:17:43.543 slat (nsec): min=1413, max=18559, avg=1826.89, stdev=467.49 00:17:43.543 clat (usec): min=2172, max=6554, avg=3615.86, stdev=94.73 00:17:43.543 lat (usec): min=2183, max=6556, avg=3617.68, stdev=94.65 00:17:43.543 clat percentiles (usec): 00:17:43.543 | 1.00th=[ 3556], 5.00th=[ 3589], 10.00th=[ 3589], 20.00th=[ 3589], 00:17:43.543 | 30.00th=[ 3621], 40.00th=[ 3621], 50.00th=[ 3621], 60.00th=[ 3621], 00:17:43.543 | 70.00th=[ 3621], 80.00th=[ 3621], 90.00th=[ 3621], 95.00th=[ 3654], 00:17:43.543 | 99.00th=[ 3720], 99.50th=[ 3949], 99.90th=[ 4752], 99.95th=[ 6063], 00:17:43.543 | 99.99th=[ 6521] 00:17:43.543 bw ( KiB/s): min=68880, max=70832, per=100.00%, avg=70322.00, stdev=961.70, samples=4 00:17:43.543 iops : min=17220, max=17708, avg=17580.50, stdev=240.42, samples=4 00:17:43.543 lat (msec) : 4=99.84%, 10=0.16% 00:17:43.543 cpu : usr=99.40%, sys=0.15%, ctx=16, majf=0, minf=3 00:17:43.543 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.9% 
00:17:43.543 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:43.543 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:17:43.543 issued rwts: total=35203,35230,0,0 short=0,0,0,0 dropped=0,0,0,0 00:17:43.543 latency : target=0, window=0, percentile=100.00%, depth=128 00:17:43.543 00:17:43.543 Run status group 0 (all jobs): 00:17:43.543 READ: bw=68.6MiB/s (72.0MB/s), 68.6MiB/s-68.6MiB/s (72.0MB/s-72.0MB/s), io=138MiB (144MB), run=2004-2004msec 00:17:43.543 WRITE: bw=68.7MiB/s (72.0MB/s), 68.7MiB/s-68.7MiB/s (72.0MB/s-72.0MB/s), io=138MiB (144MB), run=2004-2004msec 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@45 -- # fio_nvme /var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme/mock_sgl_config.fio '--filename=trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1360 -- # fio_plugin /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme /var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme/mock_sgl_config.fio '--filename=trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1339 -- # local sanitizers 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1341 -- # shift 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1343 -- # local asan_lib= 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # grep libasan 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # asan_lib= 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1345 -- # asan_lib= 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/fio/spdk_nvme' 00:17:43.543 15:22:45 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio /var/jenkins/workspace/nvmf-phy-autotest/spdk/app/fio/nvme/mock_sgl_config.fio '--filename=trtype=rdma adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 ns=1' 00:17:43.800 test: (g=0): rw=randrw, bs=(R) 16.0KiB-16.0KiB, (W) 16.0KiB-16.0KiB, (T) 16.0KiB-16.0KiB, ioengine=spdk, iodepth=128 00:17:43.800 fio-3.35 00:17:43.800 Starting 1 thread 00:17:46.323 00:17:46.323 test: (groupid=0, jobs=1): err= 0: pid=1828396: Fri Sep 27 15:22:47 2024 00:17:46.323 read: IOPS=14.3k, BW=224MiB/s (235MB/s)(444MiB/1981msec) 00:17:46.323 slat (nsec): min=2299, max=51950, avg=2649.00, stdev=1186.48 00:17:46.323 clat (usec): min=511, max=8701, avg=1556.28, stdev=1205.24 00:17:46.323 lat (usec): min=514, max=8718, avg=1558.93, stdev=1205.62 00:17:46.323 clat percentiles (usec): 00:17:46.323 | 1.00th=[ 701], 5.00th=[ 799], 10.00th=[ 848], 20.00th=[ 930], 00:17:46.323 | 30.00th=[ 1004], 40.00th=[ 1090], 50.00th=[ 1188], 60.00th=[ 1303], 00:17:46.323 | 70.00th=[ 1434], 80.00th=[ 1598], 90.00th=[ 2147], 95.00th=[ 5014], 00:17:46.323 | 99.00th=[ 6325], 99.50th=[ 7046], 99.90th=[ 8455], 99.95th=[ 8586], 00:17:46.323 | 99.99th=[ 8586] 00:17:46.323 bw ( KiB/s): min=110720, max=116832, per=49.40%, avg=113288.00, stdev=2848.94, samples=4 00:17:46.323 iops : min= 6920, max= 7302, avg=7080.50, stdev=178.06, samples=4 00:17:46.323 write: IOPS=8030, BW=125MiB/s (132MB/s)(230MiB/1835msec); 0 zone resets 00:17:46.323 slat (usec): min=26, max=122, avg=30.57, stdev= 6.63 00:17:46.323 clat (usec): min=4284, max=20009, avg=12939.81, stdev=1994.90 00:17:46.323 lat (usec): min=4313, max=20037, avg=12970.38, stdev=1994.37 00:17:46.323 clat percentiles (usec): 00:17:46.323 | 1.00th=[ 7046], 5.00th=[10028], 10.00th=[10552], 20.00th=[11338], 00:17:46.323 | 30.00th=[11994], 40.00th=[12518], 50.00th=[12911], 60.00th=[13304], 00:17:46.323 | 70.00th=[13829], 80.00th=[14484], 90.00th=[15401], 95.00th=[16188], 00:17:46.323 | 99.00th=[18482], 99.50th=[18744], 99.90th=[19530], 99.95th=[19792], 00:17:46.323 | 99.99th=[19792] 00:17:46.323 bw ( KiB/s): min=113536, max=121984, per=91.11%, avg=117064.00, stdev=3703.57, samples=4 00:17:46.323 iops : min= 7096, max= 7624, avg=7316.50, stdev=231.47, samples=4 00:17:46.323 lat (usec) : 750=1.64%, 1000=17.89% 00:17:46.323 lat (msec) : 2=39.26%, 4=2.00%, 10=6.89%, 20=32.33%, 50=0.01% 00:17:46.323 cpu : usr=97.06%, sys=1.25%, ctx=183, majf=0, minf=2 00:17:46.323 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.2%, 16=0.3%, 32=0.7%, >=64=98.7% 00:17:46.323 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:46.323 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:17:46.323 issued rwts: total=28392,14736,0,0 short=0,0,0,0 dropped=0,0,0,0 00:17:46.323 latency : target=0, window=0, percentile=100.00%, depth=128 00:17:46.323 00:17:46.323 Run status group 0 (all jobs): 00:17:46.323 READ: bw=224MiB/s (235MB/s), 224MiB/s-224MiB/s (235MB/s-235MB/s), io=444MiB (465MB), run=1981-1981msec 00:17:46.323 WRITE: bw=125MiB/s (132MB/s), 125MiB/s-125MiB/s (132MB/s-132MB/s), io=230MiB (241MB), run=1835-1835msec 00:17:46.323 15:22:47 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@47 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@49 -- # '[' 0 -eq 1 ']' 00:17:46.323 15:22:48 
nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@85 -- # rm -f ./local-test-0-verify.state 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- host/fio.sh@86 -- # nvmftestfini 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@331 -- # nvmfcleanup 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@99 -- # sync 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@102 -- # set +e 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@103 -- # for i in {1..20} 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:17:46.323 rmmod nvme_rdma 00:17:46.323 rmmod nvme_fabrics 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@106 -- # set -e 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@107 -- # return 0 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@332 -- # '[' -n 1827465 ']' 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@333 -- # killprocess 1827465 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@950 -- # '[' -z 1827465 ']' 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@954 -- # kill -0 1827465 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@955 -- # uname 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1827465 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1827465' 00:17:46.323 killing process with pid 1827465 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@969 -- # kill 1827465 00:17:46.323 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@974 -- # wait 1827465 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@338 -- # nvmf_fini 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@264 -- # local dev 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@267 -- # remove_target_ns 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@22 -- # _remove_target_ns 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- 
nvmf/setup.sh@268 -- # delete_main_bridge 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@130 -- # return 0 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@41 -- # _dev=0 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@41 -- # dev_map=() 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/setup.sh@284 -- # iptr 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@538 -- # iptables-save 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- nvmf/common.sh@538 -- # iptables-restore 00:17:46.890 00:17:46.890 real 0m15.650s 00:17:46.890 user 0m44.971s 00:17:46.890 sys 0m6.510s 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_fio_host -- common/autotest_common.sh@10 -- # set +x 00:17:46.890 ************************************ 00:17:46.890 END TEST nvmf_fio_host 00:17:46.890 ************************************ 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@23 -- # run_test nvmf_failover /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/failover.sh --transport=rdma 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:17:46.890 
************************************ 00:17:46.890 START TEST nvmf_failover 00:17:46.890 ************************************ 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/failover.sh --transport=rdma 00:17:46.890 * Looking for test storage... 00:17:46.890 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1681 -- # lcov --version 00:17:46.890 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@333 -- # local ver1 ver1_l 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@334 -- # local ver2 ver2_l 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@336 -- # IFS=.-: 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@336 -- # read -ra ver1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@337 -- # IFS=.-: 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@337 -- # read -ra ver2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@338 -- # local 'op=<' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@340 -- # ver1_l=2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@341 -- # ver2_l=1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@344 -- # case "$op" in 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@345 -- # : 1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@364 -- # (( v = 0 )) 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@365 -- # decimal 1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@353 -- # local d=1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@355 -- # echo 1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@365 -- # ver1[v]=1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@366 -- # decimal 2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@353 -- # local d=2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@355 -- # echo 2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@366 -- # ver2[v]=2 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@368 -- # return 0 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:17:47.150 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:47.150 --rc genhtml_branch_coverage=1 00:17:47.150 --rc genhtml_function_coverage=1 00:17:47.150 --rc genhtml_legend=1 00:17:47.150 --rc geninfo_all_blocks=1 00:17:47.150 --rc geninfo_unexecuted_blocks=1 00:17:47.150 00:17:47.150 ' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:17:47.150 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:47.150 --rc genhtml_branch_coverage=1 00:17:47.150 --rc genhtml_function_coverage=1 00:17:47.150 --rc genhtml_legend=1 00:17:47.150 --rc geninfo_all_blocks=1 00:17:47.150 --rc geninfo_unexecuted_blocks=1 00:17:47.150 00:17:47.150 ' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:17:47.150 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:47.150 --rc genhtml_branch_coverage=1 00:17:47.150 --rc genhtml_function_coverage=1 00:17:47.150 --rc genhtml_legend=1 00:17:47.150 --rc geninfo_all_blocks=1 00:17:47.150 --rc geninfo_unexecuted_blocks=1 00:17:47.150 00:17:47.150 ' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:17:47.150 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:47.150 --rc genhtml_branch_coverage=1 00:17:47.150 --rc genhtml_function_coverage=1 00:17:47.150 --rc genhtml_legend=1 00:17:47.150 --rc geninfo_all_blocks=1 00:17:47.150 --rc geninfo_unexecuted_blocks=1 00:17:47.150 00:17:47.150 ' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@9 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@7 -- # uname -s 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:17:47.150 15:22:48 
nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@15 -- # shopt -s extglob 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:17:47.150 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- paths/export.sh@5 -- # export PATH 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@50 -- # : 0 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:17:47.151 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@54 -- # have_pci_nics=0 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@11 -- # MALLOC_BDEV_SIZE=64 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@12 -- # MALLOC_BLOCK_SIZE=512 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@14 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@16 -- # bdevperf_rpc_sock=/var/tmp/bdevperf.sock 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- 
host/failover.sh@18 -- # nvmftestinit 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@292 -- # prepare_net_devs 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@254 -- # local -g is_hw=no 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@256 -- # remove_target_ns 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@22 -- # _remove_target_ns 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@125 -- # xtrace_disable 00:17:47.151 15:22:48 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@131 -- # pci_devs=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@131 -- # local -a pci_devs 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@132 -- # pci_net_devs=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@133 -- # pci_drivers=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@133 -- # local -A pci_drivers 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@135 -- # net_devs=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@135 -- # local -ga net_devs 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@136 -- # e810=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@136 -- # local -ga e810 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@137 -- # x722=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@137 -- # local -ga x722 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@138 -- # mlx=() 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@138 -- # local -ga mlx 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:17:53.724 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@150 -- # 
mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:17:53.725 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:17:53.725 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:17:53.725 15:22:55 
nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:17:53.725 Found net devices under 0000:18:00.0: mlx_0_0 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:17:53.725 Found net devices under 0000:18:00.1: mlx_0_1 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@249 -- # get_rdma_if_list 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@75 -- # rdma_devs=() 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@89 -- # continue 2 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- 
nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@89 -- # continue 2 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@258 -- # is_hw=yes 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@61 -- # uname 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@65 -- # modprobe ib_cm 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@66 -- # modprobe ib_core 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@67 -- # modprobe ib_umad 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@69 -- # modprobe iw_cm 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@27 -- # local -gA dev_map 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@28 -- # local -g _dev 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@44 -- # ips=() 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 
_ns= 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@58 -- # key_initiator=target1 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:17:53.725 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@11 -- # local val=167772161 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:17:53.726 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:17:53.986 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:17:53.986 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:17:53.986 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:17:53.986 10.0.0.1 00:17:53.986 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@11 -- # local val=167772162 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@210 -- # eval 'echo 
10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:17:53.987 10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@38 -- # ping_ips 1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@107 -- # local dev=target0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@110 
-- # echo mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:53.987 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:53.987 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.033 ms 00:17:53.987 00:17:53.987 --- 10.0.0.2 ping statistics --- 00:17:53.987 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:53.987 rtt min/avg/max/mdev = 0.033/0.033/0.033/0.000 ms 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@107 -- # local dev=target0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:17:53.987 15:22:55 
nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:17:53.987 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:17:53.987 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:17:53.987 00:17:53.987 --- 10.0.0.2 ping statistics --- 00:17:53.987 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:17:53.987 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@98 -- # (( pair++ )) 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@266 -- # return 0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@107 -- # local dev=target0 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:17:53.987 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:53.988 15:22:55 
nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@107 -- # local dev=target1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # get_net_dev target0 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@107 -- # local dev=target0 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:17:53.988 15:22:55 
nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # get_net_dev target1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@107 -- # local dev=target1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@20 -- # nvmfappstart -m 0xE 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@724 -- # xtrace_disable 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@324 -- # nvmfpid=1831775 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@325 -- # waitforlisten 1831775 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@831 -- # '[' -z 1831775 ']' 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:17:53.988 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:53.988 15:22:55 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:17:54.247 [2024-09-27 15:22:55.850290] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:17:54.247 [2024-09-27 15:22:55.850359] [ DPDK EAL parameters: nvmf -c 0xE --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:54.247 [2024-09-27 15:22:55.938899] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:17:54.247 [2024-09-27 15:22:56.023993] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:17:54.247 [2024-09-27 15:22:56.024036] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:17:54.247 [2024-09-27 15:22:56.024046] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:17:54.247 [2024-09-27 15:22:56.024054] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:17:54.247 [2024-09-27 15:22:56.024061] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:17:54.247 [2024-09-27 15:22:56.024144] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:17:54.247 [2024-09-27 15:22:56.024163] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:17:54.247 [2024-09-27 15:22:56.024165] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@864 -- # return 0 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@730 -- # xtrace_disable 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:17:55.181 15:22:56 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@22 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:17:55.181 [2024-09-27 15:22:56.953920] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1fdac10/0x1fdf100) succeed. 00:17:55.181 [2024-09-27 15:22:56.964679] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1fdc1b0/0x20207a0) succeed. 
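The trace up to this point is the RDMA fixture: the ib_*/rdma_* kernel modules are loaded, the two ConnectX ports mlx_0_0/mlx_0_1 are addressed from the 0x0a000001 pool (val_to_ip renders 167772161 as 10.0.0.1 via printf) and pinged, the legacy NVMF_FIRST_TARGET_IP/NVMF_SECOND_TARGET_IP variables are derived from the ifalias files, and nvmf_tgt is started on core mask 0xE with an rdma transport. A minimal stand-alone sketch of the same bring-up, kept to commands that actually appear in the trace (the absolute Jenkins paths are shortened to the repo root, and the octet split is plain shell arithmetic added here for illustration; the helper precomputes the octets before the printf shown above):

    # val_to_ip: pool value 167772161 (0x0a000001) -> dotted quad 10.0.0.1
    v=167772161
    printf '%u.%u.%u.%u\n' $((v >> 24 & 255)) $((v >> 16 & 255)) $((v >> 8 & 255)) $((v & 255))

    # address and bring up the two physical ports, as setup_interface_pair does
    ip addr add 10.0.0.1/24 dev mlx_0_0; echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias
    ip addr add 10.0.0.2/24 dev mlx_0_1; echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
    ip link set mlx_0_0 up
    ip link set mlx_0_1 up

    # start the target and create the RDMA transport with the same flags as the trace
    ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0xE &
    ./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192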
00:17:55.439 15:22:57 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@23 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 -b Malloc0 00:17:55.439 Malloc0 00:17:55.697 15:22:57 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 00:17:55.697 15:22:57 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@25 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:17:55.955 15:22:57 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@26 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:17:56.213 [2024-09-27 15:22:57.873159] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:17:56.213 15:22:57 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@27 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 00:17:56.471 [2024-09-27 15:22:58.069575] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4421 *** 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@28 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4422 00:17:56.471 [2024-09-27 15:22:58.270300] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4422 *** 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@31 -- # bdevperf_pid=1832044 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@30 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -z -r /var/tmp/bdevperf.sock -q 128 -o 4096 -w verify -t 15 -f 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@33 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; cat $testdir/try.txt; rm -f $testdir/try.txt; killprocess $bdevperf_pid; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@34 -- # waitforlisten 1832044 /var/tmp/bdevperf.sock 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@831 -- # '[' -z 1832044 ']' 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:17:56.471 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 
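By the end of the block above the target side of host/failover.sh is in place: a 64 MB Malloc0 bdev (512-byte blocks) backs subsystem nqn.2016-06.io.spdk:cnode1, which listens over rdma on 10.0.0.2 ports 4420, 4421 and 4422, and bdevperf has been launched in wait-for-RPC mode against /var/tmp/bdevperf.sock. Condensed to the RPCs visible in the trace (paths shortened; the loop is added here only to avoid repeating the listener call three times):

    # namespace and subsystem (host/failover.sh@23-@25)
    ./scripts/rpc.py bdev_malloc_create 64 512 -b Malloc0
    ./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001
    ./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0

    # three rdma listeners on the same address (@26-@28); the test removes/re-adds these later
    for port in 4420 4421 4422; do
        ./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 \
            -t rdma -a 10.0.0.2 -s $port
    done

    # bdevperf (@30-@31): wait for RPC (-z), queue depth 128, 4 KiB verify workload, 15 s
    ./build/examples/bdevperf -z -r /var/tmp/bdevperf.sock -q 128 -o 4096 -w verify -t 15 -f &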
00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:56.471 15:22:58 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:17:57.405 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:57.405 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@864 -- # return 0 00:17:57.405 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@35 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:17:57.662 NVMe0n1 00:17:57.663 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4421 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:17:57.920 00:17:57.920 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@39 -- # run_test_pid=1832269 00:17:57.920 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@38 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests 00:17:57.920 15:22:59 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@41 -- # sleep 1 00:17:59.293 15:23:00 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@43 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:17:59.293 15:23:00 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@45 -- # sleep 3 00:18:02.574 15:23:03 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@47 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4422 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:02.574 00:18:02.574 15:23:04 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@48 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 00:18:02.902 15:23:04 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@50 -- # sleep 3 00:18:06.183 15:23:07 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@53 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:18:06.183 [2024-09-27 15:23:07.653281] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:18:06.183 15:23:07 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@55 -- # sleep 1 00:18:07.117 15:23:08 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_remove_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4422 00:18:07.117 15:23:08 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@59 -- # wait 1832269 00:18:13.680 { 00:18:13.680 "results": [ 00:18:13.680 { 00:18:13.680 "job": "NVMe0n1", 00:18:13.680 "core_mask": "0x1", 00:18:13.680 "workload": "verify", 00:18:13.680 "status": "finished", 00:18:13.680 "verify_range": { 00:18:13.680 "start": 0, 00:18:13.680 "length": 16384 00:18:13.680 }, 00:18:13.680 "queue_depth": 128, 00:18:13.680 "io_size": 4096, 00:18:13.680 "runtime": 15.005287, 00:18:13.680 "iops": 14221.054219089578, 
00:18:13.680 "mibps": 55.55099304331866, 00:18:13.680 "io_failed": 4645, 00:18:13.680 "io_timeout": 0, 00:18:13.680 "avg_latency_us": 8786.288154409283, 00:18:13.680 "min_latency_us": 336.58434782608697, 00:18:13.680 "max_latency_us": 1021221.8434782609 00:18:13.680 } 00:18:13.680 ], 00:18:13.680 "core_count": 1 00:18:13.680 } 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@61 -- # killprocess 1832044 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@950 -- # '[' -z 1832044 ']' 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@954 -- # kill -0 1832044 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@955 -- # uname 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1832044 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1832044' 00:18:13.680 killing process with pid 1832044 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@969 -- # kill 1832044 00:18:13.680 15:23:14 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@974 -- # wait 1832044 00:18:13.680 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@63 -- # cat /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt 00:18:13.680 [2024-09-27 15:22:58.349822] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:18:13.680 [2024-09-27 15:22:58.349885] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1832044 ] 00:18:13.680 [2024-09-27 15:22:58.436933] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:13.680 [2024-09-27 15:22:58.519374] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:13.680 Running I/O for 15 seconds... 
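The failover choreography itself is what host/failover.sh@35-@59 traced above: bdevperf attaches NVMe0 over the 4420 and 4421 paths, perform_tests kicks off the 15 s verify run, and the script then removes and re-adds listeners under load so I/O is forced to migrate between the three ports; the summary above reports ~14221 IOPS with io_failed=4645 for the whole run. A condensed sketch of that sequence, using only the commands and sleeps shown in the trace ($R/$S/$N are shorthand introduced here, not variables from the script; paths shortened):

    R=./scripts/rpc.py; S=/var/tmp/bdevperf.sock; N=nqn.2016-06.io.spdk:cnode1

    $R -s $S bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n $N
    $R -s $S bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4421 -f ipv4 -n $N
    ./examples/bdev/bdevperf/bdevperf.py -s $S perform_tests &    # 15 s verify run in background

    sleep 1; $R nvmf_subsystem_remove_listener $N -t rdma -a 10.0.0.2 -s 4420    # drop first path
    sleep 3; $R -s $S bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4422 -f ipv4 -n $N
    $R nvmf_subsystem_remove_listener $N -t rdma -a 10.0.0.2 -s 4421             # drop second path
    sleep 3; $R nvmf_subsystem_add_listener $N -t rdma -a 10.0.0.2 -s 4420       # restore 4420
    sleep 1; $R nvmf_subsystem_remove_listener $N -t rdma -a 10.0.0.2 -s 4422    # drop third path
    wait    # perform_tests returns once the 15 s run completes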
00:18:13.681 17858.00 IOPS, 69.76 MiB/s 9792.00 IOPS, 38.25 MiB/s [2024-09-27 15:23:01.950162] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:106 nsid:1 lba:26120 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950204] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950226] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:98 nsid:1 lba:26128 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950236] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950249] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:49 nsid:1 lba:26136 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950259] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950269] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:93 nsid:1 lba:26144 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950279] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950289] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:22 nsid:1 lba:26152 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950299] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950311] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:26160 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950320] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950331] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:109 nsid:1 lba:26168 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950345] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950355] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:90 nsid:1 lba:26176 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950365] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950377] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:26184 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950386] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950397] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:6 nsid:1 lba:26192 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950406] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - 
SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950417] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:26200 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950434] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950446] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:118 nsid:1 lba:26208 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950456] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950467] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:26216 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950477] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950490] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:86 nsid:1 lba:26224 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950501] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950512] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:73 nsid:1 lba:26232 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950522] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950532] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:55 nsid:1 lba:26240 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950542] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950553] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:26248 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950563] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950575] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:53 nsid:1 lba:26256 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950584] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950599] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:114 nsid:1 lba:26264 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950609] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950620] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:102 nsid:1 lba:26272 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950631] nvme_qpair.c: 474:spdk_nvme_print_completion: 
*NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950642] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:115 nsid:1 lba:26280 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950653] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950664] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:117 nsid:1 lba:26288 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950675] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950686] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:89 nsid:1 lba:26296 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950698] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950711] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:30 nsid:1 lba:26304 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950722] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950733] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:85 nsid:1 lba:26312 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950742] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950753] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:16 nsid:1 lba:26320 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950762] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950773] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:96 nsid:1 lba:26328 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950783] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950793] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:124 nsid:1 lba:26336 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950803] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950814] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:91 nsid:1 lba:26344 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950823] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950834] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:81 nsid:1 lba:26352 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950844] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950854] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:104 nsid:1 lba:26360 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950863] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950874] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:26368 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950883] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950894] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:87 nsid:1 lba:26376 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.681 [2024-09-27 15:23:01.950906] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.681 [2024-09-27 15:23:01.950917] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:74 nsid:1 lba:26384 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.950927] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.950937] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:42 nsid:1 lba:26392 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.950946] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.950957] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:101 nsid:1 lba:26400 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.950968] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.950979] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:35 nsid:1 lba:26408 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.950988] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.950999] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:36 nsid:1 lba:26416 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951008] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951019] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:17 nsid:1 lba:26424 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951028] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951039] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:82 nsid:1 lba:26432 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 
15:23:01.951048] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951058] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:71 nsid:1 lba:26440 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951067] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951078] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:48 nsid:1 lba:26448 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951087] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951098] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:5 nsid:1 lba:26456 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951107] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951117] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:126 nsid:1 lba:26464 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951128] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951139] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:80 nsid:1 lba:26472 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951148] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951159] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:94 nsid:1 lba:26480 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951168] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951178] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:72 nsid:1 lba:26488 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951188] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951199] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:10 nsid:1 lba:26496 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951210] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951220] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:63 nsid:1 lba:26504 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951232] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951245] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:75 nsid:1 lba:26512 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:18:13.682 [2024-09-27 15:23:01.951255] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951265] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:51 nsid:1 lba:26520 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951275] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951285] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:110 nsid:1 lba:26528 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951295] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951305] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:76 nsid:1 lba:26536 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951316] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951327] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:65 nsid:1 lba:26544 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951336] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951351] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:34 nsid:1 lba:26552 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951360] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951370] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:68 nsid:1 lba:26560 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951379] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951390] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:25 nsid:1 lba:26568 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951400] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951410] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:46 nsid:1 lba:26576 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951420] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951430] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:26584 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951439] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951449] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:70 nsid:1 lba:26592 len:8 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951459] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951472] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:61 nsid:1 lba:26600 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951481] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951492] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:123 nsid:1 lba:26608 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951500] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951511] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:33 nsid:1 lba:26616 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.682 [2024-09-27 15:23:01.951521] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951533] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:119 nsid:1 lba:25600 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fe000 len:0x1000 key:0x1c2600 00:18:13.682 [2024-09-27 15:23:01.951543] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951554] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:9 nsid:1 lba:25608 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fc000 len:0x1000 key:0x1c2600 00:18:13.682 [2024-09-27 15:23:01.951564] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951574] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:77 nsid:1 lba:25616 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fa000 len:0x1000 key:0x1c2600 00:18:13.682 [2024-09-27 15:23:01.951584] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.682 [2024-09-27 15:23:01.951596] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:79 nsid:1 lba:25624 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f8000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951606] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951617] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:8 nsid:1 lba:25632 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f6000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951627] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951637] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:120 nsid:1 lba:25640 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f4000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951647] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 
cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951659] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:12 nsid:1 lba:25648 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f2000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951668] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951679] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:25656 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f0000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951688] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951699] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:25664 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ee000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951710] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951721] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:1 lba:25672 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ec000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951731] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951741] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:113 nsid:1 lba:25680 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ea000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951750] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951761] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:122 nsid:1 lba:25688 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e8000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951771] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951782] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:112 nsid:1 lba:25696 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e6000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951792] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951802] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:25704 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e4000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951811] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951822] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:32 nsid:1 lba:25712 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e2000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951831] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 
15:23:01.951843] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:54 nsid:1 lba:25720 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e0000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951852] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951863] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:1 lba:25728 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075de000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951873] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951884] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:27 nsid:1 lba:25736 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075dc000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951894] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951905] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:47 nsid:1 lba:25744 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075da000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951915] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951925] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:25752 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d8000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951935] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951947] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:25760 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d6000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951956] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951967] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:23 nsid:1 lba:25768 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d4000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951977] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.951988] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:1 lba:25776 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d2000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.951998] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952008] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:83 nsid:1 lba:25784 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d0000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952018] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952029] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:116 nsid:1 lba:25792 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ce000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952038] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952049] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:25800 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075cc000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952059] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952069] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:121 nsid:1 lba:25808 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ca000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952079] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952090] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:103 nsid:1 lba:25816 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c8000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952100] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952111] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:69 nsid:1 lba:25824 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c6000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952120] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952131] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:92 nsid:1 lba:25832 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c4000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952141] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952154] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:25840 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c2000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952163] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952174] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:25848 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c0000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952185] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952195] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:25856 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075be000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952206] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952219] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:107 nsid:1 lba:25864 len:8 SGL KEYED DATA BLOCK 
ADDRESS 0x2000075bc000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952229] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952239] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:97 nsid:1 lba:25872 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ba000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952249] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.683 [2024-09-27 15:23:01.952259] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:25880 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b8000 len:0x1000 key:0x1c2600 00:18:13.683 [2024-09-27 15:23:01.952269] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952281] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:26 nsid:1 lba:25888 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b6000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952291] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952302] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:25896 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b4000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952311] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952321] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:67 nsid:1 lba:25904 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b2000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952332] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952346] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:25912 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b0000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952356] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952367] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:100 nsid:1 lba:25920 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ae000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952377] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952388] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:25928 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ac000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952397] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952409] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:78 nsid:1 lba:25936 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075aa000 len:0x1000 key:0x1c2600 00:18:13.684 
[2024-09-27 15:23:01.952420] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952431] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:28 nsid:1 lba:25944 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a8000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952440] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952451] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:56 nsid:1 lba:25952 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a6000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952460] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952472] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:24 nsid:1 lba:25960 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a4000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952482] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952493] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:60 nsid:1 lba:25968 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a2000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952502] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952513] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:62 nsid:1 lba:25976 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a0000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952522] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952533] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:20 nsid:1 lba:25984 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759e000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952543] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952555] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:66 nsid:1 lba:25992 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759c000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952564] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952575] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:21 nsid:1 lba:26000 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759a000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952584] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952596] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:7 nsid:1 lba:26008 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007598000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952606] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952617] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:26016 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007596000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952626] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952637] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:84 nsid:1 lba:26024 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007594000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952646] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952659] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:108 nsid:1 lba:26032 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007592000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952669] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952680] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:125 nsid:1 lba:26040 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007590000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952689] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952699] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:15 nsid:1 lba:26048 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758e000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952709] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952720] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:26056 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758c000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952730] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952740] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:26064 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758a000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952750] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952760] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:111 nsid:1 lba:26072 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007588000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952769] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952781] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:99 nsid:1 lba:26080 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007586000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952791] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION 
(00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952802] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:88 nsid:1 lba:26088 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007584000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952811] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952821] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:105 nsid:1 lba:26096 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007582000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952831] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.952842] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:1 lba:26104 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007580000 len:0x1000 key:0x1c2600 00:18:13.684 [2024-09-27 15:23:01.952852] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.954698] nvme_qpair.c: 579:nvme_qpair_abort_queued_reqs: *ERROR*: aborting queued i/o 00:18:13.684 [2024-09-27 15:23:01.954714] nvme_qpair.c: 558:nvme_qpair_manual_complete_request: *NOTICE*: Command completed manually: 00:18:13.684 [2024-09-27 15:23:01.954723] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:1 lba:26112 len:8 PRP1 0x0 PRP2 0x0 00:18:13.684 [2024-09-27 15:23:01.954734] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:18:13.684 [2024-09-27 15:23:01.954784] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200019ae4900 was disconnected and freed. reset controller. 00:18:13.684 [2024-09-27 15:23:01.954797] bdev_nvme.c:1987:bdev_nvme_failover_trid: *NOTICE*: Start failover from 10.0.0.2:4420 to 10.0.0.2:4421 00:18:13.684 [2024-09-27 15:23:01.954808] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state. 00:18:13.684 [2024-09-27 15:23:01.957631] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] resetting controller 00:18:13.684 [2024-09-27 15:23:01.972263] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:18:13.685 [2024-09-27 15:23:02.014594] bdev_nvme.c:2183:_bdev_nvme_reset_ctrlr_complete: *NOTICE*: Resetting controller successful. 
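(Editor's note, not part of the captured console output.) The burst of *NOTICE* entries above appears to be the failover path doing what the last few messages describe: bdev_nvme disconnects the qpair, every queued READ/WRITE is completed with ABORTED - SQ DELETION, and the controller is reset while the target address moves from 10.0.0.2:4420 to 10.0.0.2:4421. As a rough illustration only, the hypothetical Python helper below shows how such a saved log could be tallied; the names summarize and mib_per_s are invented here, and the 4 KiB I/O size is an assumption inferred from len:8 with 512-byte logical blocks, which happens to match the bdevperf progress figures printed right after the reset (11557.33 IOPS at 45.15 MiB/s).

import re

# Hypothetical log-summarizing helper (not part of the SPDK test suite).
# It relies only on the textual format visible in this log:
#   nvme_io_qpair_print_command:  *NOTICE*: READ sqid:1 cid:119 nsid:1 lba:25600 len:8 ...
#   spdk_nvme_print_completion:   *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 ...
CMD_RE = re.compile(
    r"nvme_io_qpair_print_command: \*NOTICE\*: (?P<opcode>READ|WRITE) "
    r"sqid:(?P<sqid>\d+) cid:(?P<cid>\d+) nsid:(?P<nsid>\d+) "
    r"lba:(?P<lba>\d+) len:(?P<nblocks>\d+)"
)
ABORT_MARKER = "ABORTED - SQ DELETION"

def summarize(log_text: str) -> dict:
    # Count the READ/WRITE command print lines and the abort completions;
    # finditer works even when several entries are fused onto one wrapped line.
    counts = {"READ": 0, "WRITE": 0,
              "aborted_completions": log_text.count(ABORT_MARKER)}
    for m in CMD_RE.finditer(log_text):
        counts[m.group("opcode")] += 1
    return counts

def mib_per_s(iops: float, io_bytes: int = 8 * 512) -> float:
    # Convert an IOPS reading to MiB/s for fixed-size I/Os
    # (assumed: len:8 blocks of 512 B = 4096 B per I/O).
    return iops * io_bytes / (1024 * 1024)

# Sanity check of the assumed I/O size against the progress readings below:
# 11557.33 IOPS * 4096 B / 1 MiB ~= 45.15 MiB/s, matching the logged figure.
assert round(mib_per_s(11557.33), 2) == 45.15

Feeding this portion of the console output to summarize() would report how many queued reads and writes were flushed by the SQ deletion before "Resetting controller successful." above; the same relation holds for the later readings (13163.25 IOPS ~= 51.42 MiB/s, 12604.80 IOPS ~= 49.24 MiB/s).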
00:18:13.685 11557.33 IOPS, 45.15 MiB/s 13163.25 IOPS, 51.42 MiB/s 12604.80 IOPS, 49.24 MiB/s [2024-09-27 15:23:05.453282] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:18 nsid:1 lba:122040 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758e000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453327] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453354] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:111 nsid:1 lba:122048 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007590000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453365] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453377] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:80 nsid:1 lba:122056 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007592000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453387] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453398] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:96 nsid:1 lba:122064 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007594000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453407] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453418] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:122072 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007596000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453428] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453438] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:55 nsid:1 lba:122080 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007598000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453448] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453459] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:122088 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759a000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453469] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453480] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:122096 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075be000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453489] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453501] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:79 nsid:1 lba:122104 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ce000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453516] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 
sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453527] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:65 nsid:1 lba:122112 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d0000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453538] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453550] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:16 nsid:1 lba:122120 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d2000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453559] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453571] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:109 nsid:1 lba:122128 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d4000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453581] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453592] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:107 nsid:1 lba:122136 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d6000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453605] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453616] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:122144 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d8000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453628] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453639] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:122152 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075da000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453649] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453660] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:120 nsid:1 lba:122552 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453669] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453680] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:50 nsid:1 lba:122560 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453690] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453706] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:67 nsid:1 lba:122568 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453715] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453726] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:91 nsid:1 
lba:122576 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453736] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453747] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:66 nsid:1 lba:122584 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453760] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453771] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:83 nsid:1 lba:122592 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453789] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453800] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:84 nsid:1 lba:122600 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453810] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453822] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:27 nsid:1 lba:122608 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.685 [2024-09-27 15:23:05.453834] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453846] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:24 nsid:1 lba:122160 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007570000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453855] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453866] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:69 nsid:1 lba:122168 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757e000 len:0x1000 key:0x1c2600 00:18:13.685 [2024-09-27 15:23:05.453876] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.685 [2024-09-27 15:23:05.453888] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:93 nsid:1 lba:122176 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007580000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.453898] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.453909] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:122184 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007582000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.453918] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.453929] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:97 nsid:1 lba:122192 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007584000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.453939] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) 
qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.453951] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:125 nsid:1 lba:122200 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007586000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.453960] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.453971] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:5 nsid:1 lba:122208 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007588000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.453980] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.453991] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:81 nsid:1 lba:122216 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758a000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454000] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454011] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:99 nsid:1 lba:122224 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075dc000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454020] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454032] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:122616 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454041] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454053] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:104 nsid:1 lba:122624 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454062] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454073] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:123 nsid:1 lba:122632 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454082] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454092] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:57 nsid:1 lba:122640 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454101] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454112] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:73 nsid:1 lba:122648 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454122] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454133] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:88 nsid:1 lba:122656 len:8 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454142] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454153] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:92 nsid:1 lba:122664 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454162] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454173] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:32 nsid:1 lba:122672 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454182] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454192] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:122232 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000756e000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454202] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454213] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:100 nsid:1 lba:122240 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000756c000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454223] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454234] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:122248 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000756a000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454243] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454254] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:12 nsid:1 lba:122256 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007568000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454264] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454276] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:102 nsid:1 lba:122264 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007566000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454286] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454297] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:9 nsid:1 lba:122272 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007564000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454307] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454318] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:122280 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007562000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454327] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454338] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:86 nsid:1 lba:122288 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758c000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454354] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454365] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:87 nsid:1 lba:122296 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000755e000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454374] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454386] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:77 nsid:1 lba:122304 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000755c000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454396] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454407] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:21 nsid:1 lba:122312 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000755a000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454417] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454428] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:10 nsid:1 lba:122320 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007558000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454437] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454448] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:105 nsid:1 lba:122328 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007556000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454458] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454470] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:122336 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007554000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454479] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454490] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:1 lba:122344 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007552000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454498] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454510] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:122352 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007550000 len:0x1000 key:0x1c2600 00:18:13.686 [2024-09-27 15:23:05.454521] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 
sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454532] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:116 nsid:1 lba:122680 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454541] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454552] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:113 nsid:1 lba:122688 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454561] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.686 [2024-09-27 15:23:05.454572] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:122696 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.686 [2024-09-27 15:23:05.454581] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454592] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:121 nsid:1 lba:122704 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454601] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454611] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:60 nsid:1 lba:122712 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454620] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454631] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:37 nsid:1 lba:122720 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454640] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454651] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:74 nsid:1 lba:122728 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454660] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454670] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:108 nsid:1 lba:122736 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454679] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454689] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:1 lba:122744 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454700] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454712] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:20 nsid:1 lba:122752 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454721] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION 
(00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454731] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:98 nsid:1 lba:122760 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454740] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454751] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:95 nsid:1 lba:122768 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454762] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454773] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:122776 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454783] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454794] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:6 nsid:1 lba:122784 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454803] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454814] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:122792 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454823] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454834] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:40 nsid:1 lba:122800 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454844] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454854] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:47 nsid:1 lba:122808 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454863] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454873] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:112 nsid:1 lba:122816 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454883] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454894] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:31 nsid:1 lba:122824 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454903] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454914] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:103 nsid:1 lba:122832 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454923] nvme_qpair.c: 474:spdk_nvme_print_completion: 
*NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454933] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:106 nsid:1 lba:122840 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454942] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454953] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:25 nsid:1 lba:122848 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454962] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454973] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:62 nsid:1 lba:122856 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.454982] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.454992] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:76 nsid:1 lba:122864 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455001] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455014] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:85 nsid:1 lba:122872 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455024] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455035] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:56 nsid:1 lba:122880 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455044] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455054] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:119 nsid:1 lba:122888 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455063] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455074] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:72 nsid:1 lba:122896 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455084] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455095] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:68 nsid:1 lba:122904 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455104] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455115] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:110 nsid:1 lba:122912 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455124] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455135] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:8 nsid:1 lba:122920 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455145] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455156] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:126 nsid:1 lba:122928 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.687 [2024-09-27 15:23:05.455165] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455176] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:117 nsid:1 lba:122360 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759c000 len:0x1000 key:0x1c2600 00:18:13.687 [2024-09-27 15:23:05.455185] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455196] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:75 nsid:1 lba:122368 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f0000 len:0x1000 key:0x1c2600 00:18:13.687 [2024-09-27 15:23:05.455206] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455217] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:22 nsid:1 lba:122376 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f2000 len:0x1000 key:0x1c2600 00:18:13.687 [2024-09-27 15:23:05.455226] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.687 [2024-09-27 15:23:05.455237] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:122 nsid:1 lba:122384 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f4000 len:0x1000 key:0x1c2600 00:18:13.687 [2024-09-27 15:23:05.455246] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455259] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:122392 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f6000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455269] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455280] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:1 lba:122400 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f8000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455289] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455299] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:23 nsid:1 lba:122408 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fa000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455308] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 
00:18:13.688 [2024-09-27 15:23:05.455319] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 lba:122416 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fc000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455329] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455340] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:46 nsid:1 lba:122424 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075cc000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455358] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455369] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:114 nsid:1 lba:122432 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ca000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455378] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455389] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:52 nsid:1 lba:122440 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c8000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455399] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455410] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:26 nsid:1 lba:122448 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c6000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455419] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455430] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:14 nsid:1 lba:122456 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c4000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455439] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455450] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:70 nsid:1 lba:122464 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c2000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455460] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455471] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:122472 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c0000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455480] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455492] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:49 nsid:1 lba:122480 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075bc000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455500] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455512] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:122936 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455521] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455534] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:122944 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455543] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455554] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:34 nsid:1 lba:122952 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455563] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455573] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:89 nsid:1 lba:122960 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455582] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455594] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:82 nsid:1 lba:122968 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455605] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455616] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:122976 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455625] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455635] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:42 nsid:1 lba:122984 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455645] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455656] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:101 nsid:1 lba:122992 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455665] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455675] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:122488 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fe000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455685] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455696] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:122496 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757c000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455706] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 
cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455717] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:90 nsid:1 lba:122504 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757a000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455727] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455739] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:94 nsid:1 lba:122512 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007578000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455748] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455760] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:7 nsid:1 lba:122520 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007576000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455769] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455780] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:124 nsid:1 lba:122528 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007574000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455789] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455800] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:17 nsid:1 lba:122536 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007572000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455809] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455820] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:115 nsid:1 lba:122544 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ee000 len:0x1000 key:0x1c2600 00:18:13.688 [2024-09-27 15:23:05.455830] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455841] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:118 nsid:1 lba:123000 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455850] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455860] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:43 nsid:1 lba:123008 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455869] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455880] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:78 nsid:1 lba:123016 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.688 [2024-09-27 15:23:05.455889] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.688 [2024-09-27 15:23:05.455900] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:59 nsid:1 lba:123024 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:18:13.688 [2024-09-27 15:23:05.455909] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.688 [2024-09-27 15:23:05.455920] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:28 nsid:1 lba:123032 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:18:13.688 [2024-09-27 15:23:05.455928] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.688 [2024-09-27 15:23:05.455939] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:71 nsid:1 lba:123040 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:18:13.689 [2024-09-27 15:23:05.455948] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.689 [2024-09-27 15:23:05.455959] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:54 nsid:1 lba:123048 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:18:13.689 [2024-09-27 15:23:05.455968] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.689 [2024-09-27 15:23:05.457887] nvme_qpair.c: 579:nvme_qpair_abort_queued_reqs: *ERROR*: aborting queued i/o
00:18:13.689 [2024-09-27 15:23:05.457904] nvme_qpair.c: 558:nvme_qpair_manual_complete_request: *NOTICE*: Command completed manually:
00:18:13.689 [2024-09-27 15:23:05.457913] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:123056 len:8 PRP1 0x0 PRP2 0x0
00:18:13.689 [2024-09-27 15:23:05.457929] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:0 sqhd:0000 p:0 m:0 dnr:0
00:18:13.689 [2024-09-27 15:23:05.457976] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200019ae4840 was disconnected and freed. reset controller.
00:18:13.689 [2024-09-27 15:23:05.457989] bdev_nvme.c:1987:bdev_nvme_failover_trid: *NOTICE*: Start failover from 10.0.0.2:4421 to 10.0.0.2:4422
00:18:13.689 [2024-09-27 15:23:05.458001] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state.
00:18:13.689 [2024-09-27 15:23:05.460844] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] resetting controller
00:18:13.689 [2024-09-27 15:23:05.475385] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:18:13.689 [2024-09-27 15:23:05.520869] bdev_nvme.c:2183:_bdev_nvme_reset_ctrlr_complete: *NOTICE*: Resetting controller successful.
00:18:13.689 11541.67 IOPS, 45.08 MiB/s 12504.00 IOPS, 48.84 MiB/s 13220.25 IOPS, 51.64 MiB/s 13765.78 IOPS, 53.77 MiB/s 12389.20 IOPS, 48.40 MiB/s [2024-09-27 15:23:09.874039] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:37 nsid:1 lba:97040 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e2000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874081] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874104] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:56 nsid:1 lba:97048 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e4000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874114] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874126] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:86 nsid:1 lba:97056 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e6000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874135] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874147] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:20 nsid:1 lba:97064 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e8000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874156] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874167] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:97072 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ea000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874177] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874188] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:97080 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ec000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874198] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874209] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:105 nsid:1 lba:97088 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000754e000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874219] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874236] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:97096 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000754c000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874246] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874257] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:60 nsid:1 lba:97104 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000754a000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874267] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ 
DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874280] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:99 nsid:1 lba:97112 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007548000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874289] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874301] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:62 nsid:1 lba:97120 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007546000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874310] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874322] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:75 nsid:1 lba:97128 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007544000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874333] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874348] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:71 nsid:1 lba:97136 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007542000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874358] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874369] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:16 nsid:1 lba:97144 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007540000 len:0x1000 key:0x1c2600 00:18:13.689 [2024-09-27 15:23:09.874378] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874389] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:53 nsid:1 lba:97408 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874399] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874410] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:118 nsid:1 lba:97416 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874419] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874430] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:104 nsid:1 lba:97424 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874439] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874452] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:24 nsid:1 lba:97432 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874463] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874476] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:51 
nsid:1 lba:97440 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874487] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874500] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:98 nsid:1 lba:97448 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874510] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874522] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:64 nsid:1 lba:97456 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874532] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874543] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:97464 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874554] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874565] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:40 nsid:1 lba:97472 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874576] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874587] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:9 nsid:1 lba:97480 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874598] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874610] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:113 nsid:1 lba:97488 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874620] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.689 [2024-09-27 15:23:09.874630] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:78 nsid:1 lba:97496 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.689 [2024-09-27 15:23:09.874639] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874650] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:81 nsid:1 lba:97504 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874659] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874670] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:90 nsid:1 lba:97512 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874680] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874691] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:5 nsid:1 lba:97520 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874700] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874711] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:52 nsid:1 lba:97528 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874721] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874731] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:50 nsid:1 lba:97536 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874740] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874751] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:97544 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874763] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874773] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:66 nsid:1 lba:97552 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874782] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874793] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:77 nsid:1 lba:97560 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874802] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874813] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:106 nsid:1 lba:97568 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874822] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874832] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:41 nsid:1 lba:97576 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874841] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874852] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:102 nsid:1 lba:97584 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874861] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874872] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:109 nsid:1 lba:97592 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874882] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874892] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:12 nsid:1 lba:97600 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874902] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874913] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:29 nsid:1 lba:97608 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874922] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874933] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:125 nsid:1 lba:97616 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874942] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874953] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:97624 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874962] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874973] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:80 nsid:1 lba:97632 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.874983] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.874994] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:123 nsid:1 lba:97640 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875006] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875017] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:79 nsid:1 lba:97648 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875026] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875037] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:28 nsid:1 lba:97656 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875047] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875058] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:1 lba:97152 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fe000 len:0x1000 key:0x1c2600 00:18:13.690 [2024-09-27 15:23:09.875068] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875079] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:85 nsid:1 lba:97160 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757c000 len:0x1000 key:0x1c2600 00:18:13.690 [2024-09-27 15:23:09.875088] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 
sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875100] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:74 nsid:1 lba:97168 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757a000 len:0x1000 key:0x1c2600 00:18:13.690 [2024-09-27 15:23:09.875111] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875122] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:96 nsid:1 lba:97176 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007578000 len:0x1000 key:0x1c2600 00:18:13.690 [2024-09-27 15:23:09.875132] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875143] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:47 nsid:1 lba:97664 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875152] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875163] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:97 nsid:1 lba:97672 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875172] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875184] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:63 nsid:1 lba:97680 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875193] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875204] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:108 nsid:1 lba:97688 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875213] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875223] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:32 nsid:1 lba:97696 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875233] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.690 [2024-09-27 15:23:09.875244] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:124 nsid:1 lba:97704 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.690 [2024-09-27 15:23:09.875255] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875266] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:45 nsid:1 lba:97712 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875275] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875286] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:26 nsid:1 lba:97720 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875296] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875307] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:97184 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c4000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875317] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875328] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:126 nsid:1 lba:97192 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c2000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875338] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875353] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:97200 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c0000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875363] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875374] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:110 nsid:1 lba:97208 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075bc000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875384] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875395] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:44 nsid:1 lba:97728 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875405] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875416] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:101 nsid:1 lba:97736 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875425] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875436] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:48 nsid:1 lba:97744 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875446] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875459] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:121 nsid:1 lba:97752 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875469] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875479] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:21 nsid:1 lba:97760 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875488] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875499] nvme_qpair.c: 243:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:54 nsid:1 lba:97768 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875511] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875522] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:92 nsid:1 lba:97216 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007574000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875532] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875543] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:115 nsid:1 lba:97224 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007576000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875552] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875562] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:120 nsid:1 lba:97232 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007594000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875572] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875584] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:6 nsid:1 lba:97240 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007592000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875593] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875604] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:83 nsid:1 lba:97248 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007590000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875613] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875624] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:69 nsid:1 lba:97256 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758e000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875633] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875644] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:93 nsid:1 lba:97264 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075be000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875655] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875665] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:97272 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759a000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875675] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875686] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:72 nsid:1 lba:97280 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007598000 
len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875695] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875706] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:119 nsid:1 lba:97288 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007596000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875716] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875727] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:27 nsid:1 lba:97296 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c6000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875737] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875749] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:94 nsid:1 lba:97304 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c8000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875759] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875771] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:97312 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ca000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875781] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875792] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:8 nsid:1 lba:97320 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075cc000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875801] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875812] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:22 nsid:1 lba:97776 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875822] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875833] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:97784 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.691 [2024-09-27 15:23:09.875843] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875854] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:1 lba:97328 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007572000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875863] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875874] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:114 nsid:1 lba:97336 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ee000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875885] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - 
SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875896] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:70 nsid:1 lba:97344 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fc000 len:0x1000 key:0x1c2600 00:18:13.691 [2024-09-27 15:23:09.875906] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.691 [2024-09-27 15:23:09.875917] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:103 nsid:1 lba:97352 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075fa000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.875926] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.875937] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:100 nsid:1 lba:97360 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007564000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.875947] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.875958] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:88 nsid:1 lba:97368 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007566000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.875971] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.875982] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:97376 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007568000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.875993] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876004] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:111 nsid:1 lba:97384 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000756a000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.876014] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876025] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:91 nsid:1 lba:97792 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876034] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876045] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:117 nsid:1 lba:97800 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876054] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876065] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:49 nsid:1 lba:97808 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876075] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876086] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:89 nsid:1 lba:97816 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876095] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876106] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:97824 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876115] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876126] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:112 nsid:1 lba:97832 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876136] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876147] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:55 nsid:1 lba:97840 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876157] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876167] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:17 nsid:1 lba:97848 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876177] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876187] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:73 nsid:1 lba:97856 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876197] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876208] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:76 nsid:1 lba:97864 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876217] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876228] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:97872 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876244] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876256] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:61 nsid:1 lba:97880 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876266] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876277] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:87 nsid:1 lba:97888 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876286] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 
15:23:09.876297] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:84 nsid:1 lba:97896 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876306] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876318] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:97904 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876327] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876338] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:33 nsid:1 lba:97912 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876352] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876363] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:25 nsid:1 lba:97920 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876372] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876383] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:10 nsid:1 lba:97928 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876393] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876404] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:42 nsid:1 lba:97936 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876414] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876424] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:46 nsid:1 lba:97944 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876433] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876444] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:82 nsid:1 lba:97952 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876454] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876465] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:97960 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876474] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876485] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:97968 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876494] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 
00:18:13.692 [2024-09-27 15:23:09.876507] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:31 nsid:1 lba:97976 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876517] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876528] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:68 nsid:1 lba:97392 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f0000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.876537] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876548] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:116 nsid:1 lba:97400 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759c000 len:0x1000 key:0x1c2600 00:18:13.692 [2024-09-27 15:23:09.876557] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876568] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:34 nsid:1 lba:97984 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876578] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876589] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:122 nsid:1 lba:97992 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876598] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876609] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:107 nsid:1 lba:98000 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.692 [2024-09-27 15:23:09.876618] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.692 [2024-09-27 15:23:09.876629] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:23 nsid:1 lba:98008 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.693 [2024-09-27 15:23:09.876638] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.693 [2024-09-27 15:23:09.876650] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:43 nsid:1 lba:98016 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.693 [2024-09-27 15:23:09.876659] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.693 [2024-09-27 15:23:09.876670] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:7 nsid:1 lba:98024 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.693 [2024-09-27 15:23:09.876679] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0 00:18:13.693 [2024-09-27 15:23:09.876689] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:65 nsid:1 lba:98032 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:18:13.693 [2024-09-27 15:23:09.876699] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.693 [2024-09-27 15:23:09.876710] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:67 nsid:1 lba:98040 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:18:13.693 [2024-09-27 15:23:09.876720] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.693 [2024-09-27 15:23:09.876730] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:36 nsid:1 lba:98048 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:18:13.693 [2024-09-27 15:23:09.876739] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:3287c000 sqhd:7250 p:0 m:0 dnr:0
00:18:13.693 [2024-09-27 15:23:09.878500] nvme_qpair.c: 579:nvme_qpair_abort_queued_reqs: *ERROR*: aborting queued i/o
00:18:13.693 [2024-09-27 15:23:09.878518] nvme_qpair.c: 558:nvme_qpair_manual_complete_request: *NOTICE*: Command completed manually:
00:18:13.693 [2024-09-27 15:23:09.878528] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:1 lba:98056 len:8 PRP1 0x0 PRP2 0x0
00:18:13.693 [2024-09-27 15:23:09.878540] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:1 cid:0 cdw0:0 sqhd:0000 p:0 m:0 dnr:0
00:18:13.693 [2024-09-27 15:23:09.878587] bdev_nvme.c:1730:bdev_nvme_disconnected_qpair_cb: *NOTICE*: qpair 0x200019ae4840 was disconnected and freed. reset controller.
00:18:13.693 [2024-09-27 15:23:09.878599] bdev_nvme.c:1987:bdev_nvme_failover_trid: *NOTICE*: Start failover from 10.0.0.2:4422 to 10.0.0.2:4420
00:18:13.693 [2024-09-27 15:23:09.878611] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state.
00:18:13.693 [2024-09-27 15:23:09.881412] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] resetting controller
00:18:13.693 [2024-09-27 15:23:09.895497] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0
00:18:13.693 [2024-09-27 15:23:09.943540] bdev_nvme.c:2183:_bdev_nvme_reset_ctrlr_complete: *NOTICE*: Resetting controller successful.
00:18:13.693 12793.09 IOPS, 49.97 MiB/s 13239.17 IOPS, 51.72 MiB/s 13616.15 IOPS, 53.19 MiB/s 13938.79 IOPS, 54.45 MiB/s 14221.20 IOPS, 55.55 MiB/s 00:18:13.693 Latency(us) 00:18:13.693 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:13.693 Job: NVMe0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:18:13.693 Verification LBA range: start 0x0 length 0x4000 00:18:13.693 NVMe0n1 : 15.01 14221.05 55.55 309.56 0.00 8786.29 336.58 1021221.84 00:18:13.693 =================================================================================================================== 00:18:13.693 Total : 14221.05 55.55 309.56 0.00 8786.29 336.58 1021221.84 00:18:13.693 Received shutdown signal, test time was about 15.000000 seconds 00:18:13.693 00:18:13.693 Latency(us) 00:18:13.693 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:13.693 =================================================================================================================== 00:18:13.693 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@65 -- # grep -c 'Resetting controller successful' 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@65 -- # count=3 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@67 -- # (( count != 3 )) 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@73 -- # bdevperf_pid=1834284 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@72 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -z -r /var/tmp/bdevperf.sock -q 128 -o 4096 -w verify -t 1 -f 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@75 -- # waitforlisten 1834284 /var/tmp/bdevperf.sock 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@831 -- # '[' -z 1834284 ']' 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:18:13.693 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 
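The first run is then judged by counting reset notices: the harness greps its captured output for 'Resetting controller successful' and requires exactly three hits, one per detached path, before it starts the second bdevperf instance on /var/tmp/bdevperf.sock. A sketch of that gate, assuming the output lands in the try.txt file the test cats and removes later in this log:

  # one 'Resetting controller successful' notice is expected per forced failover
  log=/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt
  count=$(grep -c 'Resetting controller successful' "$log")
  if (( count != 3 )); then
      echo "expected 3 successful controller resets, got $count" >&2
      exit 1
  fi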
00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:13.693 15:23:15 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:18:14.624 15:23:16 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:14.624 15:23:16 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@864 -- # return 0 00:18:14.624 15:23:16 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@76 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 00:18:14.624 [2024-09-27 15:23:16.290221] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4421 *** 00:18:14.624 15:23:16 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@77 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4422 00:18:14.881 [2024-09-27 15:23:16.494927] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4422 *** 00:18:14.881 15:23:16 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@78 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:15.138 NVMe0n1 00:18:15.138 15:23:16 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@79 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4421 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:15.395 00:18:15.395 15:23:17 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@80 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b NVMe0 -t rdma -a 10.0.0.2 -s 4422 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:15.652 00:18:15.652 15:23:17 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@82 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_controllers 00:18:15.652 15:23:17 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@82 -- # grep -q NVMe0 00:18:15.909 15:23:17 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@84 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_detach_controller NVMe0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:15.909 15:23:17 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@87 -- # sleep 3 00:18:19.182 15:23:20 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@88 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_controllers 00:18:19.182 15:23:20 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@88 -- # grep -q NVMe0 00:18:19.182 15:23:20 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@89 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bdevperf.sock perform_tests 00:18:19.182 15:23:20 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@90 -- # run_test_pid=1835090 00:18:19.182 15:23:20 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@92 -- # wait 1835090 00:18:20.556 { 00:18:20.556 "results": [ 00:18:20.556 { 00:18:20.556 "job": "NVMe0n1", 00:18:20.556 "core_mask": "0x1", 00:18:20.556 "workload": "verify", 00:18:20.556 
"status": "finished", 00:18:20.556 "verify_range": { 00:18:20.556 "start": 0, 00:18:20.556 "length": 16384 00:18:20.556 }, 00:18:20.556 "queue_depth": 128, 00:18:20.556 "io_size": 4096, 00:18:20.556 "runtime": 1.011427, 00:18:20.556 "iops": 17970.649389427017, 00:18:20.556 "mibps": 70.19784917744929, 00:18:20.556 "io_failed": 0, 00:18:20.556 "io_timeout": 0, 00:18:20.556 "avg_latency_us": 7083.966785058175, 00:18:20.556 "min_latency_us": 2678.4278260869564, 00:18:20.556 "max_latency_us": 12651.297391304348 00:18:20.556 } 00:18:20.556 ], 00:18:20.556 "core_count": 1 00:18:20.556 } 00:18:20.556 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@94 -- # cat /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt 00:18:20.556 [2024-09-27 15:23:15.266762] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:18:20.556 [2024-09-27 15:23:15.266832] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1834284 ] 00:18:20.556 [2024-09-27 15:23:15.353844] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:20.556 [2024-09-27 15:23:15.439715] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:20.556 [2024-09-27 15:23:17.695299] bdev_nvme.c:1987:bdev_nvme_failover_trid: *NOTICE*: Start failover from 10.0.0.2:4420 to 10.0.0.2:4421 00:18:20.556 [2024-09-27 15:23:17.695835] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode1] in failed state. 00:18:20.556 [2024-09-27 15:23:17.695871] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode1] resetting controller 00:18:20.556 [2024-09-27 15:23:17.713063] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:18:20.556 [2024-09-27 15:23:17.729401] bdev_nvme.c:2183:_bdev_nvme_reset_ctrlr_complete: *NOTICE*: Resetting controller successful. 00:18:20.556 Running I/O for 1 seconds... 
00:18:20.556 17923.00 IOPS, 70.01 MiB/s 00:18:20.556 Latency(us) 00:18:20.556 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:20.556 Job: NVMe0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:18:20.556 Verification LBA range: start 0x0 length 0x4000 00:18:20.556 NVMe0n1 : 1.01 17970.65 70.20 0.00 0.00 7083.97 2678.43 12651.30 00:18:20.556 =================================================================================================================== 00:18:20.556 Total : 17970.65 70.20 0.00 0.00 7083.97 2678.43 12651.30 00:18:20.556 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@95 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_controllers 00:18:20.556 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@95 -- # grep -q NVMe0 00:18:20.556 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@98 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_detach_controller NVMe0 -t rdma -a 10.0.0.2 -s 4422 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:20.815 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@99 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_controllers 00:18:20.815 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@99 -- # grep -q NVMe0 00:18:21.072 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@100 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_detach_controller NVMe0 -t rdma -a 10.0.0.2 -s 4421 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 00:18:21.329 15:23:22 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@101 -- # sleep 3 00:18:24.609 15:23:25 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@103 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_controllers 00:18:24.609 15:23:25 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@103 -- # grep -q NVMe0 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@108 -- # killprocess 1834284 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@950 -- # '[' -z 1834284 ']' 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@954 -- # kill -0 1834284 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@955 -- # uname 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1834284 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1834284' 00:18:24.609 killing process with pid 1834284 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@969 -- # kill 1834284 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@974 -- # wait 1834284 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@110 -- # sync 00:18:24.609 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- 
host/failover.sh@111 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@113 -- # trap - SIGINT SIGTERM EXIT 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@115 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- host/failover.sh@116 -- # nvmftestfini 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@331 -- # nvmfcleanup 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@99 -- # sync 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@102 -- # set +e 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@103 -- # for i in {1..20} 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:18:24.868 rmmod nvme_rdma 00:18:24.868 rmmod nvme_fabrics 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@106 -- # set -e 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@107 -- # return 0 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@332 -- # '[' -n 1831775 ']' 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@333 -- # killprocess 1831775 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@950 -- # '[' -z 1831775 ']' 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@954 -- # kill -0 1831775 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@955 -- # uname 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1831775 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1831775' 00:18:24.868 killing process with pid 1831775 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@969 -- # kill 1831775 00:18:24.868 15:23:26 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@974 -- # wait 1831775 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@338 -- # nvmf_fini 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@264 -- # local dev 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@267 -- # remove_target_ns 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@22 -- # eval 
'_remove_target_ns 15> /dev/null' 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@22 -- # _remove_target_ns 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@268 -- # delete_main_bridge 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@130 -- # return 0 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@41 -- # _dev=0 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@41 -- # dev_map=() 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/setup.sh@284 -- # iptr 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@538 -- # iptables-save 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- nvmf/common.sh@538 -- # iptables-restore 00:18:25.436 00:18:25.436 real 0m38.486s 00:18:25.436 user 2m7.336s 00:18:25.436 sys 0m7.845s 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_failover -- common/autotest_common.sh@10 -- # set +x 00:18:25.436 ************************************ 00:18:25.436 END TEST nvmf_failover 00:18:25.436 ************************************ 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@24 -- # run_test nvmf_host_multipath_status /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/multipath_status.sh --transport=rdma 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # 
'[' 3 -le 1 ']' 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:18:25.436 ************************************ 00:18:25.436 START TEST nvmf_host_multipath_status 00:18:25.436 ************************************ 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/multipath_status.sh --transport=rdma 00:18:25.436 * Looking for test storage... 00:18:25.436 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1681 -- # lcov --version 00:18:25.436 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@333 -- # local ver1 ver1_l 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@334 -- # local ver2 ver2_l 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@336 -- # IFS=.-: 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@336 -- # read -ra ver1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@337 -- # IFS=.-: 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@337 -- # read -ra ver2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@338 -- # local 'op=<' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@340 -- # ver1_l=2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@341 -- # ver2_l=1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@344 -- # case "$op" in 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@345 -- # : 1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@364 -- # (( v = 0 )) 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@365 -- # decimal 1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@353 -- # local d=1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@355 -- # echo 1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@365 -- # ver1[v]=1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@366 -- # decimal 2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@353 -- # local d=2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@355 -- # echo 2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@366 -- # ver2[v]=2 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@368 -- # return 0 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:18:25.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:25.697 --rc genhtml_branch_coverage=1 00:18:25.697 --rc genhtml_function_coverage=1 00:18:25.697 --rc genhtml_legend=1 00:18:25.697 --rc geninfo_all_blocks=1 00:18:25.697 --rc geninfo_unexecuted_blocks=1 00:18:25.697 00:18:25.697 ' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:18:25.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:25.697 --rc genhtml_branch_coverage=1 00:18:25.697 --rc genhtml_function_coverage=1 00:18:25.697 --rc genhtml_legend=1 00:18:25.697 --rc geninfo_all_blocks=1 00:18:25.697 --rc geninfo_unexecuted_blocks=1 00:18:25.697 00:18:25.697 ' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:18:25.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:25.697 --rc genhtml_branch_coverage=1 00:18:25.697 --rc genhtml_function_coverage=1 00:18:25.697 --rc genhtml_legend=1 00:18:25.697 --rc geninfo_all_blocks=1 00:18:25.697 --rc geninfo_unexecuted_blocks=1 00:18:25.697 00:18:25.697 ' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:18:25.697 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:25.697 --rc genhtml_branch_coverage=1 00:18:25.697 --rc genhtml_function_coverage=1 00:18:25.697 --rc genhtml_legend=1 00:18:25.697 --rc geninfo_all_blocks=1 00:18:25.697 --rc geninfo_unexecuted_blocks=1 00:18:25.697 00:18:25.697 ' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@10 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@7 -- # uname -s 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@15 -- # shopt -s extglob 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:18:25.697 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- paths/export.sh@5 -- # export PATH 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@50 -- # : 0 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:18:25.698 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression 
expected 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@54 -- # have_pci_nics=0 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@12 -- # MALLOC_BDEV_SIZE=64 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@13 -- # MALLOC_BLOCK_SIZE=512 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@15 -- # rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@16 -- # bpf_sh=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/bpftrace.sh 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@18 -- # bdevperf_rpc_sock=/var/tmp/bdevperf.sock 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@21 -- # NQN=nqn.2016-06.io.spdk:cnode1 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@31 -- # nvmftestinit 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@292 -- # prepare_net_devs 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@254 -- # local -g is_hw=no 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@256 -- # remove_target_ns 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@22 -- # _remove_target_ns 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@125 -- # xtrace_disable 00:18:25.698 15:23:27 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@10 -- # set +x 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@131 -- # pci_devs=() 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@131 -- # local -a pci_devs 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@132 -- # pci_net_devs=() 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@133 -- # pci_drivers=() 00:18:32.270 15:23:34 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@133 -- # local -A pci_drivers 00:18:32.270 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@135 -- # net_devs=() 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@135 -- # local -ga net_devs 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@136 -- # e810=() 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@136 -- # local -ga e810 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@137 -- # x722=() 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@137 -- # local -ga x722 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@138 -- # mlx=() 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@138 -- # local -ga mlx 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@181 -- # echo 'Found 
0000:18:00.0 (0x15b3 - 0x1015)' 00:18:32.271 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:18:32.271 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:18:32.271 Found net devices under 0000:18:00.0: mlx_0_0 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@234 -- # (( 1 == 0 
)) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:18:32.271 Found net devices under 0000:18:00.1: mlx_0_1 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@249 -- # get_rdma_if_list 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@75 -- # rdma_devs=() 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@89 -- # continue 2 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@89 -- # continue 2 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@258 -- # is_hw=yes 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:18:32.271 15:23:34 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:18:32.271 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@61 -- # uname 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@65 -- # modprobe ib_cm 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@66 -- # modprobe ib_core 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@67 -- # modprobe ib_umad 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@69 -- # modprobe iw_cm 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:18:32.532 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@27 -- # local -gA dev_map 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@28 -- # local -g _dev 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@44 -- # ips=() 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@58 -- # 
key_initiator=target1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@11 -- # local val=167772161 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:18:32.533 10.0.0.1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@11 -- # local val=167772162 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@210 -- # echo 10.0.0.2 
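In the set_ip steps above the pool values 167772161 and 167772162 become 10.0.0.1 and 10.0.0.2 for mlx_0_0 and mlx_0_1. The val_to_ip helper evidently just splits the 32-bit pool value into dotted octets; a rough sketch of that arithmetic (the real setup.sh may compute it differently, but it yields the same strings printed here):

  # 167772161 == 0x0A000001 -> 10.0.0.1
  val=167772161
  printf '%u.%u.%u.%u\n' $(( (val >> 24) & 255 )) $(( (val >> 16) & 255 )) \
                         $(( (val >> 8) & 255 )) $(( val & 255 ))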
00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:18:32.533 10.0.0.2 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@38 -- # ping_ips 1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # get_net_dev target0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@107 -- # local dev=target0 00:18:32.533 15:23:34 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:18:32.533 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:18:32.533 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:18:32.533 00:18:32.533 --- 10.0.0.2 ping statistics --- 00:18:32.533 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:18:32.533 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:18:32.533 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # get_net_dev target0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@107 -- # local dev=target0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:18:32.534 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:18:32.534 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:18:32.534 00:18:32.534 --- 10.0.0.2 ping statistics --- 00:18:32.534 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:18:32.534 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@98 -- # (( pair++ )) 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@266 -- # return 0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # get_net_dev target0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@107 -- # local dev=target0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@200 -- # get_target_ip_address 1 
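The get_ip_address traces above and below resolve the legacy NVMF_*_IP variables by mapping a role (target0, target1) to the device stored in dev_map and reading the address back from that device's ifalias. A hedged sketch of the lookup, with the device names taken from this run:

    # Illustration only: the address written by set_ip via tee is read back from sysfs.
    get_ifalias_ip() {
        local dev=$1
        cat "/sys/class/net/${dev}/ifalias"
    }
    get_ifalias_ip mlx_0_1   # target0 in this run -> 10.0.0.2
    get_ifalias_ip mlx_0_0   # target1 in this run -> 10.0.0.1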
00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # get_net_dev target1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@107 -- # local dev=target1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # get_net_dev target0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@107 -- # local dev=target0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:18:32.534 15:23:34 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # get_net_dev target1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@107 -- # local dev=target1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:18:32.534 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@33 -- # nvmfappstart -m 0x3 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@724 -- # xtrace_disable 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@10 -- # set +x 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@324 -- # nvmfpid=1838749 00:18:32.795 
15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@325 -- # waitforlisten 1838749 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x3 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@831 -- # '[' -z 1838749 ']' 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:32.795 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:32.795 15:23:34 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@10 -- # set +x 00:18:32.795 [2024-09-27 15:23:34.456479] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:18:32.795 [2024-09-27 15:23:34.456542] [ DPDK EAL parameters: nvmf -c 0x3 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:32.795 [2024-09-27 15:23:34.541663] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:18:32.795 [2024-09-27 15:23:34.628204] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:18:32.795 [2024-09-27 15:23:34.628244] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:18:32.795 [2024-09-27 15:23:34.628254] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:18:32.795 [2024-09-27 15:23:34.628278] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:18:32.795 [2024-09-27 15:23:34.628286] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
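nvmfappstart above launches nvmf_tgt on cores 0-1 with all tracepoint groups enabled and then blocks in waitforlisten until the RPC socket answers. A rough sketch of that pattern under assumed paths; the polling loop below is illustrative, not the autotest_common.sh implementation:

    # Assumption: $SPDK_DIR points at the spdk checkout; rpc_get_methods is used
    # here only as a cheap "is the RPC server up yet" probe.
    "$SPDK_DIR/build/bin/nvmf_tgt" -i 0 -e 0xFFFF -m 0x3 &
    nvmfpid=$!
    until "$SPDK_DIR/scripts/rpc.py" -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null; do
        sleep 0.5
    done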
00:18:32.795 [2024-09-27 15:23:34.628347] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:18:32.795 [2024-09-27 15:23:34.628354] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@864 -- # return 0 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@730 -- # xtrace_disable 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@10 -- # set +x 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@34 -- # nvmfapp_pid=1838749 00:18:33.733 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@36 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024 -u 8192 00:18:33.733 [2024-09-27 15:23:35.549386] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1bd6da0/0x1bdb290) succeed. 00:18:33.733 [2024-09-27 15:23:35.558544] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1bd82a0/0x1c1c930) succeed. 00:18:33.992 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@37 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py bdev_malloc_create 64 512 -b Malloc0 00:18:33.992 Malloc0 00:18:34.250 15:23:35 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@39 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode1 -a -s SPDK00000000000001 -r -m 2 00:18:34.250 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@40 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode1 Malloc0 00:18:34.509 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@41 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 00:18:34.768 [2024-09-27 15:23:36.421142] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:18:34.768 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@42 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 00:18:34.768 [2024-09-27 15:23:36.609460] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4421 *** 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@45 -- # bdevperf_pid=1839128 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@44 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/examples/bdevperf -m 0x4 -z -r /var/tmp/bdevperf.sock -q 128 -o 4096 -w verify -t 90 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@47 -- # 
trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@48 -- # waitforlisten 1839128 /var/tmp/bdevperf.sock 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@831 -- # '[' -z 1839128 ']' 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bdevperf.sock 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock...' 00:18:35.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bdevperf.sock... 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:35.028 15:23:36 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@10 -- # set +x 00:18:35.967 15:23:37 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:35.967 15:23:37 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@864 -- # return 0 00:18:35.967 15:23:37 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@52 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_set_options -r -1 00:18:35.967 15:23:37 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@55 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 -l -1 -o 10 00:18:36.226 Nvme0n1 00:18:36.226 15:23:38 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@56 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma -a 10.0.0.2 -s 4421 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 -x multipath -l -1 -o 10 00:18:36.485 Nvme0n1 00:18:36.485 15:23:38 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@78 -- # sleep 2 00:18:36.485 15:23:38 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@76 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 120 -s /var/tmp/bdevperf.sock perform_tests 00:18:39.020 15:23:40 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@90 -- # set_ANA_state optimized optimized 00:18:39.020 15:23:40 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n optimized 00:18:39.020 15:23:40 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n optimized 00:18:39.020 15:23:40 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@91 -- # sleep 1 00:18:39.958 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@92 -- # check_status true false true true true true 
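The bdevperf-side setup traced above attaches both listeners of cnode1 under the same controller name, the second with -x multipath, so a single Nvme0n1 bdev ends up with two I/O paths (ports 4420 and 4421). The RPC sequence as traced (rpc.py here stands for the repo's scripts/rpc.py):

    rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_set_options -r -1
    rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma \
        -a 10.0.0.2 -s 4420 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 -l -1 -o 10
    rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_attach_controller -b Nvme0 -t rdma \
        -a 10.0.0.2 -s 4421 -f ipv4 -n nqn.2016-06.io.spdk:cnode1 -x multipath -l -1 -o 10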
00:18:39.958 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current true 00:18:39.958 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:39.958 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:40.265 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:40.265 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current false 00:18:40.265 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:40.265 15:23:41 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:40.525 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:40.785 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:40.785 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:18:40.785 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:40.785 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:41.045 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:41.045 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 00:18:41.045 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- 
host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:41.045 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:41.304 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:41.304 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@94 -- # set_ANA_state non_optimized optimized 00:18:41.304 15:23:42 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n non_optimized 00:18:41.563 15:23:43 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n optimized 00:18:41.563 15:23:43 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@95 -- # sleep 1 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@96 -- # check_status false true true true true true 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current false 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current true 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:42.941 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:43.200 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:43.200 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:43.200 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:43.200 15:23:44 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:43.200 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:43.200 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 
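Each port_status call traced above asks the initiator for its I/O paths and filters one attribute of the path whose listener port matches. A hedged reconstruction of that check from the repeated @64 traces, not the multipath_status.sh source:

    # Compare one jq-extracted field against the expected value.
    port_status() {
        local port=$1 attr=$2 expected=$3
        local status
        status=$(rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths \
            | jq -r ".poll_groups[].io_paths[] | select(.transport.trsvcid==\"$port\").$attr")
        [[ "$status" == "$expected" ]]
    }
    port_status 4420 current true   # e.g. expect 4420 to be the active path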
00:18:43.200 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:43.200 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:43.459 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:43.459 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:18:43.459 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:43.459 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:43.719 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:43.719 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 00:18:43.719 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:43.719 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:43.978 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:43.978 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@100 -- # set_ANA_state non_optimized non_optimized 00:18:43.978 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n non_optimized 00:18:44.237 15:23:45 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n non_optimized 00:18:44.237 15:23:46 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@101 -- # sleep 1 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@102 -- # check_status true false true true true true 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current true 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:45.615 15:23:47 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current false 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:45.615 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:45.874 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:46.132 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:46.132 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:18:46.132 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:46.132 15:23:47 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:46.390 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:46.390 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 00:18:46.390 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:46.390 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:46.649 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:46.649 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@104 -- # set_ANA_state non_optimized inaccessible 00:18:46.649 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- 
host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n non_optimized 00:18:46.908 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n inaccessible 00:18:47.166 15:23:48 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@105 -- # sleep 1 00:18:48.122 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@106 -- # check_status true false true true true false 00:18:48.122 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current true 00:18:48.122 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:48.122 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:48.380 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:48.380 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current false 00:18:48.380 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:48.380 15:23:49 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:48.380 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:48.380 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:48.380 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:48.380 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:48.639 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:48.639 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:48.639 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:48.639 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:48.898 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:48.898 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 
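set_ANA_state, as traced at @59/@60 above, reassigns the ANA state of the two target listeners in one call; the check_status that follows each call then verifies which path bdev_nvme reports as current/accessible. The two RPCs, copied from the trace:

    # First argument applies to the 4420 listener, second to 4421.
    set_ANA_state() {
        local state_4420=$1 state_4421=$2
        rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 \
            -t rdma -a 10.0.0.2 -s 4420 -n "$state_4420"
        rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 \
            -t rdma -a 10.0.0.2 -s 4421 -n "$state_4421"
    }
    set_ANA_state inaccessible optimized   # forces I/O over the 4421 path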
00:18:48.898 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:48.898 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:49.155 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:49.155 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible false 00:18:49.155 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:49.155 15:23:50 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:49.413 15:23:51 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:49.413 15:23:51 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@108 -- # set_ANA_state inaccessible inaccessible 00:18:49.413 15:23:51 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n inaccessible 00:18:49.413 15:23:51 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n inaccessible 00:18:49.671 15:23:51 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@109 -- # sleep 1 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@110 -- # check_status false false true true false false 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current false 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current false 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:51.045 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:51.303 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:51.303 15:23:52 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:51.303 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:51.303 15:23:52 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible false 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:51.562 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:51.820 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:51.820 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible false 00:18:51.820 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:51.820 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:52.079 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:52.079 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@112 -- # set_ANA_state inaccessible optimized 00:18:52.079 15:23:53 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n inaccessible 00:18:52.337 15:23:54 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n optimized 00:18:52.596 15:23:54 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@113 -- # sleep 1 00:18:53.532 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- 
host/multipath_status.sh@114 -- # check_status false true true true false true 00:18:53.532 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current false 00:18:53.532 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:53.532 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:53.790 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:53.790 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current true 00:18:53.790 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:53.790 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:54.049 15:23:55 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:54.308 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:54.308 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible false 00:18:54.308 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:54.308 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:54.567 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:54.567 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 
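The check_status calls repeated through this run take six booleans; judging from the @68-@73 traces they are, in order, the expected current, connected, and accessible values for ports 4420 and 4421. A hedged reconstruction on top of the port_status sketch above, not the multipath_status.sh source:

    check_status() {
        local cur0=$1 cur1=$2 con0=$3 con1=$4 acc0=$5 acc1=$6
        port_status 4420 current    "$cur0" && port_status 4421 current    "$cur1" &&
        port_status 4420 connected  "$con0" && port_status 4421 connected  "$con1" &&
        port_status 4420 accessible "$acc0" && port_status 4421 accessible "$acc1"
    }
    check_status false true true true false true   # only the 4421 path optimized/accessible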
00:18:54.567 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:54.567 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:54.826 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:54.826 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@116 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_set_multipath_policy -b Nvme0n1 -p active_active 00:18:54.826 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@119 -- # set_ANA_state optimized optimized 00:18:54.826 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n optimized 00:18:55.085 15:23:56 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n optimized 00:18:55.343 15:23:57 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@120 -- # sleep 1 00:18:56.280 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@121 -- # check_status true true true true true true 00:18:56.280 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current true 00:18:56.280 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:56.280 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:56.539 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:56.539 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current true 00:18:56.539 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:56.539 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:56.797 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:56.797 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:56.797 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:56.797 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select 
(.transport.trsvcid=="4420").connected' 00:18:57.056 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:57.056 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:57.056 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:57.056 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:57.315 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:57.315 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:18:57.315 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:57.315 15:23:58 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:18:57.315 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:57.315 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 00:18:57.315 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:57.315 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:18:57.574 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:57.574 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@123 -- # set_ANA_state non_optimized optimized 00:18:57.574 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n non_optimized 00:18:57.833 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n optimized 00:18:58.092 15:23:59 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@124 -- # sleep 1 00:18:59.030 15:24:00 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@125 -- # check_status false true true true true true 00:18:59.030 15:24:00 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current false 00:18:59.031 15:24:00 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:59.031 15:24:00 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- 
host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:18:59.290 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:18:59.290 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current true 00:18:59.290 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:59.290 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:18:59.549 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:59.549 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:18:59.549 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:59.549 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:18:59.807 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:19:00.066 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:00.066 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 00:19:00.066 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:00.066 15:24:01 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:19:00.326 15:24:02 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- 
# [[ true == \t\r\u\e ]] 00:19:00.326 15:24:02 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@129 -- # set_ANA_state non_optimized non_optimized 00:19:00.326 15:24:02 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n non_optimized 00:19:00.585 15:24:02 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n non_optimized 00:19:00.844 15:24:02 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@130 -- # sleep 1 00:19:01.782 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@131 -- # check_status true true true true true true 00:19:01.782 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current true 00:19:01.782 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:01.782 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:19:02.041 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:02.041 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current true 00:19:02.041 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:02.041 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:19:02.299 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:02.299 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:19:02.300 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:02.300 15:24:03 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:19:02.300 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:02.300 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:19:02.300 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:02.300 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:19:02.559 15:24:04 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:02.559 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:19:02.559 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:02.559 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:19:02.818 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:02.818 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible true 00:19:02.818 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:02.818 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:19:03.077 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:03.077 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@133 -- # set_ANA_state non_optimized inaccessible 00:19:03.077 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@59 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4420 -n non_optimized 00:19:03.336 15:24:04 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@60 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_subsystem_listener_set_ana_state nqn.2016-06.io.spdk:cnode1 -t rdma -a 10.0.0.2 -s 4421 -n inaccessible 00:19:03.595 15:24:05 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@134 -- # sleep 1 00:19:04.537 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@135 -- # check_status true false true true true false 00:19:04.537 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@68 -- # port_status 4420 current true 00:19:04.537 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:04.537 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").current' 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@69 -- # port_status 4421 current false 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r 
'.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").current' 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@70 -- # port_status 4420 connected true 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:04.798 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").connected' 00:19:05.117 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:05.117 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@71 -- # port_status 4421 connected true 00:19:05.117 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").connected' 00:19:05.117 15:24:06 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:05.376 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:05.376 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@72 -- # port_status 4420 accessible true 00:19:05.376 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:05.376 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4420").accessible' 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ true == \t\r\u\e ]] 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@73 -- # port_status 4421 accessible false 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bdevperf.sock bdev_nvme_get_io_paths 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # jq -r '.poll_groups[].io_paths[] | select (.transport.trsvcid=="4421").accessible' 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@64 -- # [[ false == \f\a\l\s\e ]] 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@137 -- # killprocess 1839128 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@950 -- # '[' -z 1839128 ']' 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@954 -- # kill -0 1839128 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@955 -- # uname 00:19:05.634 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:05.634 15:24:07 
nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1839128
00:19:05.893 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@956 -- # process_name=reactor_2
00:19:05.893 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']'
00:19:05.893 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1839128' killing process with pid 1839128
00:19:05.893 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@969 -- # kill 1839128
00:19:05.893 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@974 -- # wait 1839128
00:19:05.893 {
00:19:05.893 "results": [
00:19:05.893 {
00:19:05.893 "job": "Nvme0n1",
00:19:05.893 "core_mask": "0x4",
00:19:05.893 "workload": "verify",
00:19:05.893 "status": "terminated",
00:19:05.893 "verify_range": {
00:19:05.893 "start": 0,
00:19:05.893 "length": 16384
00:19:05.893 },
00:19:05.893 "queue_depth": 128,
00:19:05.893 "io_size": 4096,
00:19:05.893 "runtime": 29.06667,
00:19:05.893 "iops": 15862.222951579937,
00:19:05.893 "mibps": 61.96180840460913,
00:19:05.893 "io_failed": 0,
00:19:05.893 "io_timeout": 0,
00:19:05.893 "avg_latency_us": 8050.12099489779,
00:19:05.893 "min_latency_us": 53.87130434782609,
00:19:05.893 "max_latency_us": 3019898.88
00:19:05.893 }
00:19:05.893 ],
00:19:05.893 "core_count": 1
00:19:05.893 }
00:19:06.156 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@139 -- # wait 1839128
00:19:06.156 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@141 -- # cat /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt
00:19:06.156 [2024-09-27 15:23:36.689319] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization...
00:19:06.156 [2024-09-27 15:23:36.689400] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x4 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1839128 ]
00:19:06.156 [2024-09-27 15:23:36.771769] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1
00:19:06.156 [2024-09-27 15:23:36.860269] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2
00:19:06.156 [2024-09-27 15:23:38.266977] bdev_nvme.c:5605:nvme_bdev_ctrlr_create: *WARNING*: multipath_config: deprecated feature bdev_nvme_attach_controller.multipath configuration mismatch to be removed in v25.01
00:19:06.156 Running I/O for 90 seconds... 
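The trace above is driven by three small helpers in host/multipath_status.sh: set_ANA_state (@59-@60) re-labels the 4420 and 4421 RDMA listeners through nvmf_subsystem_listener_set_ana_state, port_status (@64) queries the bdevperf RPC socket with bdev_nvme_get_io_paths and pulls a single field for one trsvcid out with jq, and check_status (@68-@73) asserts the expected current/connected/accessible values for both ports after each ANA change and the active_active policy set at @116. A minimal sketch of what those helpers appear to do, reconstructed from the traced commands rather than copied from the SPDK tree (local variable names are assumptions):

# Sketch only: inferred from the xtrace output, not the actual multipath_status.sh.
rpc_py=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
bdevperf_rpc_sock=/var/tmp/bdevperf.sock
NQN=nqn.2016-06.io.spdk:cnode1

set_ANA_state() {
    # $1 = ANA state for the 4420 listener, $2 = ANA state for the 4421 listener
    $rpc_py nvmf_subsystem_listener_set_ana_state $NQN -t rdma -a 10.0.0.2 -s 4420 -n "$1"
    $rpc_py nvmf_subsystem_listener_set_ana_state $NQN -t rdma -a 10.0.0.2 -s 4421 -n "$2"
}

port_status() {
    # $1 = trsvcid, $2 = io_path field (current|connected|accessible), $3 = expected value
    local status
    status=$($rpc_py -s $bdevperf_rpc_sock bdev_nvme_get_io_paths |
        jq -r ".poll_groups[].io_paths[] | select(.transport.trsvcid==\"$1\").$2")
    [[ "$status" == "$3" ]]   # non-zero exit on mismatch, which the harness treats as a failure
}

check_status() {
    # $1..$6 = expected 4420.current 4421.current 4420.connected 4421.connected
    #          4420.accessible 4421.accessible
    port_status 4420 current "$1"
    port_status 4421 current "$2"
    port_status 4420 connected "$3"
    port_status 4421 connected "$4"
    port_status 4420 accessible "$5"
    port_status 4421 accessible "$6"
}

With the multipath policy set to active_active, both optimized paths report current=true (check_status true true ...); once 4420 is demoted to non_optimized while 4421 stays optimized, only 4421 remains current (check_status false true ...); with both non_optimized they are both current again; and setting 4421 inaccessible flips its accessible field to false (check_status true false true true true false), which is exactly the sequence of @119, @123, @129 and @133 above.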
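killprocess, called at multipath_status.sh@137 with the bdevperf pid, comes from common/autotest_common.sh; the trace shows it validating the pid, probing it with kill -0, resolving its comm name (reactor_2 in this run), then killing and waiting for it so the exit status and the result JSON above are collected. A rough sketch of that flow as inferred from the trace (the real helper has additional branches, e.g. for processes launched through sudo, that are not reproduced here):

killprocess() {
    local pid=$1 process_name=
    [ -z "$pid" ] && return 1                           # @950: require a pid argument
    kill -0 "$pid" || return 0                          # @954: nothing to do if it is already gone
    if [ "$(uname)" = Linux ]; then                     # @955
        process_name=$(ps --no-headers -o comm= "$pid") # @956: reactor_2 here
    fi
    if [ "$process_name" = sudo ]; then
        :   # @960: a sudo-wrapped target is handled differently (omitted in this sketch)
    fi
    echo "killing process with pid $pid"                # @968
    kill "$pid"                                         # @969
    wait "$pid"                                         # @974: reap the child and keep its status
}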
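The summary block emitted when the verify job is terminated is internally consistent: with "io_size": 4096, the reported mibps is simply iops scaled by the I/O size. A quick check of that arithmetic against the values printed above:

awk 'BEGIN { printf "%.11f MiB/s\n", 15862.222951579937 * 4096 / (1024 * 1024) }'
# prints 61.96180840461 MiB/s, in line with "mibps": 61.96180840460913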
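The try.txt dump that follows interleaves bdevperf's periodic IOPS samples with the NVMe completions it observed while the listeners were being toggled; the READ/WRITE commands completing with ASYMMETRIC ACCESS INACCESSIBLE (03/02) appear to line up with windows in which one of the listeners had been set inaccessible, and the IOPS samples dip from roughly 18.2K to the 15-16K range as the initiator fails over. One way to pull a quick count of those completions out of the dump (path as printed by the cat command above):

grep -c 'ASYMMETRIC ACCESS INACCESSIBLE' /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt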
00:19:06.156 18176.00 IOPS, 71.00 MiB/s 18209.50 IOPS, 71.13 MiB/s 18218.67 IOPS, 71.17 MiB/s 18265.00 IOPS, 71.35 MiB/s 18267.20 IOPS, 71.36 MiB/s 18344.67 IOPS, 71.66 MiB/s 18360.43 IOPS, 71.72 MiB/s 18365.88 IOPS, 71.74 MiB/s 18362.67 IOPS, 71.73 MiB/s 18356.70 IOPS, 71.71 MiB/s 18362.73 IOPS, 71.73 MiB/s 18353.17 IOPS, 71.69 MiB/s [2024-09-27 15:23:51.221832] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:66 nsid:1 lba:43376 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007532000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.221879] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:66 cdw0:0 sqhd:002b p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.221936] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:8 nsid:1 lba:43384 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007534000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.221948] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:8 cdw0:0 sqhd:002c p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.221962] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:43392 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ba000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.221972] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:33 cdw0:0 sqhd:002d p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.221984] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:17 nsid:1 lba:43400 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075bc000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.221994] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:17 cdw0:0 sqhd:002e p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222006] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:71 nsid:1 lba:43408 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075be000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222015] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:71 cdw0:0 sqhd:002f p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222027] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:35 nsid:1 lba:43416 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c0000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222036] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:35 cdw0:0 sqhd:0030 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222048] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:98 nsid:1 lba:43424 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c2000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222057] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:98 cdw0:0 sqhd:0031 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222069] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:85 nsid:1 lba:43432 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c4000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222078] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:85 cdw0:0 sqhd:0032 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222097] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 
cid:116 nsid:1 lba:43440 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c6000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222107] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:116 cdw0:0 sqhd:0033 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222119] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:82 nsid:1 lba:43448 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c8000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222128] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:82 cdw0:0 sqhd:0034 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222141] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:43456 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ca000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222150] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:43 cdw0:0 sqhd:0035 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222162] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:16 nsid:1 lba:43464 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075cc000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222171] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:16 cdw0:0 sqhd:0036 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222184] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:5 nsid:1 lba:43472 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ce000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222193] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:5 cdw0:0 sqhd:0037 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222205] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:65 nsid:1 lba:43480 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d0000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222214] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:65 cdw0:0 sqhd:0038 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222227] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:111 nsid:1 lba:43488 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d2000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222236] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:111 cdw0:0 sqhd:0039 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222248] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:81 nsid:1 lba:43496 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d4000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222257] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:81 cdw0:0 sqhd:003a p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222268] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:57 nsid:1 lba:43504 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d6000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222277] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:57 cdw0:0 sqhd:003b p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222289] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:46 nsid:1 lba:43512 len:8 SGL KEYED 
DATA BLOCK ADDRESS 0x200007530000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222299] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:46 cdw0:0 sqhd:003c p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222311] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:86 nsid:1 lba:43520 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075da000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222320] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:86 cdw0:0 sqhd:003d p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222332] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:89 nsid:1 lba:43528 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075dc000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222349] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:89 cdw0:0 sqhd:003e p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222361] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:121 nsid:1 lba:43536 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075de000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222370] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:121 cdw0:0 sqhd:003f p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222382] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:96 nsid:1 lba:43544 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e0000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222392] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:96 cdw0:0 sqhd:0040 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222404] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:113 nsid:1 lba:43552 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e2000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222413] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:113 cdw0:0 sqhd:0041 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222425] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:67 nsid:1 lba:43560 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e4000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222434] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:67 cdw0:0 sqhd:0042 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222446] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:53 nsid:1 lba:43568 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e6000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222455] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:53 cdw0:0 sqhd:0043 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222467] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:50 nsid:1 lba:43576 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e8000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222476] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:50 cdw0:0 sqhd:0044 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222488] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:105 nsid:1 lba:43584 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ea000 
len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222498] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:105 cdw0:0 sqhd:0045 p:0 m:0 dnr:0 00:19:06.156 [2024-09-27 15:23:51.222509] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:69 nsid:1 lba:43592 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ec000 len:0x1000 key:0x1bfc00 00:19:06.156 [2024-09-27 15:23:51.222518] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:69 cdw0:0 sqhd:0046 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222530] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:91 nsid:1 lba:43600 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ee000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222541] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:91 cdw0:0 sqhd:0047 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222553] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:87 nsid:1 lba:43608 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f0000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222562] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:87 cdw0:0 sqhd:0048 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222574] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:90 nsid:1 lba:43616 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757a000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222584] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:90 cdw0:0 sqhd:0049 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222596] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:101 nsid:1 lba:44080 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.157 [2024-09-27 15:23:51.222605] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:101 cdw0:0 sqhd:004a p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222617] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:43624 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d8000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222627] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:59 cdw0:0 sqhd:004b p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222639] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:123 nsid:1 lba:44088 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.157 [2024-09-27 15:23:51.222648] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:123 cdw0:0 sqhd:004c p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222660] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:99 nsid:1 lba:43632 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f2000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222670] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:99 cdw0:0 sqhd:004d p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222682] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:9 nsid:1 lba:43640 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757c000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222691] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:9 cdw0:0 sqhd:004e p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222702] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:115 nsid:1 lba:43648 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757e000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222712] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:115 cdw0:0 sqhd:004f p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222724] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:43656 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007580000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222733] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:39 cdw0:0 sqhd:0050 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222745] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:25 nsid:1 lba:43664 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007582000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222754] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:25 cdw0:0 sqhd:0051 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222766] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:41 nsid:1 lba:43672 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007584000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222775] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:41 cdw0:0 sqhd:0052 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222787] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:83 nsid:1 lba:43680 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007586000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222796] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:83 cdw0:0 sqhd:0053 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222808] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:6 nsid:1 lba:43688 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007588000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222817] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:6 cdw0:0 sqhd:0054 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222831] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:92 nsid:1 lba:43696 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758a000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222840] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:92 cdw0:0 sqhd:0055 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222852] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:108 nsid:1 lba:43704 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758c000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222861] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:108 cdw0:0 sqhd:0056 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222873] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:49 nsid:1 lba:43712 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758e000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222882] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:49 cdw0:0 sqhd:0057 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222894] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:74 nsid:1 lba:43720 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007590000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222904] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:74 cdw0:0 sqhd:0058 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222916] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:31 nsid:1 lba:43728 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007592000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222925] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:31 cdw0:0 sqhd:0059 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222937] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:14 nsid:1 lba:43736 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007594000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222946] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:14 cdw0:0 sqhd:005a p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222958] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:94 nsid:1 lba:43744 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007596000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222967] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:94 cdw0:0 sqhd:005b p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.222979] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:64 nsid:1 lba:43752 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007598000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.222988] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:64 cdw0:0 sqhd:005c p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223000] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:120 nsid:1 lba:43760 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759a000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223009] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:120 cdw0:0 sqhd:005d p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223021] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:110 nsid:1 lba:43768 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759c000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223030] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:110 cdw0:0 sqhd:005e p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223041] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:1 lba:43776 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000750e000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223050] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:0 cdw0:0 sqhd:005f p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223062] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:63 nsid:1 lba:43784 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000750c000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223073] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) 
qid:1 cid:63 cdw0:0 sqhd:0060 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223085] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:107 nsid:1 lba:43792 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000750a000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223095] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:107 cdw0:0 sqhd:0061 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223107] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:125 nsid:1 lba:43800 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007508000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223116] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:125 cdw0:0 sqhd:0062 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223128] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:122 nsid:1 lba:43808 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007506000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223137] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:122 cdw0:0 sqhd:0063 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223149] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:7 nsid:1 lba:43816 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007504000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223158] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:7 cdw0:0 sqhd:0064 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223170] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:19 nsid:1 lba:43824 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075aa000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223179] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:19 cdw0:0 sqhd:0065 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223191] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:119 nsid:1 lba:43832 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ac000 len:0x1000 key:0x1bfc00 00:19:06.157 [2024-09-27 15:23:51.223200] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:119 cdw0:0 sqhd:0066 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223211] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:28 nsid:1 lba:44096 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.157 [2024-09-27 15:23:51.223221] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:28 cdw0:0 sqhd:0067 p:0 m:0 dnr:0 00:19:06.157 [2024-09-27 15:23:51.223232] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:80 nsid:1 lba:44104 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223242] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:80 cdw0:0 sqhd:0068 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223254] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:21 nsid:1 lba:44112 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223263] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:21 cdw0:0 sqhd:0069 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223275] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:32 nsid:1 lba:44120 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223284] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:32 cdw0:0 sqhd:006a p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223296] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:18 nsid:1 lba:44128 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223305] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:18 cdw0:0 sqhd:006b p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223318] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:78 nsid:1 lba:44136 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223327] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:78 cdw0:0 sqhd:006c p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223346] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:54 nsid:1 lba:44144 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223356] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:54 cdw0:0 sqhd:006d p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223367] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:93 nsid:1 lba:44152 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.158 [2024-09-27 15:23:51.223376] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:93 cdw0:0 sqhd:006e p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223388] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:104 nsid:1 lba:43840 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b8000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223398] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:104 cdw0:0 sqhd:006f p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223410] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:77 nsid:1 lba:43848 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b6000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223419] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:77 cdw0:0 sqhd:0070 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223431] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:1 lba:43856 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b4000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223440] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:2 cdw0:0 sqhd:0071 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223452] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:118 nsid:1 lba:43864 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b2000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223461] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:118 cdw0:0 sqhd:0072 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223473] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:11 nsid:1 lba:43872 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075b0000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 
15:23:51.223482] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:11 cdw0:0 sqhd:0073 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223494] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:97 nsid:1 lba:43880 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ae000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223504] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:97 cdw0:0 sqhd:0074 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223515] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:20 nsid:1 lba:43888 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007502000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223525] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:20 cdw0:0 sqhd:0075 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223537] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:126 nsid:1 lba:43896 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007500000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223547] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:126 cdw0:0 sqhd:0076 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223558] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:1 lba:43904 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a8000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223570] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:1 cdw0:0 sqhd:0077 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223582] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:45 nsid:1 lba:43912 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a6000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223591] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:45 cdw0:0 sqhd:0078 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223603] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:106 nsid:1 lba:43920 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a4000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223612] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:106 cdw0:0 sqhd:0079 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223624] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:56 nsid:1 lba:43928 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a2000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223633] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:56 cdw0:0 sqhd:007a p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223644] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:60 nsid:1 lba:43936 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a0000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223654] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:60 cdw0:0 sqhd:007b p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223666] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:13 nsid:1 lba:43944 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759e000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223676] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:13 cdw0:0 sqhd:007c p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223688] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:12 nsid:1 lba:43952 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007510000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223697] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:12 cdw0:0 sqhd:007d p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223709] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:48 nsid:1 lba:43960 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007512000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223718] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:48 cdw0:0 sqhd:007e p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223730] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:15 nsid:1 lba:43968 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007514000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223739] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:15 cdw0:0 sqhd:007f p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223751] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:1 lba:43976 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007516000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223760] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:4 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223772] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:22 nsid:1 lba:43984 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007518000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223782] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:22 cdw0:0 sqhd:0001 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223794] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:43992 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000751a000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223808] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:29 cdw0:0 sqhd:0002 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223820] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:30 nsid:1 lba:44000 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000751c000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223829] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:30 cdw0:0 sqhd:0003 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223841] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:51 nsid:1 lba:44008 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000751e000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223851] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:51 cdw0:0 sqhd:0004 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223863] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:72 nsid:1 lba:44016 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007520000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223872] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:72 cdw0:0 sqhd:0005 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223884] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:44 nsid:1 lba:44024 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007522000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223894] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:44 cdw0:0 sqhd:0006 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223906] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:34 nsid:1 lba:44032 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007524000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223919] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:34 cdw0:0 sqhd:0007 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223931] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:40 nsid:1 lba:44040 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007526000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223940] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:40 cdw0:0 sqhd:0008 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223952] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:117 nsid:1 lba:44048 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007528000 len:0x1000 key:0x1bfc00 00:19:06.158 [2024-09-27 15:23:51.223961] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:117 cdw0:0 sqhd:0009 p:0 m:0 dnr:0 00:19:06.158 [2024-09-27 15:23:51.223974] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:95 nsid:1 lba:44056 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000752a000 len:0x1000 key:0x1bfc00 00:19:06.159 [2024-09-27 15:23:51.223983] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:95 cdw0:0 sqhd:000a p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.223995] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:70 nsid:1 lba:44064 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000752c000 len:0x1000 key:0x1bfc00 00:19:06.159 [2024-09-27 15:23:51.224004] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:70 cdw0:0 sqhd:000b p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224298] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:70 nsid:1 lba:44072 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000752e000 len:0x1000 key:0x1bfc00 00:19:06.159 [2024-09-27 15:23:51.224310] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:70 cdw0:0 sqhd:000c p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224329] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:95 nsid:1 lba:44160 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224338] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:95 cdw0:0 sqhd:000d p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224701] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:117 nsid:1 lba:44168 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224712] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:117 cdw0:0 sqhd:000e p:0 m:0 dnr:0 00:19:06.159 
[2024-09-27 15:23:51.224730] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:40 nsid:1 lba:44176 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224740] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:40 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224757] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:34 nsid:1 lba:44184 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224766] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:34 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224783] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:44 nsid:1 lba:44192 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224793] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:44 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224809] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:72 nsid:1 lba:44200 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224818] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:72 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224835] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:51 nsid:1 lba:44208 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224845] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:51 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224862] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:30 nsid:1 lba:44216 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224871] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:30 cdw0:0 sqhd:0014 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224888] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:29 nsid:1 lba:44224 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224897] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:29 cdw0:0 sqhd:0015 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224914] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:22 nsid:1 lba:44232 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224923] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:22 cdw0:0 sqhd:0016 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224940] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:44240 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224951] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:4 cdw0:0 sqhd:0017 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224968] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:15 nsid:1 lba:44248 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.224977] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:15 
cdw0:0 sqhd:0018 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.224994] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:48 nsid:1 lba:44256 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225003] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:48 cdw0:0 sqhd:0019 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225021] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:12 nsid:1 lba:44264 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225032] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:12 cdw0:0 sqhd:001a p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225048] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:13 nsid:1 lba:44272 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225057] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:13 cdw0:0 sqhd:001b p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225074] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:60 nsid:1 lba:44280 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225083] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:60 cdw0:0 sqhd:001c p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225100] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:56 nsid:1 lba:44288 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225109] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:56 cdw0:0 sqhd:001d p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225125] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:106 nsid:1 lba:44296 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225134] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:106 cdw0:0 sqhd:001e p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225151] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:45 nsid:1 lba:44304 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225160] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:45 cdw0:0 sqhd:001f p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225177] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:44312 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225185] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:1 cdw0:0 sqhd:0020 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225202] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:126 nsid:1 lba:44320 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225211] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:126 cdw0:0 sqhd:0021 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225228] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:20 nsid:1 lba:44328 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225237] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: 
ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:20 cdw0:0 sqhd:0022 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225253] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:97 nsid:1 lba:44336 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225262] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:97 cdw0:0 sqhd:0023 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225279] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:44344 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225288] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:11 cdw0:0 sqhd:0024 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225304] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:118 nsid:1 lba:44352 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225313] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:118 cdw0:0 sqhd:0025 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225330] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:1 lba:44360 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225348] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:2 cdw0:0 sqhd:0026 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225365] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:77 nsid:1 lba:44368 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225376] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:77 cdw0:0 sqhd:0027 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225393] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:104 nsid:1 lba:44376 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225402] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:104 cdw0:0 sqhd:0028 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225419] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:93 nsid:1 lba:44384 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225428] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:93 cdw0:0 sqhd:0029 p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:23:51.225445] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:54 nsid:1 lba:44392 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 15:23:51.225454] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:54 cdw0:0 sqhd:002a p:0 m:0 dnr:0 00:19:06.159 18061.38 IOPS, 70.55 MiB/s 16771.29 IOPS, 65.51 MiB/s 15653.20 IOPS, 61.15 MiB/s 14919.19 IOPS, 58.28 MiB/s 15131.18 IOPS, 59.11 MiB/s 15317.72 IOPS, 59.83 MiB/s 15379.84 IOPS, 60.08 MiB/s 15368.70 IOPS, 60.03 MiB/s 15360.00 IOPS, 60.00 MiB/s 15483.77 IOPS, 60.48 MiB/s 15611.35 IOPS, 60.98 MiB/s 15733.08 IOPS, 61.46 MiB/s 15703.00 IOPS, 61.34 MiB/s 15675.54 IOPS, 61.23 MiB/s [2024-09-27 15:24:05.167500] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:14 nsid:1 lba:66056 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.159 [2024-09-27 
15:24:05.167544] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:14 cdw0:0 sqhd:004f p:0 m:0 dnr:0 00:19:06.159 [2024-09-27 15:24:05.167579] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:119 nsid:1 lba:66072 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167590] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:119 cdw0:0 sqhd:0050 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167603] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:92 nsid:1 lba:66088 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167612] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:92 cdw0:0 sqhd:0051 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167624] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:82 nsid:1 lba:66096 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167634] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:82 cdw0:0 sqhd:0052 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167645] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:72 nsid:1 lba:66104 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167655] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:72 cdw0:0 sqhd:0053 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167666] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:100 nsid:1 lba:66120 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167676] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:100 cdw0:0 sqhd:0054 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167688] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:5 nsid:1 lba:65600 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075a2000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.167704] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:5 cdw0:0 sqhd:0055 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167716] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:75 nsid:1 lba:65624 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075c2000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.167725] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:75 cdw0:0 sqhd:0056 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167737] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:74 nsid:1 lba:66136 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167747] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:74 cdw0:0 sqhd:0057 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167759] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:114 nsid:1 lba:65656 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000758a000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.167768] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:114 cdw0:0 sqhd:0058 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167780] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:55 nsid:1 lba:66144 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.167789] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:55 cdw0:0 sqhd:0059 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.167802] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:105 nsid:1 lba:65704 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075cc000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.167811] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:105 cdw0:0 sqhd:005a p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168202] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:76 nsid:1 lba:66160 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168214] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:76 cdw0:0 sqhd:005b p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168227] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:118 nsid:1 lba:66176 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168237] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:118 cdw0:0 sqhd:005c p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168249] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:58 nsid:1 lba:66184 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168258] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:58 cdw0:0 sqhd:005d p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168269] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:80 nsid:1 lba:66200 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168279] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:80 cdw0:0 sqhd:005e p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168291] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:33 nsid:1 lba:65512 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075d0000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168300] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:33 cdw0:0 sqhd:005f p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168313] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:102 nsid:1 lba:66216 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168322] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:102 cdw0:0 sqhd:0060 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168336] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:103 nsid:1 lba:65528 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007584000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168352] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:103 cdw0:0 sqhd:0061 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168363] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:46 nsid:1 lba:66240 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168372] nvme_qpair.c: 474:spdk_nvme_print_completion: 
*NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:46 cdw0:0 sqhd:0062 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168385] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:62 nsid:1 lba:65560 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075aa000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168395] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:62 cdw0:0 sqhd:0063 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168407] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:93 nsid:1 lba:65576 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000759c000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168416] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:93 cdw0:0 sqhd:0064 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168428] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:110 nsid:1 lba:66248 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168437] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:110 cdw0:0 sqhd:0065 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168449] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:42 nsid:1 lba:66256 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168458] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:42 cdw0:0 sqhd:0066 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168470] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:84 nsid:1 lba:65632 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007572000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168479] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:84 cdw0:0 sqhd:0067 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168490] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:107 nsid:1 lba:66272 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168499] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:107 cdw0:0 sqhd:0068 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168511] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:39 nsid:1 lba:65680 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007522000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168521] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:39 cdw0:0 sqhd:0069 p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168533] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:28 nsid:1 lba:65696 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007508000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168542] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:28 cdw0:0 sqhd:006a p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168554] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:77 nsid:1 lba:66288 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.160 [2024-09-27 15:24:05.168563] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:77 cdw0:0 sqhd:006b p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168575] nvme_qpair.c: 
243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:104 nsid:1 lba:65720 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e2000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168584] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:104 cdw0:0 sqhd:006c p:0 m:0 dnr:0 00:19:06.160 [2024-09-27 15:24:05.168597] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:17 nsid:1 lba:65736 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000750c000 len:0x1000 key:0x1bfc00 00:19:06.160 [2024-09-27 15:24:05.168607] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:17 cdw0:0 sqhd:006d p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168618] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:57 nsid:1 lba:66312 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168627] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:57 cdw0:0 sqhd:006e p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168639] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:113 nsid:1 lba:65752 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e6000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168648] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:113 cdw0:0 sqhd:006f p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168660] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:1 lba:66328 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168669] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:4 cdw0:0 sqhd:0070 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168681] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:67 nsid:1 lba:65800 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075e8000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168690] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:67 cdw0:0 sqhd:0071 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168701] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:73 nsid:1 lba:66336 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168710] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:73 cdw0:0 sqhd:0072 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168722] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:121 nsid:1 lba:65832 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000751c000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168731] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:121 cdw0:0 sqhd:0073 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168743] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:125 nsid:1 lba:65848 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007516000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168752] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:125 cdw0:0 sqhd:0074 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168764] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:9 nsid:1 lba:66344 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:19:06.161 [2024-09-27 15:24:05.168773] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:9 cdw0:0 sqhd:0075 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168785] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:16 nsid:1 lba:65896 len:8 SGL KEYED DATA BLOCK ADDRESS 0x20000757e000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168794] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:16 cdw0:0 sqhd:0076 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168805] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:109 nsid:1 lba:66352 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168814] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:109 cdw0:0 sqhd:0077 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168826] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:50 nsid:1 lba:66360 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168837] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:50 cdw0:0 sqhd:0078 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168849] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:11 nsid:1 lba:66368 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168858] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:11 cdw0:0 sqhd:0079 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168870] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:21 nsid:1 lba:66384 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168879] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:21 cdw0:0 sqhd:007a p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168892] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:59 nsid:1 lba:65968 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ee000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168901] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:59 cdw0:0 sqhd:007b p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168913] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:38 nsid:1 lba:66000 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075bc000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168922] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:38 cdw0:0 sqhd:007c p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168934] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:70 nsid:1 lba:66400 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.168942] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:70 cdw0:0 sqhd:007d p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168954] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:108 nsid:1 lba:66032 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ec000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168964] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:108 cdw0:0 sqhd:007e p:0 m:0 dnr:0 
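Every completion in this stretch reports the same status: the (03/02) pair is Status Code Type 0x3 (path related) and Status Code 0x02, which SPDK prints as ASYMMETRIC ACCESS INACCESSIBLE, meaning the I/O hit a path whose ANA group the multipath test has presumably just switched to the inaccessible state. Rather than reading the notices one by one, a throwaway filter over a saved copy of this console output (the console.log file name is assumed) tallies them per queue:

  grep -o 'ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:[0-9]*' console.log | sort | uniq -c
  # prints one line per qid, prefixed by how many completions reported this path status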
00:19:06.161 [2024-09-27 15:24:05.168976] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:36 nsid:1 lba:65760 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075f0000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.168985] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:36 cdw0:0 sqhd:007f p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.168997] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:122 nsid:1 lba:66416 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169006] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:122 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169017] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:68 nsid:1 lba:66424 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169027] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:68 cdw0:0 sqhd:0001 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169038] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:1 lba:66432 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169047] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:1 cdw0:0 sqhd:0002 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169059] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:117 nsid:1 lba:66448 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169068] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:117 cdw0:0 sqhd:0003 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169079] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:94 nsid:1 lba:66464 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169090] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:94 cdw0:0 sqhd:0004 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169102] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:10 nsid:1 lba:65872 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007564000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.169111] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:10 cdw0:0 sqhd:0005 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169123] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:49 nsid:1 lba:66472 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169132] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:49 cdw0:0 sqhd:0006 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169144] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:89 nsid:1 lba:66480 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169153] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:89 cdw0:0 sqhd:0007 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169165] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:61 nsid:1 lba:65936 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007574000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.169174] 
nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:61 cdw0:0 sqhd:0008 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169185] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:116 nsid:1 lba:66488 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169194] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:116 cdw0:0 sqhd:0009 p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169206] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:1 lba:65960 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007582000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.169215] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:2 cdw0:0 sqhd:000a p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169227] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:43 nsid:1 lba:65976 len:8 SGL KEYED DATA BLOCK ADDRESS 0x2000075ce000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.169236] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:43 cdw0:0 sqhd:000b p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169248] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:26 nsid:1 lba:66512 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169257] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:26 cdw0:0 sqhd:000c p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169269] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:29 nsid:1 lba:66016 len:8 SGL KEYED DATA BLOCK ADDRESS 0x200007570000 len:0x1000 key:0x1bfc00 00:19:06.161 [2024-09-27 15:24:05.169278] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:29 cdw0:0 sqhd:000d p:0 m:0 dnr:0 00:19:06.161 [2024-09-27 15:24:05.169289] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:111 nsid:1 lba:66528 len:8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:19:06.161 [2024-09-27 15:24:05.169299] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ASYMMETRIC ACCESS INACCESSIBLE (03/02) qid:1 cid:111 cdw0:0 sqhd:000e p:0 m:0 dnr:0 00:19:06.161
15682.63 IOPS, 61.26 MiB/s 15776.79 IOPS, 61.63 MiB/s 15866.38 IOPS, 61.98 MiB/s Received shutdown signal, test time was about 29.067300 seconds 00:19:06.161
00:19:06.162                                                           Latency(us)
00:19:06.162 Device Information                                        : runtime(s)       IOPS      MiB/s     Fail/s     TO/s    Average        min        max
00:19:06.162 Job: Nvme0n1 (Core Mask 0x4, workload: verify, depth: 128, IO size: 4096)
00:19:06.162 Verification LBA range: start 0x0 length 0x4000
00:19:06.162     Nvme0n1                                               :      29.07   15862.22      61.96       0.00       0.00    8050.12      53.87 3019898.88
00:19:06.162 ===================================================================================================================
00:19:06.162 Total                                                     :              15862.22      61.96       0.00       0.00    8050.12      53.87 3019898.88
00:19:06.162 [2024-09-27 15:24:07.536742] app.c:1032:log_deprecation_hits: *WARNING*: multipath_config: deprecation 'bdev_nvme_attach_controller.multipath configuration mismatch' scheduled for removal in v25.01 hit 1 times
00:19:06.162 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@143 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py nvmf_delete_subsystem nqn.2016-06.io.spdk:cnode1 00:19:06.162
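The summary rows above are internally consistent: at the 4096-byte I/O size, 15862.22 IOPS works out to roughly 61.96 MiB/s, and with the verify job running at queue depth 128 the average latency should land near depth divided by IOPS. A quick sanity check with plain awk (not part of the test scripts):

  awk 'BEGIN { printf "%.2f MiB/s\n", 15862.22 * 4096 / (1024 * 1024) }'   # 61.96 MiB/s, matching the Total row
  awk 'BEGIN { printf "%.0f us\n", 128 / 15862.22 * 1e6 }'                 # 8069 us, in the same ballpark as the reported 8050.12 us average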
15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@145 -- # trap - SIGINT SIGTERM EXIT 00:19:06.162 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@147 -- # rm -f /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/try.txt 00:19:06.162 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- host/multipath_status.sh@148 -- # nvmftestfini 00:19:06.162 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@331 -- # nvmfcleanup 00:19:06.162 15:24:07 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@99 -- # sync 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@102 -- # set +e 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@103 -- # for i in {1..20} 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:19:06.421 rmmod nvme_rdma 00:19:06.421 rmmod nvme_fabrics 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@106 -- # set -e 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@107 -- # return 0 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@332 -- # '[' -n 1838749 ']' 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@333 -- # killprocess 1838749 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@950 -- # '[' -z 1838749 ']' 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@954 -- # kill -0 1838749 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@955 -- # uname 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1838749 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1838749' 00:19:06.421 killing process with pid 1838749 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@969 -- # kill 1838749 00:19:06.421 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@974 -- # wait 1838749 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@338 -- # nvmf_fini 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@264 -- # local dev 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- 
nvmf/setup.sh@267 -- # remove_target_ns 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@22 -- # _remove_target_ns 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@268 -- # delete_main_bridge 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@130 -- # return 0 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@41 -- # _dev=0 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@41 -- # dev_map=() 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/setup.sh@284 -- # iptr 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@538 -- # iptables-save 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- nvmf/common.sh@538 -- # iptables-restore 00:19:06.681 00:19:06.681 real 0m41.260s 00:19:06.681 user 1m57.639s 00:19:06.681 sys 
0m9.748s 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_host_multipath_status -- common/autotest_common.sh@10 -- # set +x 00:19:06.681 ************************************ 00:19:06.681 END TEST nvmf_host_multipath_status 00:19:06.681 ************************************ 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@25 -- # run_test nvmf_discovery_remove_ifc /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/discovery_remove_ifc.sh --transport=rdma 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:19:06.681 ************************************ 00:19:06.681 START TEST nvmf_discovery_remove_ifc 00:19:06.681 ************************************ 00:19:06.681 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/discovery_remove_ifc.sh --transport=rdma 00:19:06.941 * Looking for test storage... 00:19:06.941 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:19:06.941 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:19:06.941 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1681 -- # lcov --version 00:19:06.941 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:19:06.941 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:19:06.941 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:19:06.941 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@336 -- # IFS=.-: 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@336 -- # read -ra ver1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@337 -- # IFS=.-: 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@337 -- # read -ra ver2 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@338 -- # local 'op=<' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@340 -- # ver1_l=2 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@341 -- # ver2_l=1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@344 -- # case "$op" in 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@345 -- # : 1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@364 -- # (( v = 0 )) 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@365 -- # decimal 1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@353 -- # local d=1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@355 -- # echo 1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@365 -- # ver1[v]=1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@366 -- # decimal 2 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@353 -- # local d=2 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@355 -- # echo 2 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@366 -- # ver2[v]=2 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@368 -- # return 0 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:19:06.942 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:06.942 --rc genhtml_branch_coverage=1 00:19:06.942 --rc genhtml_function_coverage=1 00:19:06.942 --rc genhtml_legend=1 00:19:06.942 --rc geninfo_all_blocks=1 00:19:06.942 --rc geninfo_unexecuted_blocks=1 00:19:06.942 00:19:06.942 ' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:19:06.942 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:06.942 --rc genhtml_branch_coverage=1 00:19:06.942 --rc genhtml_function_coverage=1 00:19:06.942 --rc genhtml_legend=1 00:19:06.942 --rc geninfo_all_blocks=1 00:19:06.942 --rc geninfo_unexecuted_blocks=1 00:19:06.942 00:19:06.942 ' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:19:06.942 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:06.942 --rc genhtml_branch_coverage=1 00:19:06.942 --rc genhtml_function_coverage=1 00:19:06.942 --rc genhtml_legend=1 00:19:06.942 --rc geninfo_all_blocks=1 00:19:06.942 --rc geninfo_unexecuted_blocks=1 00:19:06.942 00:19:06.942 ' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:19:06.942 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:06.942 --rc genhtml_branch_coverage=1 00:19:06.942 --rc genhtml_function_coverage=1 00:19:06.942 --rc genhtml_legend=1 00:19:06.942 --rc geninfo_all_blocks=1 00:19:06.942 --rc geninfo_unexecuted_blocks=1 00:19:06.942 00:19:06.942 ' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@12 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 
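The lcov check traced above (lt 1.15 2 feeding cmp_versions) splits each version string on '.', '-' and ':' and compares the fields numerically from left to right. A self-contained sketch of that idea, assuming purely numeric fields; this is an illustration, not the scripts/common.sh helper itself:

  version_lt() {                         # returns 0 (true) when $1 sorts before $2
      local IFS=.-:
      local -a a=($1) b=($2)
      local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
      for ((i = 0; i < n; i++)); do
          local x=${a[i]:-0} y=${b[i]:-0}
          (( x < y )) && return 0
          (( x > y )) && return 1
      done
      return 1                           # equal versions are not less-than
  }
  version_lt 1.15 2 && echo 'lcov 1.15 predates 2'   # prints the message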
00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@7 -- # uname -s 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@15 -- # shopt -s extglob 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- paths/export.sh@5 -- # export PATH 00:19:06.942 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@50 -- # : 0 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:19:06.943 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 
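The "[: : integer expression expected" message just above is not a test failure: line 31 of common.sh expands to '[' '' -eq 1 ']' because the variable being tested is still empty, the [ builtin refuses to compare an empty string numerically, prints that complaint, and returns non-zero, so the script simply falls through to the next check. A guarded variant of the same numeric test avoids the noise (illustrative only, with a made-up variable name, not a proposed change to common.sh):

  SPDK_TEST_EXAMPLE=""                                      # hypothetical flag that was never set
  [ "${SPDK_TEST_EXAMPLE:-0}" -eq 1 ] && echo 'feature on'  # defaulting to 0 keeps the comparison numeric and quiet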
00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@54 -- # have_pci_nics=0 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@14 -- # discovery_port=8009 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@15 -- # discovery_nqn=nqn.2014-08.org.nvmexpress.discovery 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@18 -- # nqn=nqn.2016-06.io.spdk:cnode 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@20 -- # host_nqn=nqn.2021-12.io.spdk:test 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@21 -- # host_sock=/tmp/host.sock 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@34 -- # nvmftestinit 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@292 -- # prepare_net_devs 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@254 -- # local -g is_hw=no 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@256 -- # remove_target_ns 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@22 -- # _remove_target_ns 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@125 -- # xtrace_disable 00:19:06.943 15:24:08 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@131 -- # pci_devs=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@131 -- # local -a pci_devs 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@132 -- # pci_net_devs=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@133 -- # pci_drivers=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@133 -- # local -A pci_drivers 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@135 -- # net_devs=() 00:19:15.071 15:24:15 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@135 -- # local -ga net_devs 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@136 -- # e810=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@136 -- # local -ga e810 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@137 -- # x722=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@137 -- # local -ga x722 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@138 -- # mlx=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@138 -- # local -ga mlx 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:19:15.071 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:19:15.071 15:24:15 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:19:15.071 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:19:15.071 Found net devices under 0000:18:00.0: mlx_0_0 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@240 -- # echo 'Found net 
devices under 0000:18:00.1: mlx_0_1' 00:19:15.071 Found net devices under 0000:18:00.1: mlx_0_1 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@249 -- # get_rdma_if_list 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@75 -- # rdma_devs=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@89 -- # continue 2 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@89 -- # continue 2 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@258 -- # is_hw=yes 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- 
nvmf/common.sh@264 -- # nvmf_rdma_init 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@61 -- # uname 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@65 -- # modprobe ib_cm 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@66 -- # modprobe ib_core 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@67 -- # modprobe ib_umad 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@69 -- # modprobe iw_cm 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@27 -- # local -gA dev_map 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@28 -- # local -g _dev 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@44 -- # ips=() 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@58 -- # key_initiator=target1 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:19:15.071 15:24:15 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:19:15.071 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@11 -- # local val=167772161 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:19:15.072 10.0.0.1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@11 -- # local val=167772162 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:19:15.072 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:19:15.072 15:24:15 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@38 -- # ping_ips 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@107 -- # local dev=target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@110 -- # echo mlx_0_1 
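[editor note] The trace above shows setup_interfaces assigning 10.0.0.1 and 10.0.0.2 to the two mlx5 ports: the integer from the ip_pool is turned into a dotted quad with printf, written with `ip addr add`, mirrored into the interface's ifalias sysfs node (which later helpers read back), and the links are brought up before the ping check that follows. A minimal standalone sketch of that step, under the assumption that the mlx_0_* netdevs exist and the script runs as root; the helper names mirror the trace but this is not the SPDK implementation itself:

#!/usr/bin/env bash
# Convert a 32-bit integer (e.g. 167772161 == 0x0A000001) to a dotted-quad address.
val_to_ip() {
    local val=$1
    printf '%u.%u.%u.%u\n' \
        $(( (val >> 24) & 0xff )) $(( (val >> 16) & 0xff )) \
        $(( (val >>  8) & 0xff )) $((  val        & 0xff ))
}

# Assign the address and mirror it into ifalias, as the trace does with eval/tee.
set_ip() {
    local dev=$1 addr
    addr=$(val_to_ip "$2")
    ip addr add "$addr/24" dev "$dev"
    echo "$addr" | tee "/sys/class/net/$dev/ifalias"
}

set_ip mlx_0_0 167772161   # 10.0.0.1 (initiator side in this run)
set_ip mlx_0_1 167772162   # 10.0.0.2 (target side in this run)
ip link set mlx_0_0 up
ip link set mlx_0_1 up
ping -c 1 10.0.0.2          # connectivity check, matching the ping_ips step below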
00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:19:15.072 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:19:15.072 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:19:15.072 00:19:15.072 --- 10.0.0.2 ping statistics --- 00:19:15.072 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:19:15.072 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@107 -- # local dev=target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:15.072 15:24:15 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:19:15.072 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:19:15.072 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.034 ms 00:19:15.072 00:19:15.072 --- 10.0.0.2 ping statistics --- 00:19:15.072 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:19:15.072 rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@98 -- # (( pair++ )) 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@266 -- # return 0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@107 -- # local dev=target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:15.072 15:24:15 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # get_net_dev target1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@107 -- # local dev=target1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@107 -- # local dev=target0 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@110 -- # 
echo mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # get_net_dev target1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@107 -- # local dev=target1 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:19:15.072 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:19:15.073 
15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@35 -- # nvmfappstart -m 0x2 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@724 -- # xtrace_disable 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@324 -- # nvmfpid=1846936 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@325 -- # waitforlisten 1846936 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@831 -- # '[' -z 1846936 ']' 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:15.073 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:15.073 15:24:15 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:15.073 [2024-09-27 15:24:15.875526] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:19:15.073 [2024-09-27 15:24:15.875594] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:15.073 [2024-09-27 15:24:15.960259] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:15.073 [2024-09-27 15:24:16.046920] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:19:15.073 [2024-09-27 15:24:16.046963] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:19:15.073 [2024-09-27 15:24:16.046975] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:19:15.073 [2024-09-27 15:24:16.046983] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:19:15.073 [2024-09-27 15:24:16.046990] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
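[editor note] The nvmf_legacy_env block above maps the freshly configured interfaces onto the environment variables the rest of the suite consumes, loads nvme-rdma, and nvmfappstart launches the target application with the core mask from the test. A hedged shell equivalent of that hand-off, with values copied from this trace (paths and masks are specific to this CI host, and the real waitforlisten helper polls more carefully than the single RPC shown here):

# Environment as established by this run
export NVMF_FIRST_INITIATOR_IP=10.0.0.2
export NVMF_SECOND_INITIATOR_IP=10.0.0.1
export NVMF_FIRST_TARGET_IP=10.0.0.2
export NVMF_SECOND_TARGET_IP=10.0.0.1
export NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024'
modprobe nvme-rdma

# nvmfappstart -m 0x2 boils down to roughly:
./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 &
nvmfpid=$!
# wait until the daemon answers on its default RPC socket before issuing commands
./scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null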
00:19:15.073 [2024-09-27 15:24:16.047012] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@864 -- # return 0 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@730 -- # xtrace_disable 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@38 -- # rpc_cmd 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:15.073 [2024-09-27 15:24:16.785802] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_0(0x1e3c5a0/0x1e40a90) succeed. 00:19:15.073 [2024-09-27 15:24:16.794985] rdma.c:2584:create_ib_device: *NOTICE*: Create IB device mlx5_1(0x1e3daa0/0x1e82130) succeed. 00:19:15.073 [2024-09-27 15:24:16.840544] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 8009 *** 00:19:15.073 null0 00:19:15.073 [2024-09-27 15:24:16.872351] rdma.c:3039:nvmf_rdma_listen: *NOTICE*: *** NVMe/RDMA Target Listening on 10.0.0.2 port 4420 *** 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@54 -- # hostpid=1847135 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@53 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -m 0x1 -r /tmp/host.sock --wait-for-rpc -L bdev_nvme 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@55 -- # waitforlisten 1847135 /tmp/host.sock 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@831 -- # '[' -z 1847135 ']' 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@835 -- # local rpc_addr=/tmp/host.sock 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /tmp/host.sock...' 00:19:15.073 Waiting for process to start up and listen on UNIX domain socket /tmp/host.sock... 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:15.073 15:24:16 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:15.332 [2024-09-27 15:24:16.944058] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 
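[editor note] The rpc_cmd batch at discovery_remove_ifc.sh@38 is what produces the two "NVMe/RDMA Target Listening on 10.0.0.2" notices (discovery on 8009, the null0-backed subsystem on 4420). The exact batch lives in the test script and is not echoed here; a hedged reconstruction using standard rpc.py commands and the nqn/null0 names visible in the log would look roughly like:

./scripts/rpc.py nvmf_create_transport -t rdma --num-shared-buffers 1024
./scripts/rpc.py bdev_null_create null0 1000 512
./scripts/rpc.py nvmf_create_subsystem nqn.2016-06.io.spdk:cnode0 -a
./scripts/rpc.py nvmf_subsystem_add_ns nqn.2016-06.io.spdk:cnode0 null0
./scripts/rpc.py nvmf_subsystem_add_listener nqn.2016-06.io.spdk:cnode0 \
    -t rdma -a 10.0.0.2 -s 4420
./scripts/rpc.py nvmf_subsystem_add_listener nqn.2014-08.org.nvmexpress.discovery \
    -t rdma -a 10.0.0.2 -s 8009

The second nvmf_tgt instance started right after (on /tmp/host.sock with --wait-for-rpc -L bdev_nvme) plays the host role for the rest of the test.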
00:19:15.332 [2024-09-27 15:24:16.944115] [ DPDK EAL parameters: nvmf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1847135 ] 00:19:15.332 [2024-09-27 15:24:17.027892] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:15.332 [2024-09-27 15:24:17.114926] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@864 -- # return 0 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@57 -- # trap 'process_shm --id $NVMF_APP_SHM_ID; killprocess $hostpid; nvmftestfini; exit 1' SIGINT SIGTERM EXIT 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@60 -- # rpc_cmd -s /tmp/host.sock bdev_nvme_set_options -e 1 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@61 -- # rpc_cmd -s /tmp/host.sock framework_start_init 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@64 -- # rpc_cmd -s /tmp/host.sock bdev_nvme_start_discovery -b nvme -t rdma -a 10.0.0.2 -s 8009 -f ipv4 -q nqn.2021-12.io.spdk:test --ctrlr-loss-timeout-sec 2 --reconnect-delay-sec 1 --fast-io-fail-timeout-sec 1 --wait-for-attach 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:16.270 15:24:17 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:17.207 [2024-09-27 15:24:18.933671] bdev_nvme.c:7162:discovery_attach_cb: *INFO*: Discovery[10.0.0.2:8009] discovery ctrlr attached 00:19:17.207 [2024-09-27 15:24:18.933695] bdev_nvme.c:7242:discovery_poller: *INFO*: Discovery[10.0.0.2:8009] discovery ctrlr connected 00:19:17.207 [2024-09-27 15:24:18.933708] bdev_nvme.c:7125:get_discovery_log_page: *INFO*: Discovery[10.0.0.2:8009] sent discovery log page command 00:19:17.207 [2024-09-27 15:24:18.936689] bdev_nvme.c:7091:discovery_log_page_cb: *INFO*: Discovery[10.0.0.2:8009] NVM nqn.2016-06.io.spdk:cnode0:10.0.0.2:4420 new subsystem nvme0 00:19:17.207 [2024-09-27 15:24:18.994057] bdev_nvme.c:7952:bdev_nvme_readv: *DEBUG*: read 8 blocks with offset 0 00:19:17.207 [2024-09-27 15:24:18.994099] bdev_nvme.c:7952:bdev_nvme_readv: *DEBUG*: read 1 blocks with offset 0 00:19:17.208 [2024-09-27 15:24:18.994125] bdev_nvme.c:7952:bdev_nvme_readv: *DEBUG*: read 64 blocks with offset 0 00:19:17.208 [2024-09-27 15:24:18.994139] 
bdev_nvme.c:6981:discovery_attach_controller_done: *INFO*: Discovery[10.0.0.2:8009] attach nvme0 done 00:19:17.208 [2024-09-27 15:24:18.994162] bdev_nvme.c:6940:discovery_remove_controllers: *INFO*: Discovery[10.0.0.2:8009] NVM nqn.2016-06.io.spdk:cnode0:10.0.0.2:4420 found again 00:19:17.208 15:24:18 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:17.208 15:24:18 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@67 -- # wait_for_bdev nvme0n1 00:19:17.208 15:24:18 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:17.208 15:24:18 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:17.208 15:24:18 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:17.208 [2024-09-27 15:24:19.013887] bdev_nvme.c:1735:bdev_nvme_disconnected_qpair_cb: *DEBUG*: qpair 0x2000138dbc40 was disconnected and freed. delete nvme_qpair. 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != \n\v\m\e\0\n\1 ]] 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@70 -- # ip addr del 10.0.0.2/24 dev mlx_0_1 00:19:17.208 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@71 -- # ip link set mlx_0_1 down 00:19:17.208 [2024-09-27 15:24:19.051720] rdma.c:3876:nvmf_process_ib_event: *NOTICE*: Async event: GID table change 00:19:17.208 [2024-09-27 15:24:19.051749] rdma.c:3876:nvmf_process_ib_event: *NOTICE*: Async event: GID table change 00:19:17.775 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@74 -- # wait_for_bdev '' 00:19:17.775 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:17.775 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:17.775 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 
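[editor note] The wait_for_bdev/get_bdev_list helpers traced above poll bdev_get_bdevs over the host application's socket once per second until the bdev list matches the expected string, which is how the test detects both the initial attach (nvme0n1) and, after the address is yanked, the eventual removal. A minimal sketch of that pattern, consistent with the jq/sort/xargs pipeline in the trace (the real helpers in host/discovery_remove_ifc.sh also enforce a timeout):

get_bdev_list() {
    ./scripts/rpc.py -s /tmp/host.sock bdev_get_bdevs \
        | jq -r '.[].name' | sort | xargs
}

wait_for_bdev() {
    local expected=$1
    while [[ "$(get_bdev_list)" != "$expected" ]]; do
        sleep 1
    done
}

wait_for_bdev nvme0n1                  # discovery attached nvme0, so the namespace shows up as nvme0n1
ip addr del 10.0.0.2/24 dev mlx_0_1    # remove the target-side address...
ip link set mlx_0_1 down               # ...and take the port down
wait_for_bdev ''                       # the bdev should vanish once the controller-loss timeout expires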
00:19:17.776 15:24:19 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:18.712 15:24:20 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:19.649 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:19.907 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:19.907 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:19.907 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:19.907 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:19.908 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:19.908 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:19.908 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:19.908 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:19.908 15:24:21 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:20.842 15:24:22 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:20.842 15:24:22 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:21.777 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:22.035 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:22.035 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:22.035 15:24:23 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:22.970 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:22.971 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:22.971 15:24:24 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:23.906 [2024-09-27 15:24:25.620167] nvme_rdma.c:2447:nvme_rdma_log_wc_status: *ERROR*: WC error, qid 0, qp state 5, request 0x35184700172624 type 1, status: (12): transport retry counter exceeded 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:23.906 15:24:25 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:23.906 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:24.165 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:24.165 15:24:25 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:25.102 [2024-09-27 15:24:26.632704] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:1 nsid:0 cdw10:00000000 cdw11:00000000 00:19:25.102 [2024-09-27 15:24:26.632732] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32767 cdw0:3eff200 sqhd:10f0 p:0 m:0 dnr:0 00:19:25.102 [2024-09-27 15:24:26.632745] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:19:25.102 [2024-09-27 15:24:26.632754] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32767 cdw0:3eff200 sqhd:10f0 p:0 m:0 dnr:0 00:19:25.102 [2024-09-27 15:24:26.632764] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:3 nsid:0 cdw10:00000000 cdw11:00000000 00:19:25.102 [2024-09-27 15:24:26.632773] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32767 cdw0:3eff200 sqhd:10f0 p:0 m:0 dnr:0 00:19:25.102 [2024-09-27 15:24:26.632783] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:19:25.102 [2024-09-27 15:24:26.632791] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32767 cdw0:3eff200 sqhd:10f0 p:0 m:0 dnr:0 00:19:25.102 [2024-09-27 15:24:26.632801] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: KEEP ALIVE (18) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:19:25.102 [2024-09-27 15:24:26.632810] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:32767 cdw0:3eff200 sqhd:10f0 p:0 m:0 dnr:0 00:19:25.102 [2024-09-27 15:24:26.634946] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:19:25.102 [2024-09-27 15:24:26.634961] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode0] in failed state. 
00:19:25.102 [2024-09-27 15:24:26.635007] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode0] resetting controller 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:25.102 15:24:26 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:26.038 [2024-09-27 15:24:27.637453] nvme_rdma.c:1006:nvme_rdma_addr_resolved: *ERROR*: RDMA address resolution error 00:19:26.038 [2024-09-27 15:24:27.637479] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x2000138e6e00 00:19:26.038 [2024-09-27 15:24:27.637505] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:19:26.038 [2024-09-27 15:24:27.637516] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode0] in failed state. 00:19:26.038 [2024-09-27 15:24:27.637531] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode0] Ctrlr is in error state 00:19:26.038 [2024-09-27 15:24:27.637540] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode0] controller reinitialization failed 00:19:26.038 [2024-09-27 15:24:27.637550] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode0] already in failed state 00:19:26.038 [2024-09-27 15:24:27.637570] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 
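[editor note] What the errors above show: once 10.0.0.2 disappears from mlx_0_1, the data qpair reports "transport retry counter exceeded", the admin queue is aborted with SQ DELETION, and each reconnect attempt fails at RDMA address resolution. Because discovery was started with --reconnect-delay-sec 1 and --ctrlr-loss-timeout-sec 2, the bdev layer gives up after roughly two seconds and deletes nvme0n1, which is exactly what the polling loop is waiting for. While this is in flight, the controller state can be inspected over the same host socket; bdev_nvme_get_controllers is a standard SPDK RPC, though its output layout varies between releases:

./scripts/rpc.py -s /tmp/host.sock bdev_nvme_get_controllers -n nvme0 | jq .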
00:19:26.038 [2024-09-27 15:24:27.637580] nvme_ctrlr.c:1724:nvme_ctrlr_disconnect: *NOTICE*: [nqn.2016-06.io.spdk:cnode0] resetting controller 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme0n1 != '' ]] 00:19:26.038 15:24:27 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:26.974 [2024-09-27 15:24:28.640001] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2016-06.io.spdk:cnode0] in failed state. 00:19:26.974 [2024-09-27 15:24:28.640025] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2016-06.io.spdk:cnode0] Ctrlr is in error state 00:19:26.974 [2024-09-27 15:24:28.640035] nvme_ctrlr.c:1822:spdk_nvme_ctrlr_reconnect_poll_async: *ERROR*: [nqn.2016-06.io.spdk:cnode0] controller reinitialization failed 00:19:26.974 [2024-09-27 15:24:28.640044] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2016-06.io.spdk:cnode0] already in failed state 00:19:26.974 [2024-09-27 15:24:28.640060] bdev_nvme.c:2181:_bdev_nvme_reset_ctrlr_complete: *ERROR*: Resetting controller failed. 00:19:26.974 [2024-09-27 15:24:28.640081] bdev_nvme.c:6913:remove_discovery_entry: *INFO*: Discovery[10.0.0.2:8009] Remove discovery entry: nqn.2016-06.io.spdk:cnode0:10.0.0.2:4420 00:19:26.974 [2024-09-27 15:24:28.640696] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2014-08.org.nvmexpress.discovery] in failed state. 
00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ '' != '' ]] 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@77 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:19:27.233 15:24:28 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@78 -- # ip link set mlx_0_1 up 00:19:27.233 [2024-09-27 15:24:28.926633] rdma.c:3876:nvmf_process_ib_event: *NOTICE*: Async event: GID table change 00:19:27.233 [2024-09-27 15:24:28.926671] rdma.c:3876:nvmf_process_ib_event: *NOTICE*: Async event: GID table change 00:19:27.233 [2024-09-27 15:24:28.958712] nvme_ctrlr.c:4505:nvme_ctrlr_keep_alive: *ERROR*: [nqn.2014-08.org.nvmexpress.discovery] Submitting Keep Alive failed 00:19:27.233 [2024-09-27 15:24:28.958763] nvme_fabric.c: 214:nvme_fabric_prop_get_cmd_async: *ERROR*: Failed to send Property Get fabrics command 00:19:27.233 [2024-09-27 15:24:28.958774] nvme_ctrlr.c:1213:nvme_ctrlr_shutdown_async: *ERROR*: [nqn.2014-08.org.nvmexpress.discovery] Failed to read the CC register 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@81 -- # wait_for_bdev nvme1n1 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ '' != \n\v\m\e\1\n\1 ]] 00:19:27.802 15:24:29 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:28.369 [2024-09-27 15:24:29.961137] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: KEEP ALIVE (18) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:19:28.369 [2024-09-27 15:24:29.961162] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - SQ DELETION (00/08) qid:0 cid:0 cdw0:753610f0 sqhd:0000 p:0 m:0 dnr:0 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ '' != \n\v\m\e\1\n\1 ]] 00:19:28.936 15:24:30 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@29 -- # sleep 1 00:19:29.195 [2024-09-27 15:24:30.985204] bdev_nvme.c:7162:discovery_attach_cb: *INFO*: Discovery[10.0.0.2:8009] discovery ctrlr attached 00:19:29.195 [2024-09-27 15:24:30.985223] bdev_nvme.c:7242:discovery_poller: *INFO*: Discovery[10.0.0.2:8009] discovery ctrlr connected 00:19:29.195 [2024-09-27 15:24:30.985235] bdev_nvme.c:7125:get_discovery_log_page: *INFO*: Discovery[10.0.0.2:8009] sent discovery log page command 00:19:29.195 [2024-09-27 15:24:30.988224] bdev_nvme.c:7091:discovery_log_page_cb: *INFO*: Discovery[10.0.0.2:8009] NVM nqn.2016-06.io.spdk:cnode0:10.0.0.2:4420 new subsystem nvme1 00:19:29.195 [2024-09-27 15:24:31.043387] bdev_nvme.c:7952:bdev_nvme_readv: *DEBUG*: read 8 blocks with offset 0 00:19:29.195 [2024-09-27 15:24:31.043424] bdev_nvme.c:7952:bdev_nvme_readv: *DEBUG*: read 1 blocks with offset 0 00:19:29.195 [2024-09-27 15:24:31.043444] bdev_nvme.c:7952:bdev_nvme_readv: *DEBUG*: read 64 blocks with offset 0 00:19:29.195 [2024-09-27 15:24:31.043462] bdev_nvme.c:6981:discovery_attach_controller_done: *INFO*: Discovery[10.0.0.2:8009] attach nvme1 done 00:19:29.195 [2024-09-27 15:24:31.043472] bdev_nvme.c:6940:discovery_remove_controllers: *INFO*: Discovery[10.0.0.2:8009] NVM nqn.2016-06.io.spdk:cnode0:10.0.0.2:4420 found again 00:19:29.455 [2024-09-27 15:24:31.064458] bdev_nvme.c:1735:bdev_nvme_disconnected_qpair_cb: *DEBUG*: qpair 0x2000138dbc40 was disconnected and freed. delete nvme_qpair. 
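The trace above repeats the same check once per second: list the bdevs over the host RPC socket, normalize the output, and compare it against the expected device name ('' while waiting for nvme0n1 to disappear, nvme1n1 once discovery re-attaches). A minimal sketch of that polling pattern, assuming SPDK's scripts/rpc.py is available and the host app listens on /tmp/host.sock; the helper name and timeout are illustrative, not taken from the suite:

# Poll bdev_get_bdevs on the host RPC socket until the joined, sorted list of
# bdev names matches what we expect. Sketch only, mirroring the traced pipeline.
wait_for_bdev_sketch() {
    local expected=$1 timeout=${2:-30} names
    while (( timeout-- > 0 )); do
        # same pipeline as the trace: JSON names -> sorted -> space-joined string
        names=$(./scripts/rpc.py -s /tmp/host.sock bdev_get_bdevs \
                | jq -r '.[].name' | sort | xargs)
        [[ $names == "$expected" ]] && return 0
        sleep 1
    done
    return 1
}
# e.g. wait_for_bdev_sketch ""        # wait until the original namespace is gone
#      wait_for_bdev_sketch nvme1n1   # wait for the rediscovered namespace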
00:19:30.023 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # get_bdev_list 00:19:30.023 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # rpc_cmd -s /tmp/host.sock bdev_get_bdevs 00:19:30.023 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # jq -r '.[].name' 00:19:30.023 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # sort 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@24 -- # xargs 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@28 -- # [[ nvme1n1 != \n\v\m\e\1\n\1 ]] 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@85 -- # killprocess 1847135 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@950 -- # '[' -z 1847135 ']' 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@954 -- # kill -0 1847135 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@955 -- # uname 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1847135 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1847135' 00:19:30.024 killing process with pid 1847135 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@969 -- # kill 1847135 00:19:30.024 15:24:31 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@974 -- # wait 1847135 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- host/discovery_remove_ifc.sh@86 -- # nvmftestfini 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@331 -- # nvmfcleanup 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@99 -- # sync 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@102 -- # set +e 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@103 -- # for i in {1..20} 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- 
nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:19:30.284 rmmod nvme_rdma 00:19:30.284 rmmod nvme_fabrics 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@106 -- # set -e 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@107 -- # return 0 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@332 -- # '[' -n 1846936 ']' 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@333 -- # killprocess 1846936 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@950 -- # '[' -z 1846936 ']' 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@954 -- # kill -0 1846936 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@955 -- # uname 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1846936 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1846936' 00:19:30.284 killing process with pid 1846936 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@969 -- # kill 1846936 00:19:30.284 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@974 -- # wait 1846936 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@338 -- # nvmf_fini 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@264 -- # local dev 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@267 -- # remove_target_ns 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@22 -- # _remove_target_ns 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@268 -- # delete_main_bridge 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@130 -- # return 0 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:19:30.544 15:24:32 
nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@41 -- # _dev=0 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@41 -- # dev_map=() 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/setup.sh@284 -- # iptr 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@538 -- # iptables-save 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- nvmf/common.sh@538 -- # iptables-restore 00:19:30.544 00:19:30.544 real 0m23.899s 00:19:30.544 user 0m34.472s 00:19:30.544 sys 0m6.864s 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:30.544 15:24:32 nvmf_rdma.nvmf_host.nvmf_discovery_remove_ifc -- common/autotest_common.sh@10 -- # set +x 00:19:30.544 ************************************ 00:19:30.544 END TEST nvmf_discovery_remove_ifc 00:19:30.544 ************************************ 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@26 -- # run_test nvmf_identify_kernel_target /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/identify_kernel_nvmf.sh --transport=rdma 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:19:30.803 ************************************ 00:19:30.803 START TEST nvmf_identify_kernel_target 00:19:30.803 ************************************ 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/identify_kernel_nvmf.sh --transport=rdma 00:19:30.803 * Looking for test storage... 00:19:30.803 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1681 -- # lcov --version 00:19:30.803 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@333 -- # local ver1 ver1_l 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@334 -- # local ver2 ver2_l 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@336 -- # IFS=.-: 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@336 -- # read -ra ver1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@337 -- # IFS=.-: 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@337 -- # read -ra ver2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@338 -- # local 'op=<' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@340 -- # ver1_l=2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@341 -- # ver2_l=1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@344 -- # case "$op" in 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@345 -- # : 1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@364 -- # (( v = 0 )) 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@365 -- # decimal 1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@353 -- # local d=1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@355 -- # echo 1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@365 -- # ver1[v]=1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@366 -- # decimal 2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@353 -- # local d=2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@355 -- # echo 2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@366 -- # ver2[v]=2 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@368 -- # return 0 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:19:31.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:31.063 --rc genhtml_branch_coverage=1 00:19:31.063 --rc genhtml_function_coverage=1 00:19:31.063 --rc genhtml_legend=1 00:19:31.063 --rc geninfo_all_blocks=1 00:19:31.063 --rc geninfo_unexecuted_blocks=1 00:19:31.063 00:19:31.063 ' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:19:31.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:31.063 --rc genhtml_branch_coverage=1 00:19:31.063 --rc genhtml_function_coverage=1 00:19:31.063 --rc genhtml_legend=1 00:19:31.063 --rc geninfo_all_blocks=1 00:19:31.063 --rc geninfo_unexecuted_blocks=1 00:19:31.063 00:19:31.063 ' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:19:31.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:31.063 --rc genhtml_branch_coverage=1 00:19:31.063 --rc genhtml_function_coverage=1 00:19:31.063 --rc genhtml_legend=1 00:19:31.063 --rc geninfo_all_blocks=1 00:19:31.063 --rc geninfo_unexecuted_blocks=1 00:19:31.063 00:19:31.063 ' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:19:31.063 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:31.063 --rc genhtml_branch_coverage=1 00:19:31.063 --rc genhtml_function_coverage=1 00:19:31.063 --rc genhtml_legend=1 00:19:31.063 --rc geninfo_all_blocks=1 00:19:31.063 --rc geninfo_unexecuted_blocks=1 00:19:31.063 00:19:31.063 ' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@9 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@7 -- # uname -s 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:19:31.063 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@15 -- # shopt -s extglob 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- paths/export.sh@5 -- # export PATH 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@50 -- # : 0 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:19:31.064 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer 
expression expected 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@54 -- # have_pci_nics=0 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@11 -- # nvmftestinit 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@292 -- # prepare_net_devs 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@254 -- # local -g is_hw=no 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@256 -- # remove_target_ns 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@22 -- # _remove_target_ns 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@125 -- # xtrace_disable 00:19:31.064 15:24:32 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@10 -- # set +x 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@131 -- # pci_devs=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@131 -- # local -a pci_devs 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@132 -- # pci_net_devs=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@133 -- # pci_drivers=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@133 -- # local -A pci_drivers 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@135 -- # net_devs=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@135 -- # local -ga net_devs 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@136 -- # e810=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@136 -- # local -ga e810 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@137 -- # x722=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@137 -- # local -ga x722 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@138 -- # mlx=() 00:19:37.638 
15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@138 -- # local -ga mlx 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:19:37.638 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:19:37.638 15:24:39 
nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:19:37.638 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:19:37.638 Found net devices under 0000:18:00.0: mlx_0_0 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:19:37.638 Found net devices under 0000:18:00.1: mlx_0_1 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@249 -- # get_rdma_if_list 
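The "Found net devices under 0000:18:00.x" lines come from a sysfs walk: for each PCI function matched by vendor/device ID, the netdev names are read out of /sys/bus/pci/devices/<bdf>/net. A minimal sketch of that lookup under the IDs reported in this run (0x15b3 / 0x1015); the function name is illustrative:

# Enumerate netdevs that sit under the Mellanox PCI functions seen above,
# the same way the trace builds pci_net_devs. Sketch only.
list_mlx_netdevs_sketch() {
    local pci dev
    for pci in /sys/bus/pci/devices/*; do
        [[ $(cat "$pci/vendor") == 0x15b3 ]] || continue   # vendor from this log
        [[ $(cat "$pci/device") == 0x1015 ]] || continue   # device from this log
        for dev in "$pci"/net/*; do
            [[ -e $dev ]] && echo "${pci##*/}: ${dev##*/}"
        done
    done
}
# Expected here: 0000:18:00.0: mlx_0_0 and 0000:18:00.1: mlx_0_1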
00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@75 -- # rdma_devs=() 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@89 -- # continue 2 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@89 -- # continue 2 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@258 -- # is_hw=yes 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:19:37.638 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:19:37.639 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:19:37.639 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:19:37.639 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:19:37.639 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@61 -- # uname 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:19:37.899 
15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@65 -- # modprobe ib_cm 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@66 -- # modprobe ib_core 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@67 -- # modprobe ib_umad 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@69 -- # modprobe iw_cm 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@71 -- # modprobe rdma_ucm 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@27 -- # local -gA dev_map 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@28 -- # local -g _dev 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:19:37.899 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@44 -- # ips=() 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@58 -- # key_initiator=target1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- 
nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@11 -- # local val=167772161 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:19:37.900 10.0.0.1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@11 -- # local val=167772162 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:19:37.900 10.0.0.2 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 
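The address pool is carried around as a plain integer: 167772161 is 0x0A000001, so printing its four bytes gives 10.0.0.1, and the next pool value, 167772162, gives 10.0.0.2. A minimal sketch of that conversion and the per-interface assignment traced above (helper names are illustrative, not the suite's):

# Convert the integer pool value to dotted-quad and assign it, mirroring the
# val_to_ip / set_ip / set_up steps in the trace. Sketch only; needs root.
val_to_ip_sketch() {
    local val=$1
    printf '%u.%u.%u.%u\n' \
        $(( (val >> 24) & 255 )) $(( (val >> 16) & 255 )) \
        $(( (val >>  8) & 255 )) $((  val        & 255 ))
}

set_ip_sketch() {
    local dev=$1 addr
    addr=$(val_to_ip_sketch "$2")                    # 167772161 -> 10.0.0.1
    ip addr add "$addr/24" dev "$dev"
    echo "$addr" | tee "/sys/class/net/$dev/ifalias" # recorded for later ifalias lookups
    ip link set "$dev" up
}
# set_ip_sketch mlx_0_0 167772161; set_ip_sketch mlx_0_1 167772162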
00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@38 -- # ping_ips 1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@96 -- # local pairs=1 pair 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@107 -- # local dev=target0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:37.900 15:24:39 
nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:19:37.900 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:19:37.900 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.032 ms 00:19:37.900 00:19:37.900 --- 10.0.0.2 ping statistics --- 00:19:37.900 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:19:37.900 rtt min/avg/max/mdev = 0.032/0.032/0.032/0.000 ms 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@107 -- # local dev=target0 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:37.900 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:19:37.901 15:24:39 
nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:19:37.901 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:19:37.901 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:19:37.901 00:19:37.901 --- 10.0.0.2 ping statistics --- 00:19:37.901 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:19:37.901 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@98 -- # (( pair++ )) 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@266 -- # return 0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@107 -- # local dev=target0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@334 -- # 
NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # get_net_dev target1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@107 -- # local dev=target1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@107 -- # local dev=target0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # eval ' cat 
/sys/class/net/mlx_0_1/ifalias' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # get_net_dev target1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@107 -- # local dev=target1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:19:37.901 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@13 -- # trap 'nvmftestfini || :; clean_kernel_target' EXIT 00:19:38.161 
15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@15 -- # target_ip=10.0.0.2 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@16 -- # configure_kernel_target nqn.2016-06.io.spdk:testnqn 10.0.0.2 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@430 -- # local kernel_name=nqn.2016-06.io.spdk:testnqn kernel_target_ip=10.0.0.2 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@432 -- # nvmet=/sys/kernel/config/nvmet 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@433 -- # kernel_subsystem=/sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@434 -- # kernel_namespace=/sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn/namespaces/1 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@435 -- # kernel_port=/sys/kernel/config/nvmet/ports/1 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@437 -- # local block nvme 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@439 -- # [[ ! -e /sys/module/nvmet ]] 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@440 -- # modprobe nvmet 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@443 -- # [[ -e /sys/kernel/config/nvmet ]] 00:19:38.161 15:24:39 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@445 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh reset 00:19:41.545 Waiting for block devices as requested 00:19:41.545 0000:5e:00.0 (144d a80a): vfio-pci -> nvme 00:19:41.545 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:19:41.545 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:19:41.545 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:19:41.805 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:19:41.805 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:19:41.805 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:19:42.064 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:19:42.064 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:19:42.064 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:19:42.323 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:19:42.323 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:19:42.323 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:19:42.583 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:19:42.583 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:19:42.583 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:19:42.843 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@448 -- # for block in /sys/block/nvme* 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@449 -- # [[ -e /sys/block/nvme0n1 ]] 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@450 -- # is_block_zoned nvme0n1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1651 -- 
# [[ none != none ]] 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@451 -- # block_in_use nvme0n1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@390 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:19:42.843 No valid GPT data, bailing 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@394 -- # pt= 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- scripts/common.sh@395 -- # return 1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@451 -- # nvme=/dev/nvme0n1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@454 -- # [[ -b /dev/nvme0n1 ]] 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@456 -- # mkdir /sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@457 -- # mkdir /sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn/namespaces/1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@458 -- # mkdir /sys/kernel/config/nvmet/ports/1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@463 -- # echo SPDK-nqn.2016-06.io.spdk:testnqn 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@465 -- # echo 1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@466 -- # echo /dev/nvme0n1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@467 -- # echo 1 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@469 -- # echo 10.0.0.2 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@470 -- # echo rdma 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@471 -- # echo 4420 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@472 -- # echo ipv4 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@475 -- # ln -s /sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn /sys/kernel/config/nvmet/ports/1/subsystems/ 00:19:42.843 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@478 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -a 10.0.0.2 -t rdma -s 4420 00:19:43.102 00:19:43.102 Discovery Log Number of Records 2, Generation counter 2 00:19:43.102 =====Discovery Log Entry 0====== 00:19:43.102 trtype: rdma 00:19:43.102 adrfam: ipv4 00:19:43.102 subtype: current discovery subsystem 00:19:43.102 treq: not specified, sq flow control disable supported 00:19:43.102 portid: 1 00:19:43.102 trsvcid: 4420 00:19:43.102 subnqn: nqn.2014-08.org.nvmexpress.discovery 00:19:43.102 traddr: 10.0.0.2 00:19:43.102 eflags: none 00:19:43.102 rdma_prtype: not specified 00:19:43.102 rdma_qptype: connected 00:19:43.102 rdma_cms: rdma-cm 00:19:43.102 rdma_pkey: 0x0000 00:19:43.102 =====Discovery Log Entry 
1====== 00:19:43.102 trtype: rdma 00:19:43.102 adrfam: ipv4 00:19:43.102 subtype: nvme subsystem 00:19:43.102 treq: not specified, sq flow control disable supported 00:19:43.102 portid: 1 00:19:43.102 trsvcid: 4420 00:19:43.102 subnqn: nqn.2016-06.io.spdk:testnqn 00:19:43.102 traddr: 10.0.0.2 00:19:43.102 eflags: none 00:19:43.102 rdma_prtype: not specified 00:19:43.102 rdma_qptype: connected 00:19:43.102 rdma_cms: rdma-cm 00:19:43.102 rdma_pkey: 0x0000 00:19:43.102 15:24:44 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@18 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_identify -r ' trtype:rdma adrfam:IPv4 traddr:10.0.0.2 00:19:43.102 trsvcid:4420 subnqn:nqn.2014-08.org.nvmexpress.discovery' 00:19:43.363 ===================================================== 00:19:43.363 NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2014-08.org.nvmexpress.discovery 00:19:43.363 ===================================================== 00:19:43.363 Controller Capabilities/Features 00:19:43.363 ================================ 00:19:43.363 Vendor ID: 0000 00:19:43.363 Subsystem Vendor ID: 0000 00:19:43.363 Serial Number: 47e065aa7cea0f9732f4 00:19:43.363 Model Number: Linux 00:19:43.363 Firmware Version: 6.8.9-20 00:19:43.363 Recommended Arb Burst: 0 00:19:43.363 IEEE OUI Identifier: 00 00 00 00:19:43.363 Multi-path I/O 00:19:43.363 May have multiple subsystem ports: No 00:19:43.363 May have multiple controllers: No 00:19:43.363 Associated with SR-IOV VF: No 00:19:43.363 Max Data Transfer Size: Unlimited 00:19:43.363 Max Number of Namespaces: 0 00:19:43.363 Max Number of I/O Queues: 1024 00:19:43.363 NVMe Specification Version (VS): 1.3 00:19:43.363 NVMe Specification Version (Identify): 1.3 00:19:43.363 Maximum Queue Entries: 128 00:19:43.363 Contiguous Queues Required: No 00:19:43.363 Arbitration Mechanisms Supported 00:19:43.363 Weighted Round Robin: Not Supported 00:19:43.363 Vendor Specific: Not Supported 00:19:43.363 Reset Timeout: 7500 ms 00:19:43.363 Doorbell Stride: 4 bytes 00:19:43.363 NVM Subsystem Reset: Not Supported 00:19:43.363 Command Sets Supported 00:19:43.363 NVM Command Set: Supported 00:19:43.363 Boot Partition: Not Supported 00:19:43.363 Memory Page Size Minimum: 4096 bytes 00:19:43.363 Memory Page Size Maximum: 4096 bytes 00:19:43.363 Persistent Memory Region: Not Supported 00:19:43.363 Optional Asynchronous Events Supported 00:19:43.363 Namespace Attribute Notices: Not Supported 00:19:43.363 Firmware Activation Notices: Not Supported 00:19:43.363 ANA Change Notices: Not Supported 00:19:43.363 PLE Aggregate Log Change Notices: Not Supported 00:19:43.363 LBA Status Info Alert Notices: Not Supported 00:19:43.363 EGE Aggregate Log Change Notices: Not Supported 00:19:43.363 Normal NVM Subsystem Shutdown event: Not Supported 00:19:43.363 Zone Descriptor Change Notices: Not Supported 00:19:43.363 Discovery Log Change Notices: Supported 00:19:43.363 Controller Attributes 00:19:43.363 128-bit Host Identifier: Not Supported 00:19:43.363 Non-Operational Permissive Mode: Not Supported 00:19:43.363 NVM Sets: Not Supported 00:19:43.363 Read Recovery Levels: Not Supported 00:19:43.363 Endurance Groups: Not Supported 00:19:43.363 Predictable Latency Mode: Not Supported 00:19:43.363 Traffic Based Keep ALive: Not Supported 00:19:43.363 Namespace Granularity: Not Supported 00:19:43.363 SQ Associations: Not Supported 00:19:43.363 UUID List: Not Supported 00:19:43.363 Multi-Domain Subsystem: Not Supported 00:19:43.363 Fixed Capacity Management: Not 
Supported 00:19:43.363 Variable Capacity Management: Not Supported 00:19:43.363 Delete Endurance Group: Not Supported 00:19:43.363 Delete NVM Set: Not Supported 00:19:43.363 Extended LBA Formats Supported: Not Supported 00:19:43.363 Flexible Data Placement Supported: Not Supported 00:19:43.363 00:19:43.363 Controller Memory Buffer Support 00:19:43.363 ================================ 00:19:43.363 Supported: No 00:19:43.363 00:19:43.363 Persistent Memory Region Support 00:19:43.363 ================================ 00:19:43.363 Supported: No 00:19:43.363 00:19:43.363 Admin Command Set Attributes 00:19:43.363 ============================ 00:19:43.363 Security Send/Receive: Not Supported 00:19:43.363 Format NVM: Not Supported 00:19:43.363 Firmware Activate/Download: Not Supported 00:19:43.363 Namespace Management: Not Supported 00:19:43.363 Device Self-Test: Not Supported 00:19:43.363 Directives: Not Supported 00:19:43.363 NVMe-MI: Not Supported 00:19:43.363 Virtualization Management: Not Supported 00:19:43.363 Doorbell Buffer Config: Not Supported 00:19:43.363 Get LBA Status Capability: Not Supported 00:19:43.363 Command & Feature Lockdown Capability: Not Supported 00:19:43.363 Abort Command Limit: 1 00:19:43.363 Async Event Request Limit: 1 00:19:43.363 Number of Firmware Slots: N/A 00:19:43.363 Firmware Slot 1 Read-Only: N/A 00:19:43.363 Firmware Activation Without Reset: N/A 00:19:43.363 Multiple Update Detection Support: N/A 00:19:43.363 Firmware Update Granularity: No Information Provided 00:19:43.363 Per-Namespace SMART Log: No 00:19:43.363 Asymmetric Namespace Access Log Page: Not Supported 00:19:43.363 Subsystem NQN: nqn.2014-08.org.nvmexpress.discovery 00:19:43.363 Command Effects Log Page: Not Supported 00:19:43.363 Get Log Page Extended Data: Supported 00:19:43.363 Telemetry Log Pages: Not Supported 00:19:43.363 Persistent Event Log Pages: Not Supported 00:19:43.363 Supported Log Pages Log Page: May Support 00:19:43.363 Commands Supported & Effects Log Page: Not Supported 00:19:43.363 Feature Identifiers & Effects Log Page:May Support 00:19:43.363 NVMe-MI Commands & Effects Log Page: May Support 00:19:43.363 Data Area 4 for Telemetry Log: Not Supported 00:19:43.363 Error Log Page Entries Supported: 1 00:19:43.363 Keep Alive: Not Supported 00:19:43.363 00:19:43.363 NVM Command Set Attributes 00:19:43.363 ========================== 00:19:43.363 Submission Queue Entry Size 00:19:43.363 Max: 1 00:19:43.363 Min: 1 00:19:43.363 Completion Queue Entry Size 00:19:43.363 Max: 1 00:19:43.363 Min: 1 00:19:43.363 Number of Namespaces: 0 00:19:43.363 Compare Command: Not Supported 00:19:43.363 Write Uncorrectable Command: Not Supported 00:19:43.363 Dataset Management Command: Not Supported 00:19:43.363 Write Zeroes Command: Not Supported 00:19:43.363 Set Features Save Field: Not Supported 00:19:43.363 Reservations: Not Supported 00:19:43.363 Timestamp: Not Supported 00:19:43.363 Copy: Not Supported 00:19:43.363 Volatile Write Cache: Not Present 00:19:43.363 Atomic Write Unit (Normal): 1 00:19:43.363 Atomic Write Unit (PFail): 1 00:19:43.363 Atomic Compare & Write Unit: 1 00:19:43.363 Fused Compare & Write: Not Supported 00:19:43.363 Scatter-Gather List 00:19:43.363 SGL Command Set: Supported 00:19:43.363 SGL Keyed: Supported 00:19:43.363 SGL Bit Bucket Descriptor: Not Supported 00:19:43.363 SGL Metadata Pointer: Not Supported 00:19:43.363 Oversized SGL: Not Supported 00:19:43.363 SGL Metadata Address: Not Supported 00:19:43.363 SGL Offset: Supported 00:19:43.363 Transport SGL Data Block: Not 
Supported 00:19:43.363 Replay Protected Memory Block: Not Supported 00:19:43.363 00:19:43.363 Firmware Slot Information 00:19:43.363 ========================= 00:19:43.363 Active slot: 0 00:19:43.363 00:19:43.363 00:19:43.363 Error Log 00:19:43.363 ========= 00:19:43.363 00:19:43.363 Active Namespaces 00:19:43.363 ================= 00:19:43.363 Discovery Log Page 00:19:43.363 ================== 00:19:43.363 Generation Counter: 2 00:19:43.363 Number of Records: 2 00:19:43.363 Record Format: 0 00:19:43.363 00:19:43.363 Discovery Log Entry 0 00:19:43.363 ---------------------- 00:19:43.363 Transport Type: 1 (RDMA) 00:19:43.363 Address Family: 1 (IPv4) 00:19:43.363 Subsystem Type: 3 (Current Discovery Subsystem) 00:19:43.363 Entry Flags: 00:19:43.363 Duplicate Returned Information: 0 00:19:43.363 Explicit Persistent Connection Support for Discovery: 0 00:19:43.363 Transport Requirements: 00:19:43.363 Secure Channel: Not Specified 00:19:43.363 Port ID: 1 (0x0001) 00:19:43.363 Controller ID: 65535 (0xffff) 00:19:43.363 Admin Max SQ Size: 32 00:19:43.363 Transport Service Identifier: 4420 00:19:43.363 NVM Subsystem Qualified Name: nqn.2014-08.org.nvmexpress.discovery 00:19:43.363 Transport Address: 10.0.0.2 00:19:43.363 Transport Specific Address Subtype - RDMA 00:19:43.363 RDMA QP Service Type: 1 (Reliable Connected) 00:19:43.363 RDMA Provider Type: 1 (No provider specified) 00:19:43.363 RDMA CM Service: 1 (RDMA_CM) 00:19:43.363 Discovery Log Entry 1 00:19:43.363 ---------------------- 00:19:43.363 Transport Type: 1 (RDMA) 00:19:43.363 Address Family: 1 (IPv4) 00:19:43.364 Subsystem Type: 2 (NVM Subsystem) 00:19:43.364 Entry Flags: 00:19:43.364 Duplicate Returned Information: 0 00:19:43.364 Explicit Persistent Connection Support for Discovery: 0 00:19:43.364 Transport Requirements: 00:19:43.364 Secure Channel: Not Specified 00:19:43.364 Port ID: 1 (0x0001) 00:19:43.364 Controller ID: 65535 (0xffff) 00:19:43.364 Admin Max SQ Size: 32 00:19:43.364 Transport Service Identifier: 4420 00:19:43.364 NVM Subsystem Qualified Name: nqn.2016-06.io.spdk:testnqn 00:19:43.364 Transport Address: 10.0.0.2 00:19:43.364 Transport Specific Address Subtype - RDMA 00:19:43.364 RDMA QP Service Type: 1 (Reliable Connected) 00:19:43.364 RDMA Provider Type: 1 (No provider specified) 00:19:43.364 RDMA CM Service: 1 (RDMA_CM) 00:19:43.364 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@24 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/spdk_nvme_identify -r ' trtype:rdma adrfam:IPv4 traddr:10.0.0.2 trsvcid:4420 subnqn:nqn.2016-06.io.spdk:testnqn' 00:19:43.364 get_feature(0x01) failed 00:19:43.364 get_feature(0x02) failed 00:19:43.364 get_feature(0x04) failed 00:19:43.364 ===================================================== 00:19:43.364 NVMe over Fabrics controller at 10.0.0.2:4420: nqn.2016-06.io.spdk:testnqn 00:19:43.364 ===================================================== 00:19:43.364 Controller Capabilities/Features 00:19:43.364 ================================ 00:19:43.364 Vendor ID: 0000 00:19:43.364 Subsystem Vendor ID: 0000 00:19:43.364 Serial Number: dc6780e5e885d3dd957e 00:19:43.364 Model Number: SPDK-nqn.2016-06.io.spdk:testnqn 00:19:43.364 Firmware Version: 6.8.9-20 00:19:43.364 Recommended Arb Burst: 6 00:19:43.364 IEEE OUI Identifier: 00 00 00 00:19:43.364 Multi-path I/O 00:19:43.364 May have multiple subsystem ports: Yes 00:19:43.364 May have multiple controllers: Yes 00:19:43.364 Associated with SR-IOV VF: No 00:19:43.364 Max Data Transfer Size: 
1048576 00:19:43.364 Max Number of Namespaces: 1024 00:19:43.364 Max Number of I/O Queues: 128 00:19:43.364 NVMe Specification Version (VS): 1.3 00:19:43.364 NVMe Specification Version (Identify): 1.3 00:19:43.364 Maximum Queue Entries: 128 00:19:43.364 Contiguous Queues Required: No 00:19:43.364 Arbitration Mechanisms Supported 00:19:43.364 Weighted Round Robin: Not Supported 00:19:43.364 Vendor Specific: Not Supported 00:19:43.364 Reset Timeout: 7500 ms 00:19:43.364 Doorbell Stride: 4 bytes 00:19:43.364 NVM Subsystem Reset: Not Supported 00:19:43.364 Command Sets Supported 00:19:43.364 NVM Command Set: Supported 00:19:43.364 Boot Partition: Not Supported 00:19:43.364 Memory Page Size Minimum: 4096 bytes 00:19:43.364 Memory Page Size Maximum: 4096 bytes 00:19:43.364 Persistent Memory Region: Not Supported 00:19:43.364 Optional Asynchronous Events Supported 00:19:43.364 Namespace Attribute Notices: Supported 00:19:43.364 Firmware Activation Notices: Not Supported 00:19:43.364 ANA Change Notices: Supported 00:19:43.364 PLE Aggregate Log Change Notices: Not Supported 00:19:43.364 LBA Status Info Alert Notices: Not Supported 00:19:43.364 EGE Aggregate Log Change Notices: Not Supported 00:19:43.364 Normal NVM Subsystem Shutdown event: Not Supported 00:19:43.364 Zone Descriptor Change Notices: Not Supported 00:19:43.364 Discovery Log Change Notices: Not Supported 00:19:43.364 Controller Attributes 00:19:43.364 128-bit Host Identifier: Supported 00:19:43.364 Non-Operational Permissive Mode: Not Supported 00:19:43.364 NVM Sets: Not Supported 00:19:43.364 Read Recovery Levels: Not Supported 00:19:43.364 Endurance Groups: Not Supported 00:19:43.364 Predictable Latency Mode: Not Supported 00:19:43.364 Traffic Based Keep ALive: Supported 00:19:43.364 Namespace Granularity: Not Supported 00:19:43.364 SQ Associations: Not Supported 00:19:43.364 UUID List: Not Supported 00:19:43.364 Multi-Domain Subsystem: Not Supported 00:19:43.364 Fixed Capacity Management: Not Supported 00:19:43.364 Variable Capacity Management: Not Supported 00:19:43.364 Delete Endurance Group: Not Supported 00:19:43.364 Delete NVM Set: Not Supported 00:19:43.364 Extended LBA Formats Supported: Not Supported 00:19:43.364 Flexible Data Placement Supported: Not Supported 00:19:43.364 00:19:43.364 Controller Memory Buffer Support 00:19:43.364 ================================ 00:19:43.364 Supported: No 00:19:43.364 00:19:43.364 Persistent Memory Region Support 00:19:43.364 ================================ 00:19:43.364 Supported: No 00:19:43.364 00:19:43.364 Admin Command Set Attributes 00:19:43.364 ============================ 00:19:43.364 Security Send/Receive: Not Supported 00:19:43.364 Format NVM: Not Supported 00:19:43.364 Firmware Activate/Download: Not Supported 00:19:43.364 Namespace Management: Not Supported 00:19:43.364 Device Self-Test: Not Supported 00:19:43.364 Directives: Not Supported 00:19:43.364 NVMe-MI: Not Supported 00:19:43.364 Virtualization Management: Not Supported 00:19:43.364 Doorbell Buffer Config: Not Supported 00:19:43.364 Get LBA Status Capability: Not Supported 00:19:43.364 Command & Feature Lockdown Capability: Not Supported 00:19:43.364 Abort Command Limit: 4 00:19:43.364 Async Event Request Limit: 4 00:19:43.364 Number of Firmware Slots: N/A 00:19:43.364 Firmware Slot 1 Read-Only: N/A 00:19:43.364 Firmware Activation Without Reset: N/A 00:19:43.364 Multiple Update Detection Support: N/A 00:19:43.364 Firmware Update Granularity: No Information Provided 00:19:43.364 Per-Namespace SMART Log: Yes 00:19:43.364 
Asymmetric Namespace Access Log Page: Supported 00:19:43.364 ANA Transition Time : 10 sec 00:19:43.364 00:19:43.364 Asymmetric Namespace Access Capabilities 00:19:43.364 ANA Optimized State : Supported 00:19:43.364 ANA Non-Optimized State : Supported 00:19:43.364 ANA Inaccessible State : Supported 00:19:43.364 ANA Persistent Loss State : Supported 00:19:43.364 ANA Change State : Supported 00:19:43.364 ANAGRPID is not changed : No 00:19:43.364 Non-Zero ANAGRPID for NS Mgmt Cmd : Not Supported 00:19:43.364 00:19:43.364 ANA Group Identifier Maximum : 128 00:19:43.364 Number of ANA Group Identifiers : 128 00:19:43.364 Max Number of Allowed Namespaces : 1024 00:19:43.364 Subsystem NQN: nqn.2016-06.io.spdk:testnqn 00:19:43.364 Command Effects Log Page: Supported 00:19:43.364 Get Log Page Extended Data: Supported 00:19:43.364 Telemetry Log Pages: Not Supported 00:19:43.364 Persistent Event Log Pages: Not Supported 00:19:43.364 Supported Log Pages Log Page: May Support 00:19:43.364 Commands Supported & Effects Log Page: Not Supported 00:19:43.364 Feature Identifiers & Effects Log Page:May Support 00:19:43.364 NVMe-MI Commands & Effects Log Page: May Support 00:19:43.364 Data Area 4 for Telemetry Log: Not Supported 00:19:43.364 Error Log Page Entries Supported: 128 00:19:43.364 Keep Alive: Supported 00:19:43.364 Keep Alive Granularity: 1000 ms 00:19:43.364 00:19:43.364 NVM Command Set Attributes 00:19:43.364 ========================== 00:19:43.364 Submission Queue Entry Size 00:19:43.364 Max: 64 00:19:43.364 Min: 64 00:19:43.364 Completion Queue Entry Size 00:19:43.364 Max: 16 00:19:43.364 Min: 16 00:19:43.364 Number of Namespaces: 1024 00:19:43.364 Compare Command: Not Supported 00:19:43.364 Write Uncorrectable Command: Not Supported 00:19:43.364 Dataset Management Command: Supported 00:19:43.364 Write Zeroes Command: Supported 00:19:43.364 Set Features Save Field: Not Supported 00:19:43.364 Reservations: Not Supported 00:19:43.364 Timestamp: Not Supported 00:19:43.364 Copy: Not Supported 00:19:43.364 Volatile Write Cache: Present 00:19:43.364 Atomic Write Unit (Normal): 1 00:19:43.364 Atomic Write Unit (PFail): 1 00:19:43.364 Atomic Compare & Write Unit: 1 00:19:43.364 Fused Compare & Write: Not Supported 00:19:43.364 Scatter-Gather List 00:19:43.364 SGL Command Set: Supported 00:19:43.364 SGL Keyed: Supported 00:19:43.364 SGL Bit Bucket Descriptor: Not Supported 00:19:43.364 SGL Metadata Pointer: Not Supported 00:19:43.364 Oversized SGL: Not Supported 00:19:43.364 SGL Metadata Address: Not Supported 00:19:43.364 SGL Offset: Supported 00:19:43.364 Transport SGL Data Block: Not Supported 00:19:43.364 Replay Protected Memory Block: Not Supported 00:19:43.364 00:19:43.364 Firmware Slot Information 00:19:43.364 ========================= 00:19:43.364 Active slot: 0 00:19:43.364 00:19:43.364 Asymmetric Namespace Access 00:19:43.364 =========================== 00:19:43.364 Change Count : 0 00:19:43.364 Number of ANA Group Descriptors : 1 00:19:43.364 ANA Group Descriptor : 0 00:19:43.364 ANA Group ID : 1 00:19:43.364 Number of NSID Values : 1 00:19:43.364 Change Count : 0 00:19:43.364 ANA State : 1 00:19:43.365 Namespace Identifier : 1 00:19:43.365 00:19:43.365 Commands Supported and Effects 00:19:43.365 ============================== 00:19:43.365 Admin Commands 00:19:43.365 -------------- 00:19:43.365 Get Log Page (02h): Supported 00:19:43.365 Identify (06h): Supported 00:19:43.365 Abort (08h): Supported 00:19:43.365 Set Features (09h): Supported 00:19:43.365 Get Features (0Ah): Supported 00:19:43.365 
Asynchronous Event Request (0Ch): Supported 00:19:43.365 Keep Alive (18h): Supported 00:19:43.365 I/O Commands 00:19:43.365 ------------ 00:19:43.365 Flush (00h): Supported 00:19:43.365 Write (01h): Supported LBA-Change 00:19:43.365 Read (02h): Supported 00:19:43.365 Write Zeroes (08h): Supported LBA-Change 00:19:43.365 Dataset Management (09h): Supported 00:19:43.365 00:19:43.365 Error Log 00:19:43.365 ========= 00:19:43.365 Entry: 0 00:19:43.365 Error Count: 0x3 00:19:43.365 Submission Queue Id: 0x0 00:19:43.365 Command Id: 0x5 00:19:43.365 Phase Bit: 0 00:19:43.365 Status Code: 0x2 00:19:43.365 Status Code Type: 0x0 00:19:43.365 Do Not Retry: 1 00:19:43.365 Error Location: 0x28 00:19:43.365 LBA: 0x0 00:19:43.365 Namespace: 0x0 00:19:43.365 Vendor Log Page: 0x0 00:19:43.365 ----------- 00:19:43.365 Entry: 1 00:19:43.365 Error Count: 0x2 00:19:43.365 Submission Queue Id: 0x0 00:19:43.365 Command Id: 0x5 00:19:43.365 Phase Bit: 0 00:19:43.365 Status Code: 0x2 00:19:43.365 Status Code Type: 0x0 00:19:43.365 Do Not Retry: 1 00:19:43.365 Error Location: 0x28 00:19:43.365 LBA: 0x0 00:19:43.365 Namespace: 0x0 00:19:43.365 Vendor Log Page: 0x0 00:19:43.365 ----------- 00:19:43.365 Entry: 2 00:19:43.365 Error Count: 0x1 00:19:43.365 Submission Queue Id: 0x0 00:19:43.365 Command Id: 0x0 00:19:43.365 Phase Bit: 0 00:19:43.365 Status Code: 0x2 00:19:43.365 Status Code Type: 0x0 00:19:43.365 Do Not Retry: 1 00:19:43.365 Error Location: 0x28 00:19:43.365 LBA: 0x0 00:19:43.365 Namespace: 0x0 00:19:43.365 Vendor Log Page: 0x0 00:19:43.365 00:19:43.365 Number of Queues 00:19:43.365 ================ 00:19:43.365 Number of I/O Submission Queues: 128 00:19:43.365 Number of I/O Completion Queues: 128 00:19:43.365 00:19:43.365 ZNS Specific Controller Data 00:19:43.365 ============================ 00:19:43.365 Zone Append Size Limit: 0 00:19:43.365 00:19:43.365 00:19:43.365 Active Namespaces 00:19:43.365 ================= 00:19:43.365 get_feature(0x05) failed 00:19:43.365 Namespace ID:1 00:19:43.365 Command Set Identifier: NVM (00h) 00:19:43.365 Deallocate: Supported 00:19:43.365 Deallocated/Unwritten Error: Not Supported 00:19:43.365 Deallocated Read Value: Unknown 00:19:43.365 Deallocate in Write Zeroes: Not Supported 00:19:43.365 Deallocated Guard Field: 0xFFFF 00:19:43.365 Flush: Supported 00:19:43.365 Reservation: Not Supported 00:19:43.365 Namespace Sharing Capabilities: Multiple Controllers 00:19:43.365 Size (in LBAs): 3750748848 (1788GiB) 00:19:43.365 Capacity (in LBAs): 3750748848 (1788GiB) 00:19:43.365 Utilization (in LBAs): 3750748848 (1788GiB) 00:19:43.365 UUID: 88e7858e-6c6d-4bae-b3ff-003e206c8ffa 00:19:43.365 Thin Provisioning: Not Supported 00:19:43.365 Per-NS Atomic Units: Yes 00:19:43.365 Atomic Write Unit (Normal): 8 00:19:43.365 Atomic Write Unit (PFail): 8 00:19:43.365 Preferred Write Granularity: 8 00:19:43.365 Atomic Compare & Write Unit: 8 00:19:43.365 Atomic Boundary Size (Normal): 0 00:19:43.365 Atomic Boundary Size (PFail): 0 00:19:43.365 Atomic Boundary Offset: 0 00:19:43.365 NGUID/EUI64 Never Reused: No 00:19:43.365 ANA group ID: 1 00:19:43.365 Namespace Write Protected: No 00:19:43.365 Number of LBA Formats: 1 00:19:43.365 Current LBA Format: LBA Format #00 00:19:43.365 LBA Format #00: Data Size: 512 Metadata Size: 0 00:19:43.365 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@1 -- # nvmftestfini 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@331 -- # nvmfcleanup 00:19:43.365 15:24:45 
nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@99 -- # sync 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@102 -- # set +e 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@103 -- # for i in {1..20} 00:19:43.365 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:19:43.365 rmmod nvme_rdma 00:19:43.625 rmmod nvme_fabrics 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@106 -- # set -e 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@107 -- # return 0 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@332 -- # '[' -n '' ']' 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@338 -- # nvmf_fini 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@264 -- # local dev 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@267 -- # remove_target_ns 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@22 -- # _remove_target_ns 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@268 -- # delete_main_bridge 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@130 -- # return 0 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@271 -- # [[ -e 
/sys/class/net/mlx_0_0/address ]] 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@41 -- # _dev=0 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@41 -- # dev_map=() 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/setup.sh@284 -- # iptr 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@538 -- # iptables-save 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@538 -- # iptables-restore 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- host/identify_kernel_nvmf.sh@1 -- # clean_kernel_target 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@482 -- # [[ -e /sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn ]] 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@484 -- # echo 0 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@486 -- # rm -f /sys/kernel/config/nvmet/ports/1/subsystems/nqn.2016-06.io.spdk:testnqn 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@487 -- # rmdir /sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn/namespaces/1 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@488 -- # rmdir /sys/kernel/config/nvmet/ports/1 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@489 -- # rmdir /sys/kernel/config/nvmet/subsystems/nqn.2016-06.io.spdk:testnqn 00:19:43.625 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@491 -- # modules=(/sys/module/nvmet/holders/*) 00:19:43.626 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@493 -- # modprobe -r nvmet_rdma nvmet 00:19:43.626 15:24:45 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- nvmf/common.sh@496 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh 00:19:46.926 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:19:46.926 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:80:04.7 (8086 2021): 
ioatdma -> vfio-pci 00:19:46.926 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:19:46.926 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:19:47.186 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:19:47.186 00:19:47.186 real 0m16.469s 00:19:47.186 user 0m4.950s 00:19:47.186 sys 0m10.691s 00:19:47.186 15:24:48 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:47.186 15:24:48 nvmf_rdma.nvmf_host.nvmf_identify_kernel_target -- common/autotest_common.sh@10 -- # set +x 00:19:47.186 ************************************ 00:19:47.186 END TEST nvmf_identify_kernel_target 00:19:47.186 ************************************ 00:19:47.186 15:24:48 nvmf_rdma.nvmf_host -- nvmf/nvmf_host.sh@27 -- # run_test nvmf_auth_host /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh --transport=rdma 00:19:47.186 15:24:48 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:19:47.186 15:24:48 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:47.186 15:24:48 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:19:47.186 ************************************ 00:19:47.186 START TEST nvmf_auth_host 00:19:47.186 ************************************ 00:19:47.186 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh --transport=rdma 00:19:47.446 * Looking for test storage... 00:19:47.446 * Found test storage at /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host 00:19:47.446 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:19:47.446 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1681 -- # lcov --version 00:19:47.446 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:19:47.446 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:19:47.446 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@333 -- # local ver1 ver1_l 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@334 -- # local ver2 ver2_l 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@336 -- # IFS=.-: 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@336 -- # read -ra ver1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@337 -- # IFS=.-: 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@337 -- # read -ra ver2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@338 -- # local 'op=<' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@340 -- # ver1_l=2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@341 -- # ver2_l=1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@344 -- # case "$op" in 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
scripts/common.sh@345 -- # : 1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@364 -- # (( v = 0 )) 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@365 -- # decimal 1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@353 -- # local d=1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@355 -- # echo 1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@365 -- # ver1[v]=1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@366 -- # decimal 2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@353 -- # local d=2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@355 -- # echo 2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@366 -- # ver2[v]=2 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@368 -- # return 0 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:19:47.447 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:47.447 --rc genhtml_branch_coverage=1 00:19:47.447 --rc genhtml_function_coverage=1 00:19:47.447 --rc genhtml_legend=1 00:19:47.447 --rc geninfo_all_blocks=1 00:19:47.447 --rc geninfo_unexecuted_blocks=1 00:19:47.447 00:19:47.447 ' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:19:47.447 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:47.447 --rc genhtml_branch_coverage=1 00:19:47.447 --rc genhtml_function_coverage=1 00:19:47.447 --rc genhtml_legend=1 00:19:47.447 --rc geninfo_all_blocks=1 00:19:47.447 --rc geninfo_unexecuted_blocks=1 00:19:47.447 00:19:47.447 ' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:19:47.447 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:47.447 --rc genhtml_branch_coverage=1 00:19:47.447 --rc genhtml_function_coverage=1 00:19:47.447 --rc genhtml_legend=1 00:19:47.447 --rc geninfo_all_blocks=1 00:19:47.447 --rc geninfo_unexecuted_blocks=1 00:19:47.447 00:19:47.447 ' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:19:47.447 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:47.447 --rc genhtml_branch_coverage=1 00:19:47.447 --rc genhtml_function_coverage=1 00:19:47.447 --rc genhtml_legend=1 00:19:47.447 --rc geninfo_all_blocks=1 00:19:47.447 --rc geninfo_unexecuted_blocks=1 00:19:47.447 00:19:47.447 ' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@10 -- # source 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@7 -- # uname -s 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@12 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@13 -- # NVMF_TRANSPORT_OPTS= 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@14 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@15 -- # nvme gen-hostnqn 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@15 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@16 -- # NVME_HOSTID=00e1c02b-5999-e811-99d6-a4bf01488b4e 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@17 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@18 -- # NVME_CONNECT='nvme connect' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@19 -- # NET_TYPE=phy 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@20 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@47 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/common.sh 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@15 -- # shopt -s extglob 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- paths/export.sh@5 -- # export PATH 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@48 -- # source /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/setup.sh 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@6 -- # NVMF_BRIDGE=nvmf_br 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@7 -- # NVMF_TARGET_NAMESPACE=nvmf_ns_spdk 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@8 -- # NVMF_TARGET_NS_CMD=() 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@50 -- # : 0 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@51 -- # export NVMF_APP_SHM_ID 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@52 -- # build_nvmf_app_args 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@23 -- # '[' 0 -eq 1 ']' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@27 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@29 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@31 -- # '[' '' -eq 1 ']' 00:19:47.447 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/common.sh: line 31: [: : integer expression expected 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@35 -- # '[' -n '' ']' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
nvmf/common.sh@37 -- # '[' 0 -eq 1 ']' 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@54 -- # have_pci_nics=0 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@13 -- # digests=("sha256" "sha384" "sha512") 00:19:47.447 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@16 -- # dhgroups=("ffdhe2048" "ffdhe3072" "ffdhe4096" "ffdhe6144" "ffdhe8192") 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@17 -- # subnqn=nqn.2024-02.io.spdk:cnode0 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@18 -- # hostnqn=nqn.2024-02.io.spdk:host0 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@19 -- # nvmet_subsys=/sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@20 -- # nvmet_host=/sys/kernel/config/nvmet/hosts/nqn.2024-02.io.spdk:host0 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@21 -- # keys=() 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@21 -- # ckeys=() 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@68 -- # nvmftestinit 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@285 -- # '[' -z rdma ']' 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@290 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@292 -- # prepare_net_devs 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@254 -- # local -g is_hw=no 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@256 -- # remove_target_ns 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@22 -- # _remove_target_ns 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@258 -- # [[ phy != virt ]] 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@258 -- # gather_supported_nvmf_pci_devs 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@125 -- # xtrace_disable 00:19:47.448 15:24:49 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@129 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@131 -- # pci_devs=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@131 -- # local -a pci_devs 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@132 -- # pci_net_devs=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@132 -- # local -a pci_net_devs 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@133 -- # pci_drivers=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@133 -- # local -A pci_drivers 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@135 -- # net_devs=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@135 -- # local -ga net_devs 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@136 -- # 
e810=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@136 -- # local -ga e810 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@137 -- # x722=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@137 -- # local -ga x722 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@138 -- # mlx=() 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@138 -- # local -ga mlx 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@141 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@142 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@144 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@146 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:19:55.575 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@148 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@150 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@152 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@154 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@155 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@157 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@158 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@160 -- # pci_devs+=("${e810[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@161 -- # [[ rdma == rdma ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@162 -- # pci_devs+=("${x722[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@163 -- # pci_devs+=("${mlx[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@167 -- # [[ mlx5 == mlx5 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@168 -- # pci_devs=("${mlx[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@175 -- # (( 2 == 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.0 (0x15b3 - 0x1015)' 00:19:55.576 Found 0000:18:00.0 (0x15b3 - 0x1015) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@202 -- # 
NVME_CONNECT='nvme connect -i 15' 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@180 -- # for pci in "${pci_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@181 -- # echo 'Found 0000:18:00.1 (0x15b3 - 0x1015)' 00:19:55.576 Found 0000:18:00.1 (0x15b3 - 0x1015) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@182 -- # [[ mlx5_core == unknown ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@186 -- # [[ mlx5_core == unbound ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@190 -- # [[ 0x1015 == \0\x\1\0\1\7 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@191 -- # [[ 0x1015 == \0\x\1\0\1\9 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@192 -- # [[ rdma == rdma ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@202 -- # NVME_CONNECT='nvme connect -i 15' 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@206 -- # (( 0 > 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@212 -- # [[ mlx5 == e810 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.0: mlx_0_0' 00:19:55.576 Found net devices under 0000:18:00.0: mlx_0_0 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@222 -- # for pci in "${pci_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@223 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@228 -- # [[ rdma == tcp ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@234 -- # (( 1 == 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@239 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@240 -- # echo 'Found net devices under 0000:18:00.1: mlx_0_1' 00:19:55.576 Found net devices under 0000:18:00.1: mlx_0_1 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@241 -- # net_devs+=("${pci_net_devs[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@244 -- # (( 2 == 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@248 -- # [[ rdma == rdma ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@249 -- # get_rdma_if_list 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@75 -- # rdma_devs=() 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@75 -- # local net_dev rxe_net_dev rxe_net_devs rdma_devs 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
nvmf/common.sh@77 -- # mapfile -t rxe_net_devs 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@77 -- # rxe_cfg rxe-net 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@57 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rxe_cfg_small.sh rxe-net 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@79 -- # (( 2 == 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@87 -- # [[ mlx_0_0 == \m\l\x\_\0\_\0 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@89 -- # continue 2 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@85 -- # for net_dev in "${net_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\0 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@86 -- # for rxe_net_dev in "${rxe_net_devs[@]}" 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@87 -- # [[ mlx_0_1 == \m\l\x\_\0\_\1 ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@88 -- # rdma_devs+=("$net_dev") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@89 -- # continue 2 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@94 -- # (( 2 > 0 )) 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@95 -- # net_devs=("${rdma_devs[@]}") 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@258 -- # is_hw=yes 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@260 -- # [[ yes == yes ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@261 -- # [[ rdma == tcp ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@263 -- # [[ rdma == rdma ]] 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@264 -- # nvmf_rdma_init 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@244 -- # local total_initiator_target_pairs=1 00:19:55.576 15:24:55 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@246 -- # load_ib_rdma_modules 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@61 -- # uname 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@61 -- # '[' Linux '!=' Linux ']' 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@65 -- # modprobe ib_cm 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@66 -- # modprobe ib_core 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@67 -- # modprobe ib_umad 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@68 -- # modprobe ib_uverbs 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@69 -- # modprobe iw_cm 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@70 -- # modprobe rdma_cm 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@71 
-- # modprobe rdma_ucm 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@247 -- # setup_interfaces 1 phy rdma 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@25 -- # local no=1 type=phy transport=rdma ip_pool=0x0a000001 max 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@27 -- # local -gA dev_map 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@28 -- # local -g _dev 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@31 -- # (( ip_pool += _dev * 2, (_dev + no) * 2 <= 255 )) 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@33 -- # (( _dev = _dev, max = _dev )) 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@34 -- # setup_interface_pair 0 phy 167772161 rdma 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@44 -- # ips=() 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@44 -- # local id=0 type=phy ip=167772161 transport=rdma ips 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@45 -- # local initiator=initiator0 target=target0 _ns= 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@46 -- # local key_initiator=initiator0 key_target=target0 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@49 -- # ips=("$ip" $((++ip))) 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@51 -- # [[ rdma == tcp ]] 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@53 -- # [[ rdma == rdma ]] 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@58 -- # key_initiator=target1 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@61 -- # [[ phy == phy ]] 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@64 -- # initiator=mlx_0_0 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@64 -- # target=mlx_0_1 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@67 -- # [[ phy == veth ]] 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@68 -- # [[ phy == veth ]] 00:19:55.576 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@70 -- # [[ rdma == tcp ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@72 -- # set_ip mlx_0_0 167772161 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@204 -- # local dev=mlx_0_0 ip=167772161 in_ns= 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@207 -- # val_to_ip 167772161 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@11 -- # local val=167772161 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@207 -- # ip=10.0.0.1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.1/24 dev mlx_0_0' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.1/24 dev mlx_0_0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias' 
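The address assignment that nvmf/setup.sh traces here (and continues just below for the second port) reduces to a handful of ip(8) commands. The following is only a condensed sketch of those steps, assuming the same mlx_0_0/mlx_0_1 device names and the 10.0.0.0/24 pool derived from ip_pool=0x0a000001; the real run goes through the set_ip/set_up helpers shown in the trace.

# Condensed equivalent of the set_ip/set_up steps traced in this part of the log.
ip addr add 10.0.0.1/24 dev mlx_0_0
echo 10.0.0.1 | tee /sys/class/net/mlx_0_0/ifalias   # ifalias is where get_ip_address() later reads the address back
ip addr add 10.0.0.2/24 dev mlx_0_1
echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias
ip link set mlx_0_0 up
ip link set mlx_0_1 up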
00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@210 -- # echo 10.0.0.1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_0/ifalias 00:19:55.577 10.0.0.1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@73 -- # set_ip mlx_0_1 167772162 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@204 -- # local dev=mlx_0_1 ip=167772162 in_ns= 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@205 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@207 -- # val_to_ip 167772162 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@11 -- # local val=167772162 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@13 -- # printf '%u.%u.%u.%u\n' 10 0 0 2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@207 -- # ip=10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@208 -- # eval ' ip addr add 10.0.0.2/24 dev mlx_0_1' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@208 -- # ip addr add 10.0.0.2/24 dev mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@210 -- # eval 'echo 10.0.0.2 | tee /sys/class/net/mlx_0_1/ifalias' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@210 -- # echo 10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@210 -- # tee /sys/class/net/mlx_0_1/ifalias 00:19:55.577 10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@75 -- # set_up mlx_0_0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@214 -- # local dev=mlx_0_0 in_ns= 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_0 up' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@217 -- # ip link set mlx_0_0 up 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@76 -- # set_up mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@214 -- # local dev=mlx_0_1 in_ns= 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@215 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@217 -- # eval ' ip link set mlx_0_1 up' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@217 -- # ip link set mlx_0_1 up 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@78 -- # [[ phy == veth ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@79 -- # [[ phy == veth ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@81 -- # [[ rdma == tcp ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@85 -- # dev_map["$key_initiator"]=mlx_0_0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@85 -- # dev_map["$key_target"]=mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@33 -- # (( _dev++, ip_pool += 2 )) 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@33 -- # (( _dev < max + no )) 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@38 -- # ping_ips 1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@96 -- # 
local pairs=1 pair 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@98 -- # (( pair = 0 )) 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@99 -- # get_rdma_initiator_ip_address 0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@107 -- # local dev=target0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@99 -- # ping_ip 10.0.0.2 NVMF_TARGET_NS_CMD 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns=NVMF_TARGET_NS_CMD count=1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@90 -- # [[ -n NVMF_TARGET_NS_CMD ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@90 -- # local -n ns=NVMF_TARGET_NS_CMD 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:19:55.577 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:19:55.577 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.036 ms 00:19:55.577 00:19:55.577 --- 10.0.0.2 ping statistics --- 00:19:55.577 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:19:55.577 rtt min/avg/max/mdev = 0.036/0.036/0.036/0.000 ms 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@100 -- # get_rdma_target_ip_address 0 NVMF_TARGET_NS_CMD 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@200 -- # get_target_ip_address 0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@107 -- # local dev=target0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@100 -- # ping_ip 10.0.0.2 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@89 -- # local ip=10.0.0.2 in_ns= count=1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@90 -- # [[ -n '' ]] 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@92 -- # eval ' ping -c 1 10.0.0.2' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@92 -- # ping -c 1 10.0.0.2 00:19:55.577 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 
00:19:55.577 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.026 ms 00:19:55.577 00:19:55.577 --- 10.0.0.2 ping statistics --- 00:19:55.577 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:19:55.577 rtt min/avg/max/mdev = 0.026/0.026/0.026/0.000 ms 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@98 -- # (( pair++ )) 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@98 -- # (( pair < pairs )) 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@266 -- # return 0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@294 -- # '[' '' == iso ']' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@302 -- # nvmf_legacy_env 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@331 -- # NVMF_TARGET_INTERFACE=mlx_0_1 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@332 -- # NVMF_TARGET_INTERFACE2=mlx_0_0 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@334 -- # get_rdma_initiator_ip_address 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address '' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:19:55.577 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@107 -- # local dev=target0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@334 -- # NVMF_FIRST_INITIATOR_IP=10.0.0.2 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@335 -- # get_rdma_initiator_ip_address 1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@192 -- # get_rdma_target_ip_address 1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:55.578 15:24:56 
nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # get_net_dev target1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@107 -- # local dev=target1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@335 -- # NVMF_SECOND_INITIATOR_IP=10.0.0.1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@337 -- # get_rdma_target_ip_address 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@200 -- # get_target_ip_address '' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@179 -- # get_ip_address target0 '' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@165 -- # local dev=target0 in_ns= ip 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # get_net_dev target0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@107 -- # local dev=target0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n target0 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_1 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@110 -- # echo mlx_0_1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # dev=mlx_0_1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_1/ifalias' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_1/ifalias 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # ip=10.0.0.2 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.2 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@175 -- # echo 10.0.0.2 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@337 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@338 -- # get_rdma_target_ip_address 1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@200 -- # get_target_ip_address 1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@179 -- # get_ip_address target1 '' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@165 -- # local dev=target1 in_ns= ip 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@166 -- # [[ -n '' ]] 00:19:55.578 
15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # get_net_dev target1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@107 -- # local dev=target1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n target1 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@109 -- # [[ -n mlx_0_0 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@110 -- # echo mlx_0_0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@168 -- # dev=mlx_0_0 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # eval ' cat /sys/class/net/mlx_0_0/ifalias' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # cat /sys/class/net/mlx_0_0/ifalias 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@172 -- # ip=10.0.0.1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@173 -- # [[ -n 10.0.0.1 ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@175 -- # echo 10.0.0.1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@338 -- # NVMF_SECOND_TARGET_IP=10.0.0.1 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@304 -- # NVMF_TRANSPORT_OPTS='-t rdma' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@305 -- # [[ rdma == \r\d\m\a ]] 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@306 -- # NVMF_TRANSPORT_OPTS='-t rdma --num-shared-buffers 1024' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@311 -- # '[' rdma == tcp ']' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@311 -- # '[' rdma == rdma ']' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@317 -- # modprobe nvme-rdma 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@69 -- # nvmfappstart -L nvme_auth 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@322 -- # timing_enter start_nvmf_tgt 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@724 -- # xtrace_disable 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@324 -- # nvmfpid=1857835 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@323 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -L nvme_auth 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@325 -- # waitforlisten 1857835 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@831 -- # '[' -z 1857835 ']' 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
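nvmfappstart, whose trace begins here, starts the target application and then blocks in waitforlisten until the RPC server is up. A minimal stand-alone approximation of those two steps, assuming the default /var/tmp/spdk.sock RPC socket used throughout this run:

# Start the NVMe-oF target with the flags recorded in the log and remember its pid.
/var/jenkins/workspace/nvmf-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -L nvme_auth &
nvmfpid=$!

# Rough stand-in for waitforlisten: poll until the RPC UNIX socket exists.
until [[ -S /var/tmp/spdk.sock ]]; do
    sleep 0.5
done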
00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:55.578 15:24:56 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@864 -- # return 0 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@326 -- # timing_exit start_nvmf_tgt 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@730 -- # xtrace_disable 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@327 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@70 -- # trap 'cat /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/nvme-auth.log; cleanup' SIGINT SIGTERM EXIT 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@73 -- # gen_dhchap_key null 32 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=null 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=32 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 16 /dev/urandom 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=63981196353a38cef0cca35653e558ae 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-null.XXX 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-null.MgL 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 63981196353a38cef0cca35653e558ae 0 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 63981196353a38cef0cca35653e558ae 0 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=63981196353a38cef0cca35653e558ae 00:19:55.578 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=0 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-null.MgL 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-null.MgL 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@73 -- # keys[0]=/tmp/spdk.key-null.MgL 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@73 -- # gen_dhchap_key sha512 64 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file 
key 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=sha512 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=64 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 32 /dev/urandom 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=84e9728db86570af1f210bb022c527f9d22fc45d31da7514b3aed43cb07d1522 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha512.XXX 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha512.xIU 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 84e9728db86570af1f210bb022c527f9d22fc45d31da7514b3aed43cb07d1522 3 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 84e9728db86570af1f210bb022c527f9d22fc45d31da7514b3aed43cb07d1522 3 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=84e9728db86570af1f210bb022c527f9d22fc45d31da7514b3aed43cb07d1522 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=3 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha512.xIU 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha512.xIU 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@73 -- # ckeys[0]=/tmp/spdk.key-sha512.xIU 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@74 -- # gen_dhchap_key null 48 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=null 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=48 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 24 /dev/urandom 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=2a6dc3d335ba1fbffb18ba65e3838402706df7655c5bafb0 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-null.XXX 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-null.w0T 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 2a6dc3d335ba1fbffb18ba65e3838402706df7655c5bafb0 0 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key 
DHHC-1 2a6dc3d335ba1fbffb18ba65e3838402706df7655c5bafb0 0 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=2a6dc3d335ba1fbffb18ba65e3838402706df7655c5bafb0 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=0 00:19:55.579 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-null.w0T 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-null.w0T 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@74 -- # keys[1]=/tmp/spdk.key-null.w0T 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@74 -- # gen_dhchap_key sha384 48 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.838 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=sha384 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=48 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 24 /dev/urandom 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=e061a704b40b75952d375b05c14fbafd76b0ac356eb3a137 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha384.XXX 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha384.ecD 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key e061a704b40b75952d375b05c14fbafd76b0ac356eb3a137 2 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 e061a704b40b75952d375b05c14fbafd76b0ac356eb3a137 2 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=e061a704b40b75952d375b05c14fbafd76b0ac356eb3a137 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=2 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha384.ecD 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha384.ecD 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@74 -- # ckeys[1]=/tmp/spdk.key-sha384.ecD 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@75 -- # gen_dhchap_key sha256 32 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # 
digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=sha256 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=32 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 16 /dev/urandom 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=e066c73556cffc0841a2d2bed55fce33 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha256.XXX 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha256.urL 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key e066c73556cffc0841a2d2bed55fce33 1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 e066c73556cffc0841a2d2bed55fce33 1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=e066c73556cffc0841a2d2bed55fce33 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha256.urL 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha256.urL 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@75 -- # keys[2]=/tmp/spdk.key-sha256.urL 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@75 -- # gen_dhchap_key sha256 32 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=sha256 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=32 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 16 /dev/urandom 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=553e41a557afe1bb38cc9edfc66c4bd5 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha256.XXX 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha256.OUs 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 553e41a557afe1bb38cc9edfc66c4bd5 1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 553e41a557afe1bb38cc9edfc66c4bd5 1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # 
prefix=DHHC-1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=553e41a557afe1bb38cc9edfc66c4bd5 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=1 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha256.OUs 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha256.OUs 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@75 -- # ckeys[2]=/tmp/spdk.key-sha256.OUs 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@76 -- # gen_dhchap_key sha384 48 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=sha384 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=48 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 24 /dev/urandom 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=0f4c031eba36ed21f531156cc6cdb2051d2b1338dadbbbd4 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha384.XXX 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha384.GPR 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 0f4c031eba36ed21f531156cc6cdb2051d2b1338dadbbbd4 2 00:19:55.839 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 0f4c031eba36ed21f531156cc6cdb2051d2b1338dadbbbd4 2 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=0f4c031eba36ed21f531156cc6cdb2051d2b1338dadbbbd4 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=2 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha384.GPR 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha384.GPR 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@76 -- # keys[3]=/tmp/spdk.key-sha384.GPR 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@76 -- # gen_dhchap_key null 32 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=null 
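Each gen_dhchap_key call in this stretch draws len/2 random bytes as hex with xxd and hands the hex string to format_key, which the trace shows invoking an inline python snippet to produce the final secret file. The sketch below mirrors gen_dhchap_key null 32; the DHHC-1 wrapping is an assumption based on the standard NVMe DH-HMAC-CHAP secret representation (base64 of the secret followed by its CRC-32), since the python body itself is not echoed in the log.

# Assumed equivalent of gen_dhchap_key null 32 / format_key DHHC-1 <key> 0.
digest_id=0                              # null=0, sha256=1, sha384=2, sha512=3 per the digests map above
key=$(xxd -p -c0 -l 16 /dev/urandom)     # 32 hex characters
file=$(mktemp -t spdk.key-null.XXX)
python3 - "$key" "$digest_id" > "$file" << 'EOF'
import base64, sys, zlib
secret = sys.argv[1].encode()
digest_id = int(sys.argv[2])
crc = zlib.crc32(secret).to_bytes(4, "little")   # CRC-32 appended little-endian (assumption)
print(f"DHHC-1:{digest_id:02x}:{base64.b64encode(secret + crc).decode()}:")
EOF
chmod 0600 "$file"
echo "$file"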
00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=32 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 16 /dev/urandom 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=577ae1625149885a5ee9286e1f68f7a7 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-null.XXX 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-null.xZP 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 577ae1625149885a5ee9286e1f68f7a7 0 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 577ae1625149885a5ee9286e1f68f7a7 0 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=577ae1625149885a5ee9286e1f68f7a7 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=0 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-null.xZP 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-null.xZP 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@76 -- # ckeys[3]=/tmp/spdk.key-null.xZP 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@77 -- # gen_dhchap_key sha512 64 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@521 -- # local digest len file key 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # digests=(['null']='0' ['sha256']='1' ['sha384']='2' ['sha512']='3') 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@522 -- # local -A digests 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # digest=sha512 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@524 -- # len=64 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # xxd -p -c0 -l 32 /dev/urandom 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@525 -- # key=105f41ecb33a2c5b1d42a05d0e8f007c9b0c3002db1bea70cccac826c548c3cb 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # mktemp -t spdk.key-sha512.XXX 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@526 -- # file=/tmp/spdk.key-sha512.GPy 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@527 -- # format_dhchap_key 105f41ecb33a2c5b1d42a05d0e8f007c9b0c3002db1bea70cccac826c548c3cb 3 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@517 -- # format_key DHHC-1 105f41ecb33a2c5b1d42a05d0e8f007c9b0c3002db1bea70cccac826c548c3cb 3 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@500 -- # local prefix key digest 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # prefix=DHHC-1 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # key=105f41ecb33a2c5b1d42a05d0e8f007c9b0c3002db1bea70cccac826c548c3cb 00:19:56.099 15:24:57 
nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@502 -- # digest=3 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@503 -- # python - 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@528 -- # chmod 0600 /tmp/spdk.key-sha512.GPy 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@530 -- # echo /tmp/spdk.key-sha512.GPy 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@77 -- # keys[4]=/tmp/spdk.key-sha512.GPy 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@77 -- # ckeys[4]= 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@79 -- # waitforlisten 1857835 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@831 -- # '[' -z 1857835 ']' 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:56.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:56.099 15:24:57 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@864 -- # return 0 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@80 -- # for i in "${!keys[@]}" 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@81 -- # rpc_cmd keyring_file_add_key key0 /tmp/spdk.key-null.MgL 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # [[ -n /tmp/spdk.key-sha512.xIU ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # rpc_cmd keyring_file_add_key ckey0 /tmp/spdk.key-sha512.xIU 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@80 -- # for i in "${!keys[@]}" 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@81 -- # rpc_cmd keyring_file_add_key key1 /tmp/spdk.key-null.w0T 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- 
# [[ -n /tmp/spdk.key-sha384.ecD ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # rpc_cmd keyring_file_add_key ckey1 /tmp/spdk.key-sha384.ecD 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@80 -- # for i in "${!keys[@]}" 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@81 -- # rpc_cmd keyring_file_add_key key2 /tmp/spdk.key-sha256.urL 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # [[ -n /tmp/spdk.key-sha256.OUs ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # rpc_cmd keyring_file_add_key ckey2 /tmp/spdk.key-sha256.OUs 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@80 -- # for i in "${!keys[@]}" 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@81 -- # rpc_cmd keyring_file_add_key key3 /tmp/spdk.key-sha384.GPR 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # [[ -n /tmp/spdk.key-null.xZP ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # rpc_cmd keyring_file_add_key ckey3 /tmp/spdk.key-null.xZP 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@80 -- # for i in "${!keys[@]}" 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@81 -- # rpc_cmd keyring_file_add_key key4 /tmp/spdk.key-sha512.GPy 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@82 -- # [[ -n '' ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@85 -- # nvmet_auth_init 00:19:56.359 15:24:58 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@35 -- # configure_kernel_target nqn.2024-02.io.spdk:cnode0 10.0.0.2 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@430 -- # local kernel_name=nqn.2024-02.io.spdk:cnode0 kernel_target_ip=10.0.0.2 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@432 -- # nvmet=/sys/kernel/config/nvmet 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@433 -- # kernel_subsystem=/sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@434 -- # kernel_namespace=/sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0/namespaces/1 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@435 -- # kernel_port=/sys/kernel/config/nvmet/ports/1 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@437 -- # local block nvme 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@439 -- # [[ ! -e /sys/module/nvmet ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@440 -- # modprobe nvmet 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@443 -- # [[ -e /sys/kernel/config/nvmet ]] 00:19:56.359 15:24:58 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@445 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh reset 00:19:59.648 Waiting for block devices as requested 00:19:59.648 0000:5e:00.0 (144d a80a): vfio-pci -> nvme 00:19:59.648 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:19:59.906 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:19:59.906 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:19:59.906 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:20:00.165 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:20:00.165 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:20:00.165 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:20:00.165 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:20:00.424 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:20:00.424 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:20:00.424 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:20:00.683 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:20:00.683 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:20:00.683 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:20:00.941 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:20:00.942 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@448 -- # for block in /sys/block/nvme* 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@449 -- # [[ -e /sys/block/nvme0n1 ]] 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@450 -- # is_block_zoned nvme0n1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@451 -- # block_in_use nvme0n1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@390 -- # 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:20:01.878 No valid GPT data, bailing 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@394 -- # pt= 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- scripts/common.sh@395 -- # return 1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@451 -- # nvme=/dev/nvme0n1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@454 -- # [[ -b /dev/nvme0n1 ]] 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@456 -- # mkdir /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@457 -- # mkdir /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0/namespaces/1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@458 -- # mkdir /sys/kernel/config/nvmet/ports/1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@463 -- # echo SPDK-nqn.2024-02.io.spdk:cnode0 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@465 -- # echo 1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@466 -- # echo /dev/nvme0n1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@467 -- # echo 1 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@469 -- # echo 10.0.0.2 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@470 -- # echo rdma 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@471 -- # echo 4420 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@472 -- # echo ipv4 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@475 -- # ln -s /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0 /sys/kernel/config/nvmet/ports/1/subsystems/ 00:20:01.878 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@478 -- # nvme discover --hostnqn=nqn.2014-08.org.nvmexpress:uuid:00e1c02b-5999-e811-99d6-a4bf01488b4e --hostid=00e1c02b-5999-e811-99d6-a4bf01488b4e -a 10.0.0.2 -t rdma -s 4420 00:20:02.137 00:20:02.137 Discovery Log Number of Records 2, Generation counter 2 00:20:02.137 =====Discovery Log Entry 0====== 00:20:02.137 trtype: rdma 00:20:02.137 adrfam: ipv4 00:20:02.137 subtype: current discovery subsystem 00:20:02.137 treq: not specified, sq flow control disable supported 00:20:02.137 portid: 1 00:20:02.137 trsvcid: 4420 00:20:02.137 subnqn: nqn.2014-08.org.nvmexpress.discovery 00:20:02.137 traddr: 10.0.0.2 00:20:02.137 eflags: none 00:20:02.137 rdma_prtype: not specified 00:20:02.137 rdma_qptype: connected 00:20:02.138 rdma_cms: rdma-cm 00:20:02.138 rdma_pkey: 0x0000 00:20:02.138 =====Discovery Log Entry 1====== 00:20:02.138 trtype: rdma 00:20:02.138 adrfam: ipv4 00:20:02.138 subtype: nvme subsystem 00:20:02.138 treq: not specified, sq flow control disable supported 00:20:02.138 portid: 1 00:20:02.138 trsvcid: 4420 00:20:02.138 subnqn: nqn.2024-02.io.spdk:cnode0 00:20:02.138 traddr: 10.0.0.2 00:20:02.138 eflags: none 00:20:02.138 rdma_prtype: not specified 00:20:02.138 rdma_qptype: connected 00:20:02.138 rdma_cms: rdma-cm 00:20:02.138 rdma_pkey: 0x0000 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@36 -- # mkdir /sys/kernel/config/nvmet/hosts/nqn.2024-02.io.spdk:host0 
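[editor's note] configure_kernel_target above drives the in-kernel nvmet target purely through configfs: modprobe nvmet, create the subsystem and a namespace backed by /dev/nvme0n1, create an RDMA port on 10.0.0.2:4420, and link the port to the subsystem, after which nvme discover lists both the discovery subsystem and nqn.2024-02.io.spdk:cnode0. Written out directly against the standard nvmet configfs attributes (paths, NQN, address and port copied from the records; this is a schematic, not the harness function itself, and the nvmet_rdma modprobe is an assumption since the log only shows modprobe nvmet), the equivalent steps are roughly:

    nvmet=/sys/kernel/config/nvmet
    subsys=$nvmet/subsystems/nqn.2024-02.io.spdk:cnode0
    modprobe nvmet
    modprobe nvmet_rdma                                    # rdma transport for the port below (assumed)
    mkdir "$subsys"
    mkdir "$subsys/namespaces/1"
    mkdir "$nvmet/ports/1"
    echo /dev/nvme0n1 > "$subsys/namespaces/1/device_path"
    echo 1            > "$subsys/namespaces/1/enable"
    echo 10.0.0.2     > "$nvmet/ports/1/addr_traddr"
    echo rdma         > "$nvmet/ports/1/addr_trtype"
    echo 4420         > "$nvmet/ports/1/addr_trsvcid"
    echo ipv4         > "$nvmet/ports/1/addr_adrfam"
    ln -s "$subsys" "$nvmet/ports/1/subsystems/"

The next records then restrict access to a single host: a hosts/nqn.2024-02.io.spdk:host0 entry is created and linked into the subsystem's allowed_hosts, which is what the per-key authentication below is exercised against.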
00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@37 -- # echo 0 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@38 -- # ln -s /sys/kernel/config/nvmet/hosts/nqn.2024-02.io.spdk:host0 /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0/allowed_hosts/nqn.2024-02.io.spdk:host0 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@88 -- # nvmet_auth_set_key sha256 ffdhe2048 1 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@93 -- # IFS=, 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@94 -- # printf %s sha256,sha384,sha512 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@93 -- # IFS=, 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@94 -- # printf %s ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@93 -- # connect_authenticate sha256,sha384,sha512 ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 1 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256,sha384,sha512 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256,sha384,sha512 --dhchap-dhgroups ffdhe2048,ffdhe3072,ffdhe4096,ffdhe6144,ffdhe8192 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.138 15:25:03 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.138 nvme0n1 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:02.138 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.397 15:25:03 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@100 -- # for digest in "${digests[@]}" 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe2048 0 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host 
-- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe2048 0 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.398 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.658 nvme0n1 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe2048 1 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host 
-- host/auth.sh@44 -- # keyid=1 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe2048 1 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.658 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.918 nvme0n1 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd 
bdev_nvme_detach_controller nvme0 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe2048 2 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe2048 2 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:02.918 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.178 nvme0n1 
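[editor's note] Each nvmet_auth_set_key / connect_authenticate pair above repeats the same sequence: the kernel host entry's dhchap attributes are written on the target side, bdev_nvme_set_options narrows what SPDK may negotiate to one digest and one DH group, the controller is attached with the matching keyring entries, its presence is checked with bdev_nvme_get_controllers, and it is detached again. As a standalone fragment for the sha256/ffdhe2048/keyid=2 pass just shown (key values, NQNs and addresses copied from the records; rpc_cmd in the log is assumed to wrap SPDK's scripts/rpc.py against /var/tmp/spdk.sock, and the dhchap_* attribute names are the standard nvmet host entries that nvmet_auth_set_key hides):

    rpc=/var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/rpc.py
    host=/sys/kernel/config/nvmet/hosts/nqn.2024-02.io.spdk:host0

    # target side: what the host must present, and what the controller answers with
    echo 'hmac(sha256)' > "$host/dhchap_hash"
    echo 'ffdhe2048'    > "$host/dhchap_dhgroup"
    echo 'DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2:'    > "$host/dhchap_key"
    echo 'DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c:'    > "$host/dhchap_ctrl_key"

    # host side: key2/ckey2 were registered once at startup with keyring_file_add_key
    $rpc bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048
    $rpc bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 \
        -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 \
        --dhchap-key key2 --dhchap-ctrlr-key ckey2
    [ "$($rpc bdev_nvme_get_controllers | jq -r '.[].name')" = nvme0 ]
    $rpc bdev_nvme_detach_controller nvme0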
00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe2048 3 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe2048 3 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd 
bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.178 15:25:04 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.438 nvme0n1 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe2048 4 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # 
connect_authenticate sha256 ffdhe2048 4 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.438 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.698 nvme0n1 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.698 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe3072 0 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # 
key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe3072 0 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.699 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.958 nvme0n1 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:03.958 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe3072 1 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe3072 1 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:03.959 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 
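[editor's note] The remaining records repeat exactly this attach/verify/detach cycle across the matrix announced at host/auth.sh@93-@94: three digests, five FFDHE groups, and the five key indexes. Schematically (loop variables and the two helper names are the ones visible in the xtrace; the loop bodies are the steps already shown above):

    for digest in sha256 sha384 sha512; do
      for dhgroup in ffdhe2048 ffdhe3072 ffdhe4096 ffdhe6144 ffdhe8192; do
        for keyid in 0 1 2 3 4; do
          nvmet_auth_set_key   "$digest" "$dhgroup" "$keyid"   # target side: dhchap configfs attrs
          connect_authenticate "$digest" "$dhgroup" "$keyid"   # host side: set_options, attach, verify, detach
        done
      done
    done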
00:20:04.223 nvme0n1 00:20:04.223 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.223 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:04.223 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:04.223 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.223 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.223 15:25:05 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe3072 2 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe3072 2 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:04.223 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests 
sha256 --dhchap-dhgroups ffdhe3072 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.224 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.486 nvme0n1 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.486 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe3072 3 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:04.745 15:25:06 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe3072 3 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.745 nvme0n1 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:04.745 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe3072 4 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@44 -- # keyid=4 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe3072 4 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe3072 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.004 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.262 nvme0n1 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe4096 0 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe4096 0 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.262 15:25:06 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.520 nvme0n1 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe4096 1 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe4096 1 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd 
bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.520 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.778 nvme0n1 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:05.778 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe4096 2 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe4096 2 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.037 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.295 nvme0n1 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.295 15:25:07 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe4096 3 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:06.295 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:06.296 
15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe4096 3 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.296 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.554 nvme0n1 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe4096 4 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe4096 4 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe4096 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.554 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:06.813 nvme0n1 00:20:06.813 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:06.813 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:06.813 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 
-- # jq -r '.[].name' 00:20:06.813 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:06.813 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe6144 0 00:20:07.071 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe6144 0 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:20:07.072 15:25:08 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.072 15:25:08 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.331 nvme0n1 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.331 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe6144 1 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe6144 1 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.590 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.850 nvme0n1 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe6144 2 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # 
keyid=2 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe6144 2 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:07.850 15:25:09 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.418 nvme0n1 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 
00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe6144 3 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:08.418 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe6144 3 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.419 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.992 nvme0n1 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 
0 ]] 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe6144 4 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:08.992 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe6144 4 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe6144 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:08.993 15:25:10 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:08.993 15:25:10 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:09.253 nvme0n1 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe8192 0 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 
00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe8192 0 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:09.253 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:09.254 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:09.824 nvme0n1 00:20:09.824 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:09.824 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:09.824 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:09.824 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:09.824 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.084 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.084 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:10.084 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe8192 1 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # 
key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe8192 1 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.085 15:25:11 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.656 nvme0n1 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe8192 2 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe8192 2 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:10.656 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:11.227 nvme0n1 00:20:11.227 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 
== 0 ]] 00:20:11.227 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:11.227 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:11.227 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:11.227 15:25:12 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe8192 3 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe8192 3 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:20:11.227 15:25:13 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:11.227 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:11.488 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:11.488 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:11.488 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:11.488 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.078 nvme0n1 00:20:12.078 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha256 ffdhe8192 4 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha256 ffdhe8192 4 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha256 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe8192 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.079 15:25:13 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.672 nvme0n1 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@100 -- # for digest in "${digests[@]}" 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe2048 0 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # 
key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe2048 0 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:12.672 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.673 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.934 nvme0n1 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe2048 1 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe2048 1 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:12.934 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 
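Each iteration in this trace exercises one (digest, dhgroup, keyid) combination of SPDK's DH-HMAC-CHAP support: host/auth.sh installs the target-side secret for that combination (the echoed 'hmac(shaXXX)', ffdheNNNN and DHHC-1 strings), narrows the host to the same digest and DH group via bdev_nvme_set_options, performs an authenticated bdev_nvme_attach_controller with --dhchap-key keyN (plus --dhchap-ctrlr-key ckeyN whenever a controller secret exists), checks with bdev_nvme_get_controllers that nvme0 appeared, and detaches again. A condensed sketch of that loop, reconstructed from the xtrace lines above (auth.sh@100-104, @42-51, @55-65); the rpc_cmd helper, the keys/ckeys arrays and nvmet_auth_set_key are the script's own and their bodies are abbreviated here:

    for digest in "${digests[@]}"; do
        for dhgroup in "${dhgroups[@]}"; do
            for keyid in "${!keys[@]}"; do
                # Target side: install 'hmac(<digest>)', the DH group and the DHHC-1 secrets for this keyid
                nvmet_auth_set_key "$digest" "$dhgroup" "$keyid"
                # Host side: allow only the digest/dhgroup under test
                rpc_cmd bdev_nvme_set_options --dhchap-digests "$digest" --dhchap-dhgroups "$dhgroup"
                # Controller (bidirectional) secret is optional; keyid 4 has none, so this expands to nothing there
                ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"})
                # Authenticated connect to the target over RDMA
                rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 \
                    -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 \
                    --dhchap-key "key${keyid}" "${ckey[@]}"
                # Authentication succeeded if the controller shows up under its expected name
                [[ $(rpc_cmd bdev_nvme_get_controllers | jq -r '.[].name') == "nvme0" ]]
                rpc_cmd bdev_nvme_detach_controller nvme0
            done
        done
    done

The secrets exchanged throughout the trace use the DHHC-1:<t>:<base64>: representation; the two-digit <t> field (00-03 in this run) identifies the hash transformation applied to the secret, with 00 denoting an untransformed secret. Only keyids 0 through 3 carry a controller secret, so the keyid 4 pass covers the unidirectional case in which the host authenticates to the controller but does not authenticate the controller in return.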
00:20:13.195 nvme0n1 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe2048 2 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:13.195 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe2048 2 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests 
sha384 --dhchap-dhgroups ffdhe2048 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.196 15:25:14 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.456 nvme0n1 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe2048 3 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:13.456 15:25:15 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe2048 3 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.456 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.716 nvme0n1 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe2048 4 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@44 -- # keyid=4 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe2048 4 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe2048 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.716 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.977 nvme0n1 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe3072 0 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe3072 0 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:13.977 15:25:15 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.237 nvme0n1 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.237 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.497 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.497 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:14.497 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe3072 1 00:20:14.497 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe3072 1 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd 
bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.498 nvme0n1 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.498 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe3072 2 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe3072 2 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:14.758 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:14.759 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.019 nvme0n1 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe3072 3 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:15.019 
15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe3072 3 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:20:15.019 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.020 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.020 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.020 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:15.020 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.020 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.280 nvme0n1 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe3072 4 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe3072 4 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe3072 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.280 15:25:16 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.280 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.280 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:15.280 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.280 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.540 nvme0n1 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe4096 0 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe4096 0 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:20:15.540 15:25:17 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.540 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.801 nvme0n1 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe4096 1 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe4096 1 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:15.801 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.061 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.061 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:16.061 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.061 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.322 nvme0n1 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe4096 2 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # 
keyid=2 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe4096 2 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:20:16.322 15:25:17 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.322 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.322 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.322 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:16.322 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.322 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.583 nvme0n1 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 
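Each iteration in the trace above reduces to the same four host-side RPC calls: restrict the allowed DH-HMAC-CHAP digest and DH group, attach with the key id under test, confirm the controller actually came up, and detach before the next key id. The sketch below replays one ffdhe4096/key2 pass stand-alone; it assumes SPDK's scripts/rpc.py is invoked directly in place of the harness's rpc_cmd wrapper, and that the key names key2/ckey2 were registered with the keyring earlier in the run, as they are in this test.

  # Hedged sketch of one connect_authenticate pass (the rpc.py path is an assumption).
  rpc=./scripts/rpc.py
  $rpc bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096
  $rpc bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 \
      -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 \
      --dhchap-key key2 --dhchap-ctrlr-key ckey2   # key names set up earlier in the run
  [[ "$($rpc bdev_nvme_get_controllers | jq -r '.[].name')" == "nvme0" ]]   # controller came up?
  $rpc bdev_nvme_detach_controller nvme0           # tear down before the next key id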
00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe4096 3 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe4096 3 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.583 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.843 nvme0n1 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 
0 ]] 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:16.843 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe4096 4 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:17.103 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe4096 4 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe4096 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.104 15:25:18 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.104 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.364 nvme0n1 00:20:17.364 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.364 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:17.364 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.364 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:17.364 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.364 15:25:18 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe6144 0 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 
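The nvmet_auth_set_key calls interleaved above provision the target side before each connect: xtrace prints only the echo payloads (the digest string, the DH group, the key and, when present, the controller key) because the redirections themselves are not traced. A minimal sketch of where those payloads usually land on a Linux nvmet target follows; the configfs paths and attribute names are assumptions based on the kernel's nvmet auth interface, not something visible in this log.

  # Hedged sketch of the target-side half of one iteration (configfs paths are assumptions).
  host=/sys/kernel/config/nvmet/hosts/nqn.2024-02.io.spdk:host0
  echo 'hmac(sha384)'    > "$host/dhchap_hash"      # digest offered for DH-HMAC-CHAP
  echo 'ffdhe6144'       > "$host/dhchap_dhgroup"   # DH group under test
  echo 'DHHC-1:00:<key>' > "$host/dhchap_key"       # host secret (placeholder, not the real key)
  echo 'DHHC-1:03:<ckey>' > "$host/dhchap_ctrl_key" # optional controller secret for bidirectional auth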
00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe6144 0 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.364 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.625 nvme0n1 00:20:17.625 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.625 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:17.625 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.625 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:17.625 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.625 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe6144 1 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # 
key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe6144 1 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:17.885 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.145 nvme0n1 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe6144 2 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe6144 2 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.145 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.405 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.405 15:25:19 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:18.405 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.405 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.664 nvme0n1 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 
== 0 ]] 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.664 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe6144 3 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe6144 3 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:20:18.665 15:25:20 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:18.665 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.235 nvme0n1 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe6144 4 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe6144 4 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe6144 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.235 15:25:20 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.806 nvme0n1 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe8192 0 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:19.806 15:25:21 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe8192 0 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:19.806 15:25:21 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.377 nvme0n1 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.377 15:25:22 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe8192 1 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:20.377 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe8192 1 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.378 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.947 nvme0n1 00:20:20.947 15:25:22 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe8192 2 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe8192 2 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 
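All of the secrets exercised in this run share the textual DH-HMAC-CHAP key representation, DHHC-1:<transform>:<base64 payload>:, where the two-digit field flags how (or whether) the secret was hash-transformed and the base64 payload carries the raw secret with a short checksum suffix; that reading of the fields reflects the NVMe secret format as understood here, not anything the trace itself states. The sketch below splits one of the keys printed above and checks the decoded payload length.

  # Hedged sketch: inspect one DHHC-1 secret taken verbatim from the trace above.
  key='DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2:'
  IFS=: read -r magic transform payload _ <<< "$key"
  echo "magic=$magic transform=$transform"
  echo -n "$payload" | base64 -d | wc -c   # payload bytes: secret plus checksum suffix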
00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:20.947 15:25:22 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:21.519 nvme0n1 00:20:21.519 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:21.519 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:21.519 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:21.519 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:21.519 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:21.519 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe8192 3 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo 
DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe8192 3 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:21.779 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:21.780 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:21.780 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:21.780 15:25:23 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.350 nvme0n1 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha384 ffdhe8192 4 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha384 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:22.350 15:25:24 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha384)' 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha384 ffdhe8192 4 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha384 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha384 --dhchap-dhgroups ffdhe8192 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.350 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.920 nvme0n1 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.920 15:25:24 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@100 -- # for digest in "${digests[@]}" 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe2048 0 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe2048 0 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:22.920 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.180 nvme0n1 00:20:23.180 15:25:24 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.180 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:23.180 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.180 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:23.180 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.180 15:25:24 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.181 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:23.181 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:23.181 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.181 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe2048 1 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe2048 1 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:23.441 15:25:25 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.441 nvme0n1 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.441 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.701 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.701 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:23.701 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe2048 2 00:20:23.701 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:23.701 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:23.701 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 
]] 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe2048 2 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.702 nvme0n1 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.702 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.962 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:23.962 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:23.962 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.962 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.962 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.962 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe2048 3 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:23.963 15:25:25 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe2048 3 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.963 nvme0n1 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:23.963 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.221 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 
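Each pass of the trace above exercises the same RPC sequence; as a condensed, hedged sketch of one iteration (using the suite's rpc_cmd wrapper and the target address and NQNs that appear in the log, with keyN/ckeyN standing in for the DHHC-1 secrets already registered under those names, and sha512/ffdhe2048/key0 as one example digest/dhgroup/keyid combination):

# restrict the host to one DH-HMAC-CHAP digest and DH group, then connect with the host and controller keys
rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048
rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0
# verify the authenticated controller came up, then tear it down before the next digest/dhgroup/keyid pass
rpc_cmd bdev_nvme_get_controllers | jq -r '.[].name'   # expected to print nvme0
rpc_cmd bdev_nvme_detach_controller nvme0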
00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe2048 4 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe2048 4 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe2048 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe2048 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.222 15:25:25 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.482 nvme0n1 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:24.482 
15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe3072 0 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe3072 0 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 
--dhchap-dhgroups ffdhe3072 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.482 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.743 nvme0n1 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe3072 1 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:24.743 
15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe3072 1 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:24.743 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:24.744 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.005 nvme0n1 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe3072 2 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:25.005 15:25:26 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe3072 2 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.005 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.266 nvme0n1 00:20:25.266 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.266 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:25.266 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.266 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.266 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:25.266 15:25:26 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.266 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:25.266 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:25.266 15:25:27 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.266 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.266 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.266 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:25.266 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe3072 3 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe3072 3 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.267 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.528 nvme0n1 00:20:25.528 
15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe3072 4 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe3072 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe3072 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe3072 4 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe3072 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe3072 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@10 -- # set +x 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.528 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.790 nvme0n1 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:25.790 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe4096 0 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo 
DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe4096 0 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.051 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.312 nvme0n1 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe4096 1 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # 
keyid=1 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:26.312 15:25:27 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe4096 1 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.312 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.574 nvme0n1 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd 
bdev_nvme_detach_controller nvme0 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe4096 2 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe4096 2 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.574 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.834 nvme0n1 
00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:26.834 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe4096 3 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe4096 3 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd 
bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.095 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.356 nvme0n1 00:20:27.356 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.356 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:27.356 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.356 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:27.356 15:25:28 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe4096 4 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe4096 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe4096 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # 
connect_authenticate sha512 ffdhe4096 4 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe4096 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe4096 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.356 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.617 nvme0n1 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe6144 0 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # 
key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe6144 0 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:27.617 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.193 nvme0n1 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe6144 1 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe6144 1 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.193 15:25:29 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 
00:20:28.767 nvme0n1 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe6144 2 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe6144 2 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:28.767 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests 
sha512 --dhchap-dhgroups ffdhe6144 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:28.768 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.031 nvme0n1 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe6144 3 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:29.031 15:25:30 
nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe6144 3 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.031 15:25:30 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.601 nvme0n1 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe6144 4 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:29.601 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe6144 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@44 -- # keyid=4 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe6144 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe6144 4 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe6144 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe6144 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:29.602 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.172 nvme0n1 00:20:30.172 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.172 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:30.172 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:30.172 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@101 -- # for dhgroup in "${dhgroups[@]}" 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe8192 0 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=0 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:NjM5ODExOTYzNTNhMzhjZWYwY2NhMzU2NTNlNTU4YWXmGyzp: 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: ]] 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:03:ODRlOTcyOGRiODY1NzBhZjFmMjEwYmIwMjJjNTI3ZjlkMjJmYzQ1ZDMxZGE3NTE0YjNhZWQ0M2NiMDdkMTUyMpMRLD0=: 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe8192 0 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=0 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key0 --dhchap-ctrlr-key ckey0 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.173 15:25:31 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.744 nvme0n1 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe8192 1 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe8192 1 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=1 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd 
bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:30.744 15:25:32 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:31.315 nvme0n1 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe8192 2 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=2 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:01:ZTA2NmM3MzU1NmNmZmMwODQxYTJkMmJlZDU1ZmNlMzN4CHA2: 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: ]] 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
host/auth.sh@51 -- # echo DHHC-1:01:NTUzZTQxYTU1N2FmZTFiYjM4Y2M5ZWRmYzY2YzRiZDWGDs/c: 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe8192 2 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=2 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 --dhchap-ctrlr-key ckey2 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:31.315 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:31.885 nvme0n1 00:20:31.885 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:31.885 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:31.885 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:31.885 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:31.885 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe8192 3 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:32.146 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=3 00:20:32.147 
15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:02:MGY0YzAzMWViYTM2ZWQyMWY1MzExNTZjYzZjZGIyMDUxZDJiMTMzOGRhZGJiYmQ0LliitA==: 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: ]] 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:00:NTc3YWUxNjI1MTQ5ODg1YTVlZTkyODZlMWY2OGY3YTcy1uWK: 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe8192 3 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=3 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key3 --dhchap-ctrlr-key ckey3 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.147 15:25:33 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.717 nvme0n1 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@102 -- # for keyid in "${!keys[@]}" 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@103 -- # nvmet_auth_set_key sha512 ffdhe8192 4 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha512 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe8192 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=4 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey= 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha512)' 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe8192 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:03:MTA1ZjQxZWNiMzNhMmM1YjFkNDJhMDVkMGU4ZjAwN2M5YjBjMzAwMmRiMWJlYTcwY2NjYWM4MjZjNTQ4YzNjYjRoFC0=: 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z '' ]] 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@104 -- # connect_authenticate sha512 ffdhe8192 4 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@55 -- # local digest dhgroup keyid ckey 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # digest=sha512 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # dhgroup=ffdhe8192 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@57 -- # keyid=4 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@58 -- # ckey=(${ckeys[keyid]:+--dhchap-ctrlr-key "ckey${keyid}"}) 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@60 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha512 --dhchap-dhgroups ffdhe8192 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@61 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key4 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:32.717 15:25:34 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.286 nvme0n1 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # rpc_cmd bdev_nvme_get_controllers 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # jq -r '.[].name' 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@64 -- # [[ nvme0 == \n\v\m\e\0 ]] 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@65 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.286 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@110 -- # nvmet_auth_set_key sha256 ffdhe2048 1 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@42 -- # local digest dhgroup keyid key ckey 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # digest=sha256 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # dhgroup=ffdhe2048 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@44 -- # keyid=1 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@45 -- # key=DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@46 -- # ckey=DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@48 -- # echo 'hmac(sha256)' 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@49 -- # echo ffdhe2048 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@50 -- # echo DHHC-1:00:MmE2ZGMzZDMzNWJhMWZiZmZiMThiYTY1ZTM4Mzg0MDI3MDZkZjc2NTVjNWJhZmIw51v6cw==: 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # [[ -z DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: ]] 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@51 -- # echo DHHC-1:02:ZTA2MWE3MDRiNDBiNzU5NTJkMzc1YjA1YzE0ZmJhZmQ3NmIwYWMzNTZlYjNhMTM3jfuN5w==: 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@111 -- # rpc_cmd bdev_nvme_set_options --dhchap-digests sha256 --dhchap-dhgroups ffdhe2048 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@112 -- # NOT rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@650 -- # local es=0 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 00:20:33.546 
15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:33.546 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@653 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.547 request: 00:20:33.547 { 00:20:33.547 "name": "nvme0", 00:20:33.547 "trtype": "rdma", 00:20:33.547 "traddr": "10.0.0.2", 00:20:33.547 "adrfam": "ipv4", 00:20:33.547 "trsvcid": "4420", 00:20:33.547 "subnqn": "nqn.2024-02.io.spdk:cnode0", 00:20:33.547 "hostnqn": "nqn.2024-02.io.spdk:host0", 00:20:33.547 "prchk_reftag": false, 00:20:33.547 "prchk_guard": false, 00:20:33.547 "hdgst": false, 00:20:33.547 "ddgst": false, 00:20:33.547 "allow_unrecognized_csi": false, 00:20:33.547 "method": "bdev_nvme_attach_controller", 00:20:33.547 "req_id": 1 00:20:33.547 } 00:20:33.547 Got JSON-RPC error response 00:20:33.547 response: 00:20:33.547 { 00:20:33.547 "code": -5, 00:20:33.547 "message": "Input/output error" 00:20:33.547 } 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@653 -- # es=1 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@114 -- # rpc_cmd bdev_nvme_get_controllers 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@114 -- # jq length 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@114 -- # (( 0 == 0 )) 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@117 -- # NOT rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@650 -- # local es=0 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:20:33.547 15:25:35 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@653 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key2 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.547 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.843 request: 00:20:33.843 { 00:20:33.843 "name": "nvme0", 00:20:33.843 "trtype": "rdma", 00:20:33.843 "traddr": "10.0.0.2", 00:20:33.843 "adrfam": "ipv4", 00:20:33.843 "trsvcid": "4420", 00:20:33.843 "subnqn": "nqn.2024-02.io.spdk:cnode0", 00:20:33.843 "hostnqn": "nqn.2024-02.io.spdk:host0", 00:20:33.843 "prchk_reftag": false, 00:20:33.843 "prchk_guard": false, 00:20:33.843 "hdgst": false, 00:20:33.843 "ddgst": false, 00:20:33.843 "dhchap_key": "key2", 00:20:33.843 "allow_unrecognized_csi": false, 00:20:33.843 "method": "bdev_nvme_attach_controller", 00:20:33.843 "req_id": 1 00:20:33.843 } 00:20:33.843 Got JSON-RPC error response 00:20:33.843 response: 00:20:33.843 { 00:20:33.843 "code": -5, 00:20:33.843 "message": "Input/output error" 00:20:33.843 } 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@653 -- # es=1 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@120 -- # rpc_cmd bdev_nvme_get_controllers 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@120 -- # jq length 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@120 -- # (( 0 == 0 )) 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@123 -- # NOT rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@650 -- # local es=0 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:20:33.843 15:25:35 
nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@653 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a 10.0.0.2 -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey2 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.843 request: 00:20:33.843 { 00:20:33.843 "name": "nvme0", 00:20:33.843 "trtype": "rdma", 00:20:33.843 "traddr": "10.0.0.2", 00:20:33.843 "adrfam": "ipv4", 00:20:33.843 "trsvcid": "4420", 00:20:33.843 "subnqn": "nqn.2024-02.io.spdk:cnode0", 00:20:33.843 "hostnqn": "nqn.2024-02.io.spdk:host0", 00:20:33.843 "prchk_reftag": false, 00:20:33.843 "prchk_guard": false, 00:20:33.843 "hdgst": false, 00:20:33.843 "ddgst": false, 00:20:33.843 "dhchap_key": "key1", 00:20:33.843 "dhchap_ctrlr_key": "ckey2", 00:20:33.843 "allow_unrecognized_csi": false, 00:20:33.843 "method": "bdev_nvme_attach_controller", 00:20:33.843 "req_id": 1 00:20:33.843 } 00:20:33.843 Got JSON-RPC error response 00:20:33.843 response: 00:20:33.843 { 00:20:33.843 "code": -5, 00:20:33.843 "message": "Input/output error" 00:20:33.843 } 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@653 -- # es=1 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@128 -- # get_main_ns_ip 00:20:33.843 /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh: line 128: get_main_ns_ip: command not found 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@128 -- # trap - ERR 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@128 -- # print_backtrace 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1153 -- # [[ ehxBET =~ e ]] 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1155 -- # args=('--transport=rdma') 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1155 -- # local args 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1157 -- # xtrace_disable 00:20:33.843 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.843 ========== Backtrace start: ========== 00:20:33.843 00:20:33.843 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh:128 -> main(["--transport=rdma"]) 00:20:33.843 ... 
00:20:33.843 123 NOT rpc_cmd bdev_nvme_attach_controller -b nvme0 -t "$TEST_TRANSPORT" -f ipv4 \ 00:20:33.844 124 -a "$NVMF_FIRST_INITIATOR_IP" -s "$NVMF_PORT" -q "$hostnqn" -n "$subnqn" \ 00:20:33.844 125 --dhchap-key "key1" --dhchap-ctrlr-key "ckey2" 00:20:33.844 126 00:20:33.844 127 # Check reauthentication 00:20:33.844 => 128 rpc_cmd bdev_nvme_attach_controller -b nvme0 -t "$TEST_TRANSPORT" -f ipv4 \ 00:20:33.844 129 -a "$(get_main_ns_ip)" -s "$NVMF_PORT" -q "$hostnqn" -n "$subnqn" \ 00:20:33.844 130 --dhchap-key "key1" --dhchap-ctrlr-key "ckey1" --ctrlr-loss-timeout-sec 1 \ 00:20:33.844 131 --reconnect-delay-sec 1 00:20:33.844 132 nvmet_auth_set_key "sha256" "ffdhe2048" 2 00:20:33.844 133 rpc_cmd bdev_nvme_set_keys "nvme0" --dhchap-key "key2" --dhchap-ctrlr-key "ckey2" 00:20:33.844 ... 00:20:33.844 00:20:33.844 ========== Backtrace end ========== 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1194 -- # return 0 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@128 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t rdma -f ipv4 -a '' -s 4420 -q nqn.2024-02.io.spdk:host0 -n nqn.2024-02.io.spdk:cnode0 --dhchap-key key1 --dhchap-ctrlr-key ckey1 --ctrlr-loss-timeout-sec 1 --reconnect-delay-sec 1 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@561 -- # xtrace_disable 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.844 usage: rpc.py [options] bdev_nvme_attach_controller [-h] -b NAME -t TRTYPE -a 00:20:33.844 TRADDR [-f ADRFAM] 00:20:33.844 [-s TRSVCID] [-p PRIORITY] 00:20:33.844 [-n SUBNQN] [-q HOSTNQN] 00:20:33.844 [-i HOSTADDR] 00:20:33.844 [-c HOSTSVCID] [-r] [-g] 00:20:33.844 [-e] [-d] 00:20:33.844 [--fabrics-timeout FABRICS_CONNECT_TIMEOUT_US] 00:20:33.844 [-x MULTIPATH] 00:20:33.844 [--num-io-queues NUM_IO_QUEUES] 00:20:33.844 [-l CTRLR_LOSS_TIMEOUT_SEC] 00:20:33.844 [-o RECONNECT_DELAY_SEC] 00:20:33.844 [-u FAST_IO_FAIL_TIMEOUT_SEC] 00:20:33.844 [-k PSK] [-m MAX_BDEVS] 00:20:33.844 [--dhchap-key DHCHAP_KEY] 00:20:33.844 [--dhchap-ctrlr-key DHCHAP_CTRLR_KEY] 00:20:33.844 [-U] 00:20:33.844 rpc.py [options] bdev_nvme_attach_controller: error: argument -a/--traddr: expected one argument 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # trap - ERR 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@589 -- # print_backtrace 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1153 -- # [[ ehxBET =~ e ]] 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1155 -- # args=('1' '--reconnect-delay-sec' '1' '--ctrlr-loss-timeout-sec' 'ckey1' '--dhchap-ctrlr-key' 'key1' '--dhchap-key' 'nqn.2024-02.io.spdk:cnode0' '-n' 'nqn.2024-02.io.spdk:host0' '-q' '4420' '-s' '' '-a' 'ipv4' '-f' 'rdma' '-t' 'nvme0' '-b' 'bdev_nvme_attach_controller' '--transport=rdma') 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1155 -- # local args 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1157 -- # xtrace_disable 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:33.844 ========== Backtrace start: ========== 00:20:33.844 00:20:33.844 in 
/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh:589 -> rpc_cmd(["bdev_nvme_attach_controller"],["-b"],["nvme0"],["-t"],["rdma"],["-f"],["ipv4"],["-a"],[""],["-s"],["4420"],["-q"],["nqn.2024-02.io.spdk:host0"],["-n"],["nqn.2024-02.io.spdk:cnode0"],["--dhchap-key"],["key1"],["--dhchap-ctrlr-key"],["ckey1"],["--ctrlr-loss-timeout-sec"],["1"],["--reconnect-delay-sec"],["1"]) 00:20:33.844 ... 00:20:33.844 584 echo "$rsp" 00:20:33.844 585 done 00:20:33.844 586 00:20:33.844 587 rc=${!status[*]} 00:20:33.844 588 xtrace_restore 00:20:33.844 => 589 [[ $rc == 0 ]] 00:20:33.844 590 } 00:20:33.844 591 00:20:33.844 592 function rpc_cmd_simple_data_json() { 00:20:33.844 593 00:20:33.844 594 local elems="$1[@]" elem 00:20:33.844 ... 00:20:33.844 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh:128 -> main(["--transport=rdma"]) 00:20:33.844 ... 00:20:33.844 123 NOT rpc_cmd bdev_nvme_attach_controller -b nvme0 -t "$TEST_TRANSPORT" -f ipv4 \ 00:20:33.844 124 -a "$NVMF_FIRST_INITIATOR_IP" -s "$NVMF_PORT" -q "$hostnqn" -n "$subnqn" \ 00:20:33.844 125 --dhchap-key "key1" --dhchap-ctrlr-key "ckey2" 00:20:33.844 126 00:20:33.844 127 # Check reauthentication 00:20:33.844 => 128 rpc_cmd bdev_nvme_attach_controller -b nvme0 -t "$TEST_TRANSPORT" -f ipv4 \ 00:20:33.844 129 -a "$(get_main_ns_ip)" -s "$NVMF_PORT" -q "$hostnqn" -n "$subnqn" \ 00:20:33.844 130 --dhchap-key "key1" --dhchap-ctrlr-key "ckey1" --ctrlr-loss-timeout-sec 1 \ 00:20:33.844 131 --reconnect-delay-sec 1 00:20:33.844 132 nvmet_auth_set_key "sha256" "ffdhe2048" 2 00:20:33.844 133 rpc_cmd bdev_nvme_set_keys "nvme0" --dhchap-key "key2" --dhchap-ctrlr-key "ckey2" 00:20:33.844 ... 00:20:33.844 00:20:33.844 ========== Backtrace end ========== 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1194 -- # return 0 00:20:33.844 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1 -- # cat /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/nvme-auth.log 00:20:33.844 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:20:33.844 [2024-09-27 15:24:56.353114] Starting SPDK v25.01-pre git sha1 71dc0c1e9 / DPDK 24.03.0 initialization... 00:20:33.844 [2024-09-27 15:24:56.353177] [ DPDK EAL parameters: nvmf -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:33.844 [2024-09-27 15:24:56.438925] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:33.844 [2024-09-27 15:24:56.521671] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:20:33.844 [2024-09-27 15:24:56.521715] app.c: 611:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:20:33.844 [2024-09-27 15:24:56.521726] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:20:33.844 [2024-09-27 15:24:56.521735] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:20:33.844 [2024-09-27 15:24:56.521742] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
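Editor's note on the failure captured in the backtraces above: auth.sh line 128 expands "$(get_main_ns_ip)" into the -a/--traddr argument, but get_main_ns_ip is not defined in this checkout ("command not found"), so bdev_nvme_attach_controller is invoked with an empty address and rpc.py rejects it with "argument -a/--traddr: expected one argument". A minimal sketch of what such a helper could look like follows; the function body and the NVMF_TARGET_NS / NVMF_FIRST_TARGET_IP variables it reads are assumptions made for illustration only and are not taken from the SPDK tree used in this build.

    # Hypothetical sketch only, not the SPDK implementation. Assumes the helper
    # should report the target's address: the global IPv4 address inside the
    # target's network namespace when one is configured, otherwise the first
    # target IP exported by the nvmf test environment.
    get_main_ns_ip() {
        if [[ -n ${NVMF_TARGET_NS:-} ]]; then
            # Assumed variable: name of the network namespace the target runs in.
            ip netns exec "$NVMF_TARGET_NS" ip -o -4 addr show scope global \
                | awk '{print $4}' | cut -d/ -f1 | head -n1
        else
            # Assumed variable: first target IP set up by the test environment.
            echo "$NVMF_FIRST_TARGET_IP"
        fi
    }

With a helper along these lines sourced before auth.sh runs, the substitution at line 128 would yield a non-empty traddr and the reauthentication step could proceed instead of failing in argument parsing.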
00:20:33.844 [2024-09-27 15:24:56.521774] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:20:33.844 [2024-09-27 15:25:03.840865] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.844 [2024-09-27 15:25:03.840895] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.844 [2024-09-27 15:25:03.840902] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.844 [2024-09-27 15:25:03.840908] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.844 [2024-09-27 15:25:03.840914] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.844 [2024-09-27 15:25:03.840920] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.844 [2024-09-27 15:25:03.840925] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.844 [2024-09-27 15:25:03.840931] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.844 [2024-09-27 15:25:03.840936] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.844 [2024-09-27 15:25:03.840982] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.844 [2024-09-27 15:25:03.841004] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.844 ctrlr pubkey: 00:20:33.844 00000000 ca a2 10 8b 81 6f 3f 87 d6 8c c2 13 d4 b8 42 3e .....o?.......B> 00:20:33.844 00000010 9c fc 6f 05 7f 89 ed 76 b5 54 c4 57 1e e8 5c 31 ..o....v.T.W..\1 00:20:33.844 00000020 b1 65 75 a4 61 1e 43 45 2f 20 c4 1a 2b 43 da 62 .eu.a.CE/ ..+C.b 00:20:33.844 00000030 9a a1 1a e3 69 0d 23 6c 36 dc bc a4 ac 60 da d2 ....i.#l6....`.. 00:20:33.844 00000040 b5 67 57 e3 bb 57 2a cb 47 d5 21 c8 77 87 3d bb .gW..W*.G.!.w.=. 00:20:33.844 00000050 59 8b d2 a6 99 6c 9c c7 b3 8e f1 45 81 3d ee d4 Y....l.....E.=.. 00:20:33.844 00000060 60 ac 78 4e 85 21 7b 09 00 f0 f8 f6 f5 9b cd 1a `.xN.!{......... 00:20:33.844 00000070 e1 67 3a 20 39 3b c1 fa 36 af 30 25 0f 87 70 18 .g: 9;..6.0%..p. 00:20:33.844 00000080 b7 b4 30 e7 06 9b aa 7c 20 b0 de 4c 0b a1 a6 01 ..0....| ..L.... 00:20:33.844 00000090 06 b7 ee 23 78 55 37 2e 7f 16 ee 33 d5 e1 7c d5 ...#xU7....3..|. 00:20:33.844 000000a0 fd 93 24 22 4c 5a dd 2d 14 1e 5d d7 97 b8 e2 86 ..$"LZ.-..]..... 00:20:33.844 000000b0 7f 94 87 0b a3 09 88 e9 56 30 b0 36 5e ae c4 7e ........V0.6^..~ 00:20:33.844 000000c0 ae 26 e1 6d 36 14 12 43 85 fb 48 8d 7b 68 10 15 .&.m6..C..H.{h.. 00:20:33.844 000000d0 c5 50 2f 1a d8 7d 72 b1 73 0b 40 a4 52 04 4c 97 .P/..}r.s.@.R.L. 00:20:33.844 000000e0 34 08 d6 e7 26 71 88 e9 59 35 72 12 6a 83 c8 ac 4...&q..Y5r.j... 00:20:33.844 000000f0 ed a5 6a 53 8e e5 bc b3 55 66 5c f7 6d 15 67 4a ..jS....Uf\.m.gJ 00:20:33.844 host pubkey: 00:20:33.845 00000000 24 0e 98 48 f9 ae 00 e6 f9 be 9c 27 97 0b 7f 8a $..H.......'.... 
00:20:33.845 00000010 53 6d d1 b0 79 8e ed ef e5 a8 d5 67 29 47 f5 51 Sm..y......g)G.Q 00:20:33.845 00000020 96 b1 85 f6 2a bd 6b 3f cd 0a e2 59 f9 65 01 c7 ....*.k?...Y.e.. 00:20:33.845 00000030 34 f0 01 4e 1e 08 5b aa 2b 90 28 a9 e2 1f 69 e7 4..N..[.+.(...i. 00:20:33.845 00000040 45 99 31 6f 3e 40 2e 5b 62 79 3c e2 b0 c6 a2 d5 E.1o>@.[by<..... 00:20:33.845 00000050 aa 5c c3 7c 13 b8 10 dc 82 79 f5 f4 06 87 c1 ee .\.|.....y...... 00:20:33.845 00000060 6d ff 12 cf 47 95 05 d9 19 f1 5a 66 21 ca 61 3c m...G.....Zf!.a< 00:20:33.845 00000070 58 06 86 53 e1 2b bf 09 ca 9a e3 62 80 92 88 b3 X..S.+.....b.... 00:20:33.845 00000080 74 62 b0 a7 3d 21 e5 3a f9 12 b2 e4 03 de b9 1a tb..=!.:........ 00:20:33.845 00000090 a1 bd b1 84 bd 02 b3 86 ef 28 ec c5 44 69 6d 6f .........(..Dimo 00:20:33.845 000000a0 bc 80 54 81 10 78 a7 2a b5 f5 a7 17 1c 4b d3 de ..T..x.*.....K.. 00:20:33.845 000000b0 86 44 d7 6f e2 0d 23 c0 88 81 96 06 70 c3 6a 83 .D.o..#.....p.j. 00:20:33.845 000000c0 fb d8 e9 53 75 74 3c 12 55 3d ff ab 86 52 3b 56 ...Sut<.U=...R;V 00:20:33.845 000000d0 36 3d 37 4e 2e c0 88 fe 44 cd 51 7d 19 3f 0a 28 6=7N....D.Q}.?.( 00:20:33.845 000000e0 5e 2f 62 aa a7 5d 0d 0f b9 0b 31 bc 87 e5 72 c4 ^/b..]....1...r. 00:20:33.845 000000f0 2a ef eb 18 6f 47 6f 29 90 49 0a c8 18 d1 3e fe *...oGo).I....>. 00:20:33.845 dh secret: 00:20:33.845 00000000 b1 98 cc b6 95 c2 6f c9 6f fb 20 fc 7e 27 08 7c ......o.o. .~'.| 00:20:33.845 00000010 45 8f a7 7e fa fb cb 2c 91 97 44 1f be a2 b0 f9 E..~...,..D..... 00:20:33.845 00000020 ad 1c 49 8e 30 f5 1c 65 90 14 99 81 80 e0 3b 16 ..I.0..e......;. 00:20:33.845 00000030 d6 c9 72 3a 7f 8b 49 ca 80 15 47 32 20 7b 9d 87 ..r:..I...G2 {.. 00:20:33.845 00000040 59 66 8c e7 1b 33 5f f5 6b 43 ca 7d 2c 45 50 6a Yf...3_.kC.},EPj 00:20:33.845 00000050 6d fe d9 54 2d 4f 8e 0b a6 e7 49 29 29 17 72 71 m..T-O....I)).rq 00:20:33.845 00000060 8a 9f de 0e dd fd f3 1d 3b d4 c2 75 99 e8 6a 90 ........;..u..j. 00:20:33.845 00000070 81 ac 7d b0 e9 ba 07 ca 5a 9f c1 96 3d 7c aa 51 ..}.....Z...=|.Q 00:20:33.845 00000080 5d 6b 9d 86 26 7e 20 ff 59 f5 37 d1 e3 7f 85 81 ]k..&~ .Y.7..... 00:20:33.845 00000090 11 d5 8b 5f b9 27 f6 1a 24 f6 51 4c bf 07 66 92 ..._.'..$.QL..f. 00:20:33.845 000000a0 04 54 ce 07 df 0d c9 2c 30 e3 91 4c 60 86 c0 45 .T.....,0..L`..E 00:20:33.845 000000b0 70 ab f0 2b 3a 65 d1 10 85 ef fa c9 d9 85 db 40 p..+:e.........@ 00:20:33.845 000000c0 93 6e 50 f8 2a 9d cc e3 7f 78 e1 bc ea b5 b2 45 .nP.*....x.....E 00:20:33.845 000000d0 ef 33 ba bd 0d e7 e8 dc f3 5f e7 4b b0 c4 47 e2 .3......._.K..G. 00:20:33.845 000000e0 3a c1 7e b9 4d 39 61 33 e0 69 14 43 2f 8e eb 48 :.~.M9a3.i.C/..H 00:20:33.845 000000f0 4f fe 63 7d 27 b8 54 af 25 77 14 01 cc c9 e0 a4 O.c}'.T.%w...... 
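Editor's note on the dumps above: each negotiation record prints three 256-byte values, the controller's public value ("ctrlr pubkey"), the host's public value ("host pubkey"), and the shared secret both sides derive ("dh secret"), all over the ffdhe2048 group selected in the negotiation (256 bytes = 2048 bits, matching the group size). As a plain-notation reminder of how the three relate, with p and g the RFC 7919 ffdhe2048 parameters and x_C, x_H the controller's and host's private exponents (the exponent names are illustrative, not taken from the code):

    ctrlr pubkey = g^x_C mod p
    host pubkey  = g^x_H mod p
    dh secret    = (host pubkey)^x_C mod p = (ctrlr pubkey)^x_H mod p = g^(x_C * x_H) mod p

Both endpoints therefore arrive at the same "dh secret" dump without ever exchanging their private exponents, which is what the await-challenge / await-reply states above are verifying.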
00:20:33.845 [2024-09-27 15:25:03.844490] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=1, seq=3428451693, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.845 [2024-09-27 15:25:03.847288] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.845 [2024-09-27 15:25:03.847337] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.845 [2024-09-27 15:25:03.847359] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.845 [2024-09-27 15:25:03.847386] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.845 [2024-09-27 15:25:03.847397] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.845 [2024-09-27 15:25:03.954509] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.845 [2024-09-27 15:25:03.954530] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.845 [2024-09-27 15:25:03.954537] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.845 [2024-09-27 15:25:03.954546] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.845 [2024-09-27 15:25:03.954553] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.845 [2024-09-27 15:25:03.954559] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.845 [2024-09-27 15:25:03.954565] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.845 [2024-09-27 15:25:03.954570] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.845 [2024-09-27 15:25:03.954576] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.845 [2024-09-27 15:25:03.954586] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.845 [2024-09-27 15:25:03.954640] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.845 ctrlr pubkey: 00:20:33.845 00000000 ca a2 10 8b 81 6f 3f 87 d6 8c c2 13 d4 b8 42 3e .....o?.......B> 00:20:33.845 00000010 9c fc 6f 05 7f 89 ed 76 b5 54 c4 57 1e e8 5c 31 ..o....v.T.W..\1 00:20:33.845 00000020 b1 65 75 a4 61 1e 43 45 2f 20 c4 1a 2b 43 da 62 .eu.a.CE/ ..+C.b 00:20:33.845 00000030 9a a1 1a e3 69 0d 23 6c 36 dc bc a4 ac 60 da d2 ....i.#l6....`.. 00:20:33.845 00000040 b5 67 57 e3 bb 57 2a cb 47 d5 21 c8 77 87 3d bb .gW..W*.G.!.w.=. 
00:20:33.845 00000050 59 8b d2 a6 99 6c 9c c7 b3 8e f1 45 81 3d ee d4 Y....l.....E.=.. 00:20:33.845 00000060 60 ac 78 4e 85 21 7b 09 00 f0 f8 f6 f5 9b cd 1a `.xN.!{......... 00:20:33.845 00000070 e1 67 3a 20 39 3b c1 fa 36 af 30 25 0f 87 70 18 .g: 9;..6.0%..p. 00:20:33.845 00000080 b7 b4 30 e7 06 9b aa 7c 20 b0 de 4c 0b a1 a6 01 ..0....| ..L.... 00:20:33.845 00000090 06 b7 ee 23 78 55 37 2e 7f 16 ee 33 d5 e1 7c d5 ...#xU7....3..|. 00:20:33.845 000000a0 fd 93 24 22 4c 5a dd 2d 14 1e 5d d7 97 b8 e2 86 ..$"LZ.-..]..... 00:20:33.845 000000b0 7f 94 87 0b a3 09 88 e9 56 30 b0 36 5e ae c4 7e ........V0.6^..~ 00:20:33.845 000000c0 ae 26 e1 6d 36 14 12 43 85 fb 48 8d 7b 68 10 15 .&.m6..C..H.{h.. 00:20:33.845 000000d0 c5 50 2f 1a d8 7d 72 b1 73 0b 40 a4 52 04 4c 97 .P/..}r.s.@.R.L. 00:20:33.845 000000e0 34 08 d6 e7 26 71 88 e9 59 35 72 12 6a 83 c8 ac 4...&q..Y5r.j... 00:20:33.845 000000f0 ed a5 6a 53 8e e5 bc b3 55 66 5c f7 6d 15 67 4a ..jS....Uf\.m.gJ 00:20:33.845 host pubkey: 00:20:33.845 00000000 ed 9e bb 2f 27 6d 1d fa da be 55 78 d8 b7 e0 17 .../'m....Ux.... 00:20:33.845 00000010 c9 de 8f cc 9d 5c 76 e5 3b 6f 1b 85 6a 04 30 ae .....\v.;o..j.0. 00:20:33.845 00000020 b2 db d1 01 b9 31 ad 75 a7 1c 9d 71 03 ed c9 e5 .....1.u...q.... 00:20:33.845 00000030 b7 f2 2e 50 c6 aa ee a4 89 75 6f d2 19 8c bd 68 ...P.....uo....h 00:20:33.845 00000040 6d bb d2 1f 4f 73 1a bb cd 5d 58 b9 b5 59 99 cd m...Os...]X..Y.. 00:20:33.845 00000050 84 c2 c5 7e 64 07 c5 ac 15 69 29 0b a2 0c d8 1b ...~d....i)..... 00:20:33.845 00000060 60 61 83 b7 bc 34 f1 c1 29 3b 44 6b b6 6e 31 9c `a...4..);Dk.n1. 00:20:33.845 00000070 32 73 85 69 5f 1c 8b b6 89 65 81 f4 a3 3e 1c b5 2s.i_....e...>.. 00:20:33.845 00000080 40 e0 6e 63 cd e3 82 db 43 99 64 e9 fe 66 f5 d9 @.nc....C.d..f.. 00:20:33.845 00000090 a0 2c 80 59 be 0d c5 f2 89 a2 9e bc 27 a6 bf 8d .,.Y........'... 00:20:33.845 000000a0 44 3d de 93 e7 87 a2 e1 72 0e 18 e5 d0 17 79 3a D=......r.....y: 00:20:33.845 000000b0 ce bf 50 d1 42 88 f8 68 37 18 95 10 e7 7e a9 2c ..P.B..h7....~., 00:20:33.845 000000c0 9c f6 a3 fd 5b 26 af 23 98 56 b5 6b ec 63 9d 19 ....[&.#.V.k.c.. 00:20:33.845 000000d0 cb 67 27 45 24 05 d1 d9 69 95 18 be 41 fa 67 c3 .g'E$...i...A.g. 00:20:33.845 000000e0 36 3a e7 32 fe fb 5c f6 9c 70 e4 49 4a 2e d6 61 6:.2..\..p.IJ..a 00:20:33.845 000000f0 02 94 a9 b8 df e8 b8 75 a7 cf 59 51 ca 46 ef ce .......u..YQ.F.. 00:20:33.845 dh secret: 00:20:33.845 00000000 80 d9 ea 4a 8e b2 cc 8f 10 15 45 f3 ed 28 e9 ea ...J......E..(.. 00:20:33.845 00000010 80 0c 0a 38 00 05 d5 dc 91 4d 98 c3 c1 aa 7c 69 ...8.....M....|i 00:20:33.845 00000020 f5 47 11 96 d9 c8 55 fe fb 75 05 90 d4 06 f9 d6 .G....U..u...... 00:20:33.845 00000030 be bc 03 cb 9e 1e e7 e9 23 a8 84 a3 17 de 71 99 ........#.....q. 00:20:33.845 00000040 c5 68 ba 5a 2e b6 1e 74 96 8e cf 3f ed b4 a1 cb .h.Z...t...?.... 00:20:33.845 00000050 a8 5b f6 0e ac 60 fc bf 06 99 77 60 aa 90 14 1e .[...`....w`.... 00:20:33.845 00000060 fe cb b4 54 51 c4 8f af 9c f8 99 74 25 80 cb 70 ...TQ......t%..p 00:20:33.845 00000070 0a 2e b6 7f c3 19 2b ba 9f ee ec 21 1e 04 c4 76 ......+....!...v 00:20:33.845 00000080 bc e5 79 c8 56 d4 5b b9 88 ec e5 1d fc 1e 4f a4 ..y.V.[.......O. 00:20:33.845 00000090 fa 57 f3 c3 c2 96 91 f9 18 81 50 d4 c2 2a 24 d9 .W........P..*$. 00:20:33.845 000000a0 36 a2 a8 d3 7b e8 9b d5 f2 0d 9b db ec e6 9d 21 6...{..........! 00:20:33.845 000000b0 14 23 9b 90 fc c4 e0 05 b8 d4 c0 5f e4 55 6c 27 .#........._.Ul' 00:20:33.845 000000c0 8e 4c 67 aa ee a4 e6 47 bb 24 40 69 91 bc c1 bc .Lg....G.$@i.... 
00:20:33.845 000000d0 94 e0 cd 10 99 d5 83 e6 f8 1b f1 cd b8 a9 88 24 ...............$ 00:20:33.845 000000e0 51 fe d1 04 d7 83 d2 1d 08 c2 c9 fb 5e d6 5b 69 Q...........^.[i 00:20:33.845 000000f0 e4 a1 05 16 3a a1 ef 15 b4 da ef 64 3e 1d 75 1f ....:......d>.u. 00:20:33.845 [2024-09-27 15:25:03.957206] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=1, dhgroup=1, seq=3428451694, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.845 [2024-09-27 15:25:03.957306] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.845 [2024-09-27 15:25:03.966321] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.845 [2024-09-27 15:25:03.966407] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.845 [2024-09-27 15:25:03.966418] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.845 [2024-09-27 15:25:03.966457] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.845 [2024-09-27 15:25:04.127388] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.845 [2024-09-27 15:25:04.127409] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.845 [2024-09-27 15:25:04.127416] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.846 [2024-09-27 15:25:04.127462] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.846 [2024-09-27 15:25:04.127486] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.846 ctrlr pubkey: 00:20:33.846 00000000 a6 28 a8 d8 f0 5f d8 94 5f 9b 17 f5 64 06 4d c8 .(..._.._...d.M. 00:20:33.846 00000010 16 e5 26 a4 90 0c cf 63 fe f1 9f 49 b1 4d 53 35 ..&....c...I.MS5 00:20:33.846 00000020 14 2b ba 50 cc d5 d8 bb f0 9b 5e 7a ec 43 34 f5 .+.P......^z.C4. 00:20:33.846 00000030 58 c4 d3 a6 e5 cf e1 33 43 1c cd 1c b8 26 83 ee X......3C....&.. 00:20:33.846 00000040 b6 72 86 3d 7a b1 85 76 7b 35 23 4d a7 dd 37 47 .r.=z..v{5#M..7G 00:20:33.846 00000050 28 56 a7 93 ed 32 c3 f9 25 81 ad b3 be c4 7c 51 (V...2..%.....|Q 00:20:33.846 00000060 5f 1b f9 4c 3c b3 3a 4a 48 7d db fb 4d 94 d9 3d _..L<.:JH}..M..= 00:20:33.846 00000070 70 de 26 39 23 0f 46 b1 02 e1 79 69 2c cf dc 92 p.&9#.F...yi,... 00:20:33.846 00000080 34 18 5d a9 68 32 a0 ef 27 0d 97 09 0e 6c 69 20 4.].h2..'....li 00:20:33.846 00000090 9d 96 a4 be 5d eb e9 d7 36 06 7e d2 ee 79 4f 37 ....]...6.~..yO7 00:20:33.846 000000a0 ef 1d da e0 2e ab 43 e5 cc 13 14 9e bd f3 b6 17 ......C......... 00:20:33.846 000000b0 fc 9c e3 cc dd a2 8a 9c 3e 55 d8 c9 dc 31 7c 86 ........>U...1|. 00:20:33.846 000000c0 05 e1 4a 9c 58 ad 46 49 67 b0 89 b9 6e 5f 35 a8 ..J.X.FIg...n_5. 00:20:33.846 000000d0 e0 c1 d8 69 2a b1 34 9c a2 83 5d 2a b1 4e cc ee ...i*.4...]*.N.. 
00:20:33.846 000000e0 e0 c8 ef 88 a1 65 b1 e9 91 4e fd 4a 5c 91 ff e6 .....e...N.J\... 00:20:33.846 000000f0 fa bf 9e 1f 47 1c 2c ac 2a f3 68 ca 91 4a 75 9c ....G.,.*.h..Ju. 00:20:33.846 host pubkey: 00:20:33.846 00000000 2b 87 af 24 1f bf a1 5c 1a 1c 75 3c 48 75 77 32 +..$...\..u...N.... 00:20:33.846 dh secret: 00:20:33.846 00000000 68 0e 36 d5 2a 92 b3 7a 39 e4 c8 9f c4 4e dd ac h.6.*..z9....N.. 00:20:33.846 00000010 2c 56 4f c2 4b 36 a2 a7 8b 89 76 df 9d d5 9a 53 ,VO.K6....v....S 00:20:33.846 00000020 a0 af af b6 ef 2e 4b 63 18 99 97 71 b4 61 81 76 ......Kc...q.a.v 00:20:33.846 00000030 43 e1 c5 78 27 84 c1 6b bf 5f 2e 56 52 88 9e 7b C..x'..k._.VR..{ 00:20:33.846 00000040 e4 fa ba 30 a6 fd d9 f2 71 49 fc d1 84 19 de fb ...0....qI...... 00:20:33.846 00000050 45 23 d8 cb e5 08 84 2a e5 8a 69 80 30 c8 06 9e E#.....*..i.0... 00:20:33.846 00000060 db d8 e2 bd 3a a0 77 69 65 f2 9f 16 3b ac db 24 ....:.wie...;..$ 00:20:33.846 00000070 64 6b 83 dd 82 fe 11 85 3d 6d b9 50 b3 5d 8b a2 dk......=m.P.].. 00:20:33.846 00000080 ba b7 b1 61 2b c1 6d cc 6f aa 33 6b 42 f9 49 e2 ...a+.m.o.3kB.I. 00:20:33.846 00000090 6c 1c 7e 92 35 a9 cd fc 61 9d 43 2e 6b 01 a0 70 l.~.5...a.C.k..p 00:20:33.846 000000a0 cd a9 85 87 b3 44 0e 14 74 ea ba b0 60 fc 47 e1 .....D..t...`.G. 00:20:33.846 000000b0 9c c9 68 28 91 4a 5d a1 8b 44 02 05 5b 11 d8 30 ..h(.J]..D..[..0 00:20:33.846 000000c0 f1 30 12 43 f3 2a 15 bf 73 7a 01 05 f2 9d 87 26 .0.C.*..sz.....& 00:20:33.846 000000d0 9b 26 c3 a9 a3 af 19 91 a3 0b 67 58 87 21 21 78 .&........gX.!!x 00:20:33.846 000000e0 bf 19 25 09 cc 14 d0 e1 42 88 0e 44 91 52 47 7b ..%.....B..D.RG{ 00:20:33.846 000000f0 8a e4 35 b3 47 51 40 36 a7 52 7b a3 77 38 0b 16 ..5.GQ@6.R{.w8.. 00:20:33.846 [2024-09-27 15:25:04.130118] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=1, dhgroup=1, seq=3428451695, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.846 [2024-09-27 15:25:04.132747] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.846 [2024-09-27 15:25:04.132789] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.846 [2024-09-27 15:25:04.132805] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.846 [2024-09-27 15:25:04.132825] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.846 [2024-09-27 15:25:04.132839] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.846 [2024-09-27 15:25:04.239527] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.846 [2024-09-27 15:25:04.239575] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.846 [2024-09-27 15:25:04.239598] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.846 [2024-09-27 15:25:04.239631] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 
00:20:33.846 [2024-09-27 15:25:04.239743] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.846 ctrlr pubkey: 00:20:33.846 00000000 a6 28 a8 d8 f0 5f d8 94 5f 9b 17 f5 64 06 4d c8 .(..._.._...d.M. 00:20:33.846 00000010 16 e5 26 a4 90 0c cf 63 fe f1 9f 49 b1 4d 53 35 ..&....c...I.MS5 00:20:33.846 00000020 14 2b ba 50 cc d5 d8 bb f0 9b 5e 7a ec 43 34 f5 .+.P......^z.C4. 00:20:33.846 00000030 58 c4 d3 a6 e5 cf e1 33 43 1c cd 1c b8 26 83 ee X......3C....&.. 00:20:33.846 00000040 b6 72 86 3d 7a b1 85 76 7b 35 23 4d a7 dd 37 47 .r.=z..v{5#M..7G 00:20:33.846 00000050 28 56 a7 93 ed 32 c3 f9 25 81 ad b3 be c4 7c 51 (V...2..%.....|Q 00:20:33.846 00000060 5f 1b f9 4c 3c b3 3a 4a 48 7d db fb 4d 94 d9 3d _..L<.:JH}..M..= 00:20:33.846 00000070 70 de 26 39 23 0f 46 b1 02 e1 79 69 2c cf dc 92 p.&9#.F...yi,... 00:20:33.846 00000080 34 18 5d a9 68 32 a0 ef 27 0d 97 09 0e 6c 69 20 4.].h2..'....li 00:20:33.846 00000090 9d 96 a4 be 5d eb e9 d7 36 06 7e d2 ee 79 4f 37 ....]...6.~..yO7 00:20:33.846 000000a0 ef 1d da e0 2e ab 43 e5 cc 13 14 9e bd f3 b6 17 ......C......... 00:20:33.846 000000b0 fc 9c e3 cc dd a2 8a 9c 3e 55 d8 c9 dc 31 7c 86 ........>U...1|. 00:20:33.846 000000c0 05 e1 4a 9c 58 ad 46 49 67 b0 89 b9 6e 5f 35 a8 ..J.X.FIg...n_5. 00:20:33.846 000000d0 e0 c1 d8 69 2a b1 34 9c a2 83 5d 2a b1 4e cc ee ...i*.4...]*.N.. 00:20:33.846 000000e0 e0 c8 ef 88 a1 65 b1 e9 91 4e fd 4a 5c 91 ff e6 .....e...N.J\... 00:20:33.846 000000f0 fa bf 9e 1f 47 1c 2c ac 2a f3 68 ca 91 4a 75 9c ....G.,.*.h..Ju. 00:20:33.846 host pubkey: 00:20:33.846 00000000 c3 ec fb 57 e1 5a 63 28 81 a0 82 61 0d 14 ad 71 ...W.Zc(...a...q 00:20:33.846 00000010 65 a9 f4 63 1a 55 13 e8 44 8c a1 2a 3c c6 ad 32 e..c.U..D..*<..2 00:20:33.846 00000020 20 8a fe aa c7 de b2 44 3f fb b3 85 e2 6f 85 29 ......D?....o.) 00:20:33.846 00000030 34 fd c6 fb 58 5d 09 78 59 9f cc 1a 5f d1 72 7e 4...X].xY..._.r~ 00:20:33.846 00000040 0e b2 ea 5a ae c6 6d e5 a6 64 70 fc a4 3a 25 ce ...Z..m..dp..:%. 00:20:33.846 00000050 43 06 81 6a 74 07 53 96 b5 f5 c6 50 03 f4 79 1b C..jt.S....P..y. 00:20:33.846 00000060 cf 19 0f 60 d6 5c fd f6 a3 c2 b5 e8 d4 70 07 f6 ...`.\.......p.. 00:20:33.846 00000070 7f 28 6d e2 95 16 22 dd 8b ab e2 ab 8e fe 3b 10 .(m...".......;. 00:20:33.846 00000080 47 4f 3b 2f 19 03 14 47 62 99 82 13 8d dd fa 11 GO;/...Gb....... 00:20:33.846 00000090 87 d2 1a ad ff 28 39 c8 b7 ee 57 08 39 1d bf e0 .....(9...W.9... 00:20:33.846 000000a0 c0 e1 bc 02 39 20 bc 72 c2 0e f2 51 15 fb 4e f4 ....9 .r...Q..N. 00:20:33.846 000000b0 76 35 35 3c 82 f4 32 4e 0a ba a6 47 89 8e ae 00 v55<..2N...G.... 00:20:33.846 000000c0 84 3d 64 f5 ee a7 2c 78 98 53 88 57 93 e6 e4 ca .=d...,x.S.W.... 00:20:33.846 000000d0 a9 87 dd bb 2b 55 d4 57 33 0e fb 07 4d 68 d3 fc ....+U.W3...Mh.. 00:20:33.846 000000e0 06 50 35 5b 8e 06 e6 6c 8d 83 40 97 9e 3b c7 35 .P5[...l..@..;.5 00:20:33.846 000000f0 8d 10 92 aa e7 b7 35 66 6f 16 ae 37 42 01 86 e0 ......5fo..7B... 00:20:33.846 dh secret: 00:20:33.846 00000000 77 ad ee 09 a7 9c b9 f5 cb 0c 10 d3 3f b7 e0 64 w...........?..d 00:20:33.846 00000010 dc 01 ed 84 21 fa 8b c1 15 7c 95 49 a5 72 c9 74 ....!....|.I.r.t 00:20:33.846 00000020 16 98 7c 59 8a 7a 70 63 a1 0f 48 47 0a 54 aa 9d ..|Y.zpc..HG.T.. 00:20:33.846 00000030 ea af b3 50 2c a3 f1 70 07 f6 7b 57 47 48 83 03 ...P,..p..{WGH.. 00:20:33.846 00000040 42 a3 44 e8 3a 4b 05 84 ac bd 0e a4 3f 88 29 f4 B.D.:K......?.). 00:20:33.846 00000050 24 0d db 0c ec 57 87 45 ff 21 a5 f8 61 41 01 80 $....W.E.!..aA.. 
00:20:33.846 00000060 f9 dd f3 fd 14 61 17 57 a1 3d 66 a7 fa c7 d1 5b .....a.W.=f....[ 00:20:33.846 00000070 9b 3f 2c 75 c0 e8 14 4c 0c a6 cf 39 5b 2a 3d 7e .?,u...L...9[*=~ 00:20:33.846 00000080 ff de 60 27 26 cb a6 4f 92 50 e5 78 3b e0 d8 39 ..`'&..O.P.x;..9 00:20:33.846 00000090 c2 70 fc c9 4a 6d 34 ed 6f 98 5e cb eb 92 53 6d .p..Jm4.o.^...Sm 00:20:33.846 000000a0 90 e2 05 0d 89 70 a1 10 64 42 f5 d1 bc f2 04 b5 .....p..dB...... 00:20:33.846 000000b0 b7 3a 1a 85 09 c8 7a c0 b3 b1 63 b4 16 1c 0f c2 .:....z...c..... 00:20:33.846 000000c0 76 3a 68 b4 cd b1 20 60 47 64 00 48 f9 4c 7f 94 v:h... `Gd.H.L.. 00:20:33.846 000000d0 70 67 1f 3d 1b 19 c0 3d ce 1f 85 a1 65 b8 2a 91 pg.=...=....e.*. 00:20:33.846 000000e0 26 e8 89 8e ad cf 77 a2 f8 90 94 20 62 e0 17 12 &.....w.... b... 00:20:33.846 000000f0 92 99 0c 16 1f b7 57 b4 c8 45 1f 43 5a fa f7 82 ......W..E.CZ... 00:20:33.846 [2024-09-27 15:25:04.242475] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=1, dhgroup=1, seq=3428451696, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.846 [2024-09-27 15:25:04.242582] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.846 [2024-09-27 15:25:04.251580] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.846 [2024-09-27 15:25:04.251659] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.847 [2024-09-27 15:25:04.251668] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.847 [2024-09-27 15:25:04.251707] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.847 [2024-09-27 15:25:04.408730] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.847 [2024-09-27 15:25:04.408749] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.847 [2024-09-27 15:25:04.408756] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.847 [2024-09-27 15:25:04.408801] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.847 [2024-09-27 15:25:04.408823] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.847 ctrlr pubkey: 00:20:33.847 00000000 d9 81 9b bf d2 23 10 c0 20 1e 4c 33 ee fd cc 88 .....#.. .L3.... 00:20:33.847 00000010 3e 52 84 fa b7 82 7c ce 86 20 0b 8c dc e6 a6 93 >R....|.. ...... 00:20:33.847 00000020 90 c3 50 aa e9 b9 38 b0 22 9a a4 24 05 f4 62 c1 ..P...8."..$..b. 00:20:33.847 00000030 61 48 ad a9 df e3 df d4 eb dc ac 7b 02 77 12 27 aH.........{.w.' 00:20:33.847 00000040 5d a1 58 8d 14 6e ef 2b 4f 40 78 c6 db ba f5 48 ].X..n.+O@x....H 00:20:33.847 00000050 09 f9 d5 b9 ef 25 b8 ff 5e b1 a7 a5 89 a3 81 f5 .....%..^....... 00:20:33.847 00000060 ad 24 f9 76 df e2 cd 06 7b f8 3d c6 ad 1d 71 eb .$.v....{.=...q. 
00:20:33.847 00000070 e5 75 7f 5a 5f 13 18 17 88 e2 cd 72 9f 4c e5 9c .u.Z_......r.L.. 00:20:33.847 00000080 f9 f6 2a a2 36 31 04 b2 24 cf c0 0b f7 56 bf 1e ..*.61..$....V.. 00:20:33.847 00000090 c6 6e 1a 78 dc ce ab 0f 67 ef ef 53 05 86 d3 aa .n.x....g..S.... 00:20:33.847 000000a0 c7 72 81 63 f5 7b 9f 6a 97 2c ee 93 93 30 2d 87 .r.c.{.j.,...0-. 00:20:33.847 000000b0 a1 3a 75 f9 b0 ff 42 1e 73 d8 12 f8 da a3 fd fe .:u...B.s....... 00:20:33.847 000000c0 1f 31 d0 03 b3 32 86 51 95 4e da 02 dc 4b 2d 19 .1...2.Q.N...K-. 00:20:33.847 000000d0 ab c2 72 99 04 ae f9 6f 7a f8 32 34 f0 90 71 fb ..r....oz.24..q. 00:20:33.847 000000e0 d9 10 ef 4a ee 33 3c 4d 40 e7 17 3c 6d 55 23 41 ...J.3f.i.n).s.....Y. 00:20:33.847 [2024-09-27 15:25:04.411426] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=1, seq=3428451697, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.847 [2024-09-27 15:25:04.414132] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.847 [2024-09-27 15:25:04.414174] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.847 [2024-09-27 15:25:04.414191] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.847 [2024-09-27 15:25:04.414216] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.847 [2024-09-27 15:25:04.414227] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.847 [2024-09-27 15:25:04.520991] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.847 [2024-09-27 15:25:04.521009] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.847 [2024-09-27 15:25:04.521016] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.847 [2024-09-27 15:25:04.521026] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.847 [2024-09-27 15:25:04.521083] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.847 ctrlr pubkey: 00:20:33.847 00000000 d9 81 9b bf d2 23 10 c0 20 1e 4c 33 ee fd cc 88 .....#.. .L3.... 00:20:33.847 00000010 3e 52 84 fa b7 82 7c ce 86 20 0b 8c dc e6 a6 93 >R....|.. ...... 00:20:33.847 00000020 90 c3 50 aa e9 b9 38 b0 22 9a a4 24 05 f4 62 c1 ..P...8."..$..b. 00:20:33.847 00000030 61 48 ad a9 df e3 df d4 eb dc ac 7b 02 77 12 27 aH.........{.w.' 00:20:33.847 00000040 5d a1 58 8d 14 6e ef 2b 4f 40 78 c6 db ba f5 48 ].X..n.+O@x....H 00:20:33.847 00000050 09 f9 d5 b9 ef 25 b8 ff 5e b1 a7 a5 89 a3 81 f5 .....%..^....... 00:20:33.847 00000060 ad 24 f9 76 df e2 cd 06 7b f8 3d c6 ad 1d 71 eb .$.v....{.=...q. 00:20:33.847 00000070 e5 75 7f 5a 5f 13 18 17 88 e2 cd 72 9f 4c e5 9c .u.Z_......r.L.. 00:20:33.847 00000080 f9 f6 2a a2 36 31 04 b2 24 cf c0 0b f7 56 bf 1e ..*.61..$....V.. 
00:20:33.847 00000090 c6 6e 1a 78 dc ce ab 0f 67 ef ef 53 05 86 d3 aa .n.x....g..S.... 00:20:33.847 000000a0 c7 72 81 63 f5 7b 9f 6a 97 2c ee 93 93 30 2d 87 .r.c.{.j.,...0-. 00:20:33.847 000000b0 a1 3a 75 f9 b0 ff 42 1e 73 d8 12 f8 da a3 fd fe .:u...B.s....... 00:20:33.847 000000c0 1f 31 d0 03 b3 32 86 51 95 4e da 02 dc 4b 2d 19 .1...2.Q.N...K-. 00:20:33.847 000000d0 ab c2 72 99 04 ae f9 6f 7a f8 32 34 f0 90 71 fb ..r....oz.24..q. 00:20:33.847 000000e0 d9 10 ef 4a ee 33 3c 4d 40 e7 17 3c 6d 55 23 41 ...J.3.. 00:20:33.848 00000040 6e f1 5d 82 63 9e ca bd 7b 4f 56 66 83 eb c9 31 n.].c...{OVf...1 00:20:33.848 00000050 54 f4 d9 2b 83 8e f8 78 fe 54 67 a8 20 af 71 2f T..+...x.Tg. .q/ 00:20:33.848 00000060 93 61 d0 30 ef a0 54 a1 fb e9 d4 4f 1d 92 c3 7a .a.0..T....O...z 00:20:33.848 00000070 a2 70 64 25 e5 e4 22 88 28 8e 18 e3 bd 66 25 b3 .pd%..".(....f%. 00:20:33.848 00000080 23 04 21 4f 90 82 82 5a f3 18 b9 c4 73 5e 96 b4 #.!O...Z....s^.. 00:20:33.848 00000090 dd da af 71 e0 76 02 c1 22 7e 80 ca 68 bf f2 6a ...q.v.."~..h..j 00:20:33.848 000000a0 ab 1d bb d6 14 04 5b 4c b7 11 8a d1 b5 a7 56 e1 ......[L......V. 00:20:33.848 000000b0 e5 4e 2e bf f3 f9 71 0a 1d 92 d2 2d fd 90 c8 46 .N....q....-...F 00:20:33.848 000000c0 02 7f 8d 92 20 d2 c9 dc 87 14 3c fb 1c 26 c3 0f .... .....<..&.. 00:20:33.848 000000d0 3b 2f df 69 bb 82 2d 05 a0 25 d6 17 90 0b 29 47 ;/.i..-..%....)G 00:20:33.848 000000e0 00 db 8a 75 cb d7 42 49 56 ff 40 da 9e fa ff 19 ...u..BIV.@..... 00:20:33.848 000000f0 d1 10 25 49 ce 6e ca ff 4a a6 ab e4 98 18 58 1e ..%I.n..J.....X. 00:20:33.848 host pubkey: 00:20:33.848 00000000 2e 84 63 1e ed d0 63 61 3a 1e a2 87 78 87 a9 52 ..c...ca:...x..R 00:20:33.848 00000010 33 63 83 01 08 04 f3 00 ca 8f c9 10 4d ec 54 90 3c..........M.T. 00:20:33.848 00000020 ef 02 ba bf 1b cd 41 15 63 c5 3b d7 ab e5 e2 76 ......A.c.;....v 00:20:33.848 00000030 dd 2a fd ff 54 27 8a 9e 85 b6 4b 2a 84 ec 4a 07 .*..T'....K*..J. 00:20:33.848 00000040 01 ed 93 f5 0b 2a 02 4d 14 f9 86 ae 0b 72 07 ff .....*.M.....r.. 00:20:33.848 00000050 e0 80 9e fd cd 82 ff 34 47 f4 b1 2d f1 4c a1 65 .......4G..-.L.e 00:20:33.848 00000060 b4 f5 4e b5 9d b7 6f 60 4d dd 12 61 7b d3 d7 ca ..N...o`M..a{... 00:20:33.848 00000070 a9 21 b8 d2 a7 2d 91 03 a8 8f de 50 31 b5 b6 bf .!...-.....P1... 00:20:33.848 00000080 43 c7 4d 04 f2 89 a0 6b 56 47 df c8 51 67 d7 90 C.M....kVG..Qg.. 00:20:33.848 00000090 5c b8 91 6b 84 72 96 8e bf bf f6 09 17 7d 30 85 \..k.r.......}0. 00:20:33.848 000000a0 58 6a ee ea 66 a7 29 8f ee 04 85 48 e2 95 52 b6 Xj..f.)....H..R. 00:20:33.848 000000b0 3e dd 27 85 28 a4 21 9b 5e a2 b5 bc bc 8e 58 34 >.'.(.!.^.....X4 00:20:33.848 000000c0 54 ad 9e 02 dc 2b 02 8c db d7 6a bf 0c 16 84 4f T....+....j....O 00:20:33.848 000000d0 e3 38 38 bb 9a 46 c6 d4 88 37 86 3d ef d3 f8 f7 .88..F...7.=.... 00:20:33.848 000000e0 ec e1 69 94 5a 33 a6 84 ac 50 1b 06 7f 33 ef 90 ..i.Z3...P...3.. 00:20:33.848 000000f0 cf ef 2e 8a 55 f2 f9 18 46 ec 62 a3 91 ff b8 af ....U...F.b..... 00:20:33.848 dh secret: 00:20:33.848 00000000 49 5a 71 b8 d8 6a 1a 69 40 e1 36 06 41 06 c3 1b IZq..j.i@.6.A... 00:20:33.848 00000010 38 30 c1 15 0f a1 a9 66 93 30 c1 ea 6e 47 79 c1 80.....f.0..nGy. 00:20:33.848 00000020 b3 30 6a 18 21 d6 29 cd 03 09 c4 db 6a d9 f0 d9 .0j.!.).....j... 00:20:33.848 00000030 e0 a8 82 50 63 d2 69 d6 a7 b4 02 af 71 b4 16 7c ...Pc.i.....q..| 00:20:33.848 00000040 c6 a3 2c 42 d5 22 0f 2c c9 21 55 81 46 8b 17 74 ..,B.".,.!U.F..t 00:20:33.848 00000050 72 53 78 68 58 df 5b fd b2 f6 9c 3a e4 6b bc cb rSxhX.[....:.k.. 
00:20:33.848 00000060 dd 99 e8 17 7d 04 d1 f6 b3 0d 12 f4 47 92 cf 3a ....}.......G..: 00:20:33.848 00000070 19 86 8e fa 32 f5 bb cf 90 e2 c7 8b 0b 0f da 77 ....2..........w 00:20:33.848 00000080 5d 96 d2 83 83 31 ea a3 ef 1e 82 92 32 68 d8 56 ]....1......2h.V 00:20:33.848 00000090 4e c0 0c 41 b5 23 13 90 69 fb 9d 38 4a d5 91 8f N..A.#..i..8J... 00:20:33.848 000000a0 32 18 87 4f 54 62 d2 ce 42 dc 2b 60 b4 56 7b 82 2..OTb..B.+`.V{. 00:20:33.848 000000b0 c3 99 45 08 17 ca 2d 07 6e a9 cd 7f 80 52 67 d7 ..E...-.n....Rg. 00:20:33.848 000000c0 7a 02 13 74 09 29 20 d2 39 e1 8e 43 e1 9d 04 e4 z..t.) .9..C.... 00:20:33.848 000000d0 78 70 5c 59 ff 6c 25 59 c0 3f b3 2d f1 e5 68 33 xp\Y.l%Y.?.-..h3 00:20:33.848 000000e0 44 a0 c1 2e c9 32 b2 3c 19 aa 29 9e b1 19 89 5d D....2.<..)....] 00:20:33.848 000000f0 9b 82 c8 92 f2 3c f9 15 2c ea f9 3d 23 e4 6d f4 .....<..,..=#.m. 00:20:33.848 [2024-09-27 15:25:04.690854] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=1, dhgroup=1, seq=3428451699, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.848 [2024-09-27 15:25:04.693592] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.848 [2024-09-27 15:25:04.693631] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.848 [2024-09-27 15:25:04.693647] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.848 [2024-09-27 15:25:04.693672] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.848 [2024-09-27 15:25:04.693683] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.848 [2024-09-27 15:25:04.799518] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.848 [2024-09-27 15:25:04.799535] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.848 [2024-09-27 15:25:04.799542] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.848 [2024-09-27 15:25:04.799552] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.848 [2024-09-27 15:25:04.799606] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.848 ctrlr pubkey: 00:20:33.848 00000000 81 6d cd bc 40 c8 fc e6 11 81 c8 28 e7 60 97 3d .m..@......(.`.= 00:20:33.848 00000010 d0 31 10 dd c0 16 7d bb ef 08 db 74 fa b0 db a6 .1....}....t.... 00:20:33.848 00000020 9d 84 39 f6 dc a1 dc 54 4c 7d 34 85 ff d9 a2 47 ..9....TL}4....G 00:20:33.848 00000030 be 9d 77 44 fa 93 ce 5d 36 c9 ea b5 6d 3e df 8a ..wD...]6...m>.. 00:20:33.848 00000040 6e f1 5d 82 63 9e ca bd 7b 4f 56 66 83 eb c9 31 n.].c...{OVf...1 00:20:33.848 00000050 54 f4 d9 2b 83 8e f8 78 fe 54 67 a8 20 af 71 2f T..+...x.Tg. 
.q/ 00:20:33.848 00000060 93 61 d0 30 ef a0 54 a1 fb e9 d4 4f 1d 92 c3 7a .a.0..T....O...z 00:20:33.848 00000070 a2 70 64 25 e5 e4 22 88 28 8e 18 e3 bd 66 25 b3 .pd%..".(....f%. 00:20:33.849 00000080 23 04 21 4f 90 82 82 5a f3 18 b9 c4 73 5e 96 b4 #.!O...Z....s^.. 00:20:33.849 00000090 dd da af 71 e0 76 02 c1 22 7e 80 ca 68 bf f2 6a ...q.v.."~..h..j 00:20:33.849 000000a0 ab 1d bb d6 14 04 5b 4c b7 11 8a d1 b5 a7 56 e1 ......[L......V. 00:20:33.849 000000b0 e5 4e 2e bf f3 f9 71 0a 1d 92 d2 2d fd 90 c8 46 .N....q....-...F 00:20:33.849 000000c0 02 7f 8d 92 20 d2 c9 dc 87 14 3c fb 1c 26 c3 0f .... .....<..&.. 00:20:33.849 000000d0 3b 2f df 69 bb 82 2d 05 a0 25 d6 17 90 0b 29 47 ;/.i..-..%....)G 00:20:33.849 000000e0 00 db 8a 75 cb d7 42 49 56 ff 40 da 9e fa ff 19 ...u..BIV.@..... 00:20:33.849 000000f0 d1 10 25 49 ce 6e ca ff 4a a6 ab e4 98 18 58 1e ..%I.n..J.....X. 00:20:33.849 host pubkey: 00:20:33.849 00000000 b8 0a 82 2d 7a db 73 26 9e 72 47 5b c3 81 7d 08 ...-z.s&.rG[..}. 00:20:33.849 00000010 07 d6 47 1e 9a a9 8b 22 ba 64 b5 d1 6f ac 34 46 ..G....".d..o.4F 00:20:33.849 00000020 c0 0f 60 45 1f 78 7f 1e fa a8 a9 2d 2d 89 db 32 ..`E.x.....--..2 00:20:33.849 00000030 eb 14 1a 48 15 bb be 4a b9 42 ed 8e 58 6a dc a6 ...H...J.B..Xj.. 00:20:33.849 00000040 80 98 52 c1 c8 91 2a ce 3d 12 18 9c 65 95 d1 80 ..R...*.=...e... 00:20:33.849 00000050 17 79 80 9c 4a 9d 48 45 dd d5 31 02 e6 c3 9d d8 .y..J.HE..1..... 00:20:33.849 00000060 12 67 ec e8 46 16 65 ea d0 13 6a 94 4f 0d c9 da .g..F.e...j.O... 00:20:33.849 00000070 af ba 7e 15 94 af 39 a4 d8 f5 14 37 29 19 f3 68 ..~...9....7)..h 00:20:33.849 00000080 be 78 86 b4 e6 24 ee 2c b9 df c4 06 9c 7a 96 af .x...$.,.....z.. 00:20:33.849 00000090 a0 6e b1 dd 46 b4 5e 78 42 94 6f cf 9c 13 cd 60 .n..F.^xB.o....` 00:20:33.849 000000a0 04 91 30 54 96 9b 8b 47 81 c2 44 0a 74 5a 74 4c ..0T...G..D.tZtL 00:20:33.849 000000b0 c8 f9 c2 f9 d7 b3 d0 8e 5f c7 ef 0c a9 f9 94 56 ........_......V 00:20:33.849 000000c0 cf d7 8b c2 1c 66 33 70 e4 0b 10 f4 86 f5 7d 48 .....f3p......}H 00:20:33.849 000000d0 94 27 2b ee 09 b6 29 a9 36 d5 7f d5 43 61 1b 32 .'+...).6...Ca.2 00:20:33.849 000000e0 51 4a b9 e2 da fa 45 c0 99 5b c7 0e 2d 94 6c 93 QJ....E..[..-.l. 00:20:33.849 000000f0 bb 4f 70 35 ce 2b ba bd 95 8b db c2 d4 c0 f9 af .Op5.+.......... 00:20:33.849 dh secret: 00:20:33.849 00000000 8c 10 4f b0 fc 94 10 17 97 0c e0 b4 ca dd 9c 7f ..O............. 00:20:33.849 00000010 86 73 25 d8 8e 19 4e 43 c3 d0 3a 83 f8 23 af fb .s%...NC..:..#.. 00:20:33.849 00000020 f4 b8 9d c0 1c 13 a1 63 a6 47 1b 65 e9 48 27 c2 .......c.G.e.H'. 00:20:33.849 00000030 ec 85 ad 4f dd cf 2d f8 a1 3e fd d0 09 e7 1b bc ...O..-..>...... 00:20:33.849 00000040 8e 03 e7 c7 0b 3a da 69 b1 89 84 c9 f5 f5 af 61 .....:.i.......a 00:20:33.849 00000050 34 7f 77 ef 79 7e 43 3c a8 89 d6 05 f6 e8 3c 74 4.w.y~C<........../.. 00:20:33.849 000000d0 b4 10 1a ea 94 89 a3 16 2b d4 38 6e ee 87 f0 15 ........+.8n.... 00:20:33.849 000000e0 a7 4d 7b 3a 0f 2f 0b 58 11 79 62 cb 5e 8a 50 d5 .M{:./.X.yb.^.P. 00:20:33.849 000000f0 d7 31 81 ff a7 d2 04 24 a9 57 5c c3 6a e4 92 29 .1.....$.W\.j..) 
00:20:33.849 [2024-09-27 15:25:04.802374] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=1, dhgroup=1, seq=3428451700, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.849 [2024-09-27 15:25:04.802470] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.849 [2024-09-27 15:25:04.811615] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.849 [2024-09-27 15:25:04.811688] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.849 [2024-09-27 15:25:04.811698] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.849 [2024-09-27 15:25:04.811737] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.849 [2024-09-27 15:25:04.963012] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.849 [2024-09-27 15:25:04.963031] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.849 [2024-09-27 15:25:04.963038] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.849 [2024-09-27 15:25:04.963083] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.849 [2024-09-27 15:25:04.963106] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.849 ctrlr pubkey: 00:20:33.849 00000000 d4 a4 c7 18 74 e8 fb b6 96 38 cd 90 49 a9 86 db ....t....8..I... 00:20:33.849 00000010 af a5 bd ae 24 3b 11 bc 91 f3 d7 72 24 1a 68 73 ....$;.....r$.hs 00:20:33.849 00000020 56 44 77 0b 4d 46 63 10 7f b4 0f 44 de 1a 38 fd VDw.MFc....D..8. 00:20:33.849 00000030 98 71 f7 f1 42 52 51 f1 2f 55 49 de 7a 4a 95 1d .q..BRQ./UI.zJ.. 00:20:33.849 00000040 c8 28 ac b1 88 8d ce ec ba fb 73 a7 69 3c 8c 77 .(........s.i<.w 00:20:33.849 00000050 71 d4 16 03 59 2b f1 4a d2 d7 6d 71 4f 9c e7 f5 q...Y+.J..mqO... 00:20:33.849 00000060 aa 2b 57 bc 40 61 d3 0c 2f 06 40 e6 a0 a6 01 9b .+W.@a../.@..... 00:20:33.849 00000070 fb e4 54 cd f3 98 11 5a 0b cb 85 1b f2 df 7e 5a ..T....Z......~Z 00:20:33.849 00000080 f4 21 aa b8 b7 b1 3d b1 46 3d e7 f6 c8 88 b5 f2 .!....=.F=...... 00:20:33.849 00000090 48 40 92 2b a7 22 2b 62 92 55 48 ae 9d 49 b8 0f H@.+."+b.UH..I.. 00:20:33.849 000000a0 18 3c 33 a9 6b 2a 72 81 ec c6 52 4f a8 07 11 00 .<3.k*r...RO.... 00:20:33.849 000000b0 52 7b 76 71 59 5c e4 c6 0f 13 a5 1b 03 ed 7c 49 R{vqY\........|I 00:20:33.849 000000c0 21 06 ba ca f8 92 0f 2e bb 6d 4e c1 2f 10 71 41 !........mN./.qA 00:20:33.849 000000d0 41 12 5b a6 b5 b5 40 69 e4 6d f1 5c 7c 13 e7 67 A.[...@i.m.\|..g 00:20:33.849 000000e0 b9 53 3f 9c 53 c3 8f e0 e3 7f ac 4b ff ed cb 22 .S?.S......K..." 00:20:33.849 000000f0 52 82 6f ae c1 7b 97 d3 48 47 e3 63 23 4e cd f0 R.o..{..HG.c#N.. 
00:20:33.849 host pubkey: 00:20:33.849 00000000 71 55 97 95 9a 98 a7 99 c9 80 c9 95 ed bb 22 70 qU............"p 00:20:33.849 00000010 c3 5a 85 fe 05 2c e5 f0 87 d4 65 8e 15 0c 34 21 .Z...,....e...4! 00:20:33.849 00000020 ed 87 90 f5 88 65 fc 1e 56 73 92 57 37 c5 54 57 .....e..Vs.W7.TW 00:20:33.849 00000030 11 a0 1c 95 07 c3 25 39 d5 6c d5 76 d5 78 80 c2 ......%9.l.v.x.. 00:20:33.849 00000040 85 e3 8e 27 eb cd fb 5e 60 e2 15 76 82 31 6c b7 ...'...^`..v.1l. 00:20:33.849 00000050 92 43 98 88 7d ba 0c be 22 4c ed 7f 5d 9d 41 84 .C..}..."L..].A. 00:20:33.849 00000060 2d bf c9 a6 e9 a7 d5 2c 1c e3 2f 2f ff ec e5 64 -......,..//...d 00:20:33.849 00000070 25 c2 23 f2 8f 77 de 12 97 1f cf a5 7f 6b d5 7e %.#..w.......k.~ 00:20:33.849 00000080 99 ca 55 7a 1f 21 bc 2d 9c 15 5c da 0e 2f 44 d7 ..Uz.!.-..\../D. 00:20:33.849 00000090 48 f8 f2 c4 30 7e 59 cb d5 e5 e8 eb 5d 8c 73 31 H...0~Y.....].s1 00:20:33.849 000000a0 98 7f af 3a 65 93 b9 c4 2d 88 42 89 b9 64 42 bb ...:e...-.B..dB. 00:20:33.849 000000b0 63 61 dc e9 eb 7c 3b d4 d4 a7 6a 7c 50 93 71 47 ca...|;...j|P.qG 00:20:33.849 000000c0 37 d8 28 a7 b0 4d 62 98 4c 12 12 03 9d 15 fc 70 7.(..Mb.L......p 00:20:33.849 000000d0 53 dc 7c 41 62 8c 60 e7 7b 17 7a ce c7 7a dd 51 S.|Ab.`.{.z..z.Q 00:20:33.849 000000e0 3d 69 b0 b9 60 a0 77 90 97 88 67 f0 ef 8d 0c 17 =i..`.w...g..... 00:20:33.849 000000f0 dd 0b 86 de c1 d8 d4 5c fd 79 25 ae 37 9c 87 c1 .......\.y%.7... 00:20:33.849 dh secret: 00:20:33.849 00000000 f9 ec 1d 6a f9 8e 4b 1c b6 b9 27 30 4e 35 a6 ac ...j..K...'0N5.. 00:20:33.849 00000010 cd 82 28 3d 0e 29 a5 a6 98 bf a6 bb b5 4f aa 9a ..(=.).......O.. 00:20:33.849 00000020 17 53 a9 91 cf bf 5b 7b b0 0c 7c c0 ed e9 60 1b .S....[{..|...`. 00:20:33.849 00000030 68 f3 99 6f 52 13 ee 59 c6 e6 d7 19 f2 b1 33 3c h..oR..Y......3< 00:20:33.849 00000040 b5 31 01 86 42 d9 27 78 ba 3e d6 73 d1 06 52 f0 .1..B.'x.>.s..R. 00:20:33.849 00000050 fe b0 45 b5 91 43 97 cc d0 6d e5 df a0 ed e2 04 ..E..C...m...... 00:20:33.849 00000060 da c3 0a 41 2e 76 5d 54 6d ca f8 bf b8 01 3a 44 ...A.v]Tm.....:D 00:20:33.849 00000070 16 1c 3b 7e ac f0 8f 82 b5 a6 55 84 2b 19 74 8e ..;~......U.+.t. 00:20:33.849 00000080 71 d0 d3 72 17 70 82 0b e7 43 f6 6b da c0 48 87 q..r.p...C.k..H. 00:20:33.849 00000090 61 7a c4 22 64 64 1e d8 46 e9 7a 55 97 a8 53 af az."dd..F.zU..S. 00:20:33.849 000000a0 ea f0 20 d8 c6 2c 0a c7 c7 15 8c 40 ba b6 79 18 .. ..,.....@..y. 00:20:33.849 000000b0 d3 cd 93 22 a7 15 5d 75 a9 54 b5 c5 c6 77 01 2c ..."..]u.T...w., 00:20:33.849 000000c0 ee a1 1b 87 81 63 55 f0 bb f6 fa 41 ab 51 4d 57 .....cU....A.QMW 00:20:33.849 000000d0 34 b4 6b eb 49 eb f6 19 d4 54 0d cb 3e ee 04 24 4.k.I....T..>..$ 00:20:33.849 000000e0 89 2b a8 4f 89 20 80 cc 48 06 38 f3 e4 d3 99 05 .+.O. ..H.8..... 00:20:33.849 000000f0 7a ff 40 70 3f 8a 86 63 6f 39 3e ef c0 c3 15 95 z.@p?..co9>..... 
00:20:33.849 [2024-09-27 15:25:04.965757] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=1, dhgroup=1, seq=3428451701, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.849 [2024-09-27 15:25:04.968610] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.849 [2024-09-27 15:25:04.968648] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.849 [2024-09-27 15:25:04.968666] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.849 [2024-09-27 15:25:04.968695] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.849 [2024-09-27 15:25:04.968706] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.849 [2024-09-27 15:25:05.074945] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.849 [2024-09-27 15:25:05.074962] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.849 [2024-09-27 15:25:05.074969] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.849 [2024-09-27 15:25:05.074979] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.849 [2024-09-27 15:25:05.075033] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.849 ctrlr pubkey: 00:20:33.850 00000000 d4 a4 c7 18 74 e8 fb b6 96 38 cd 90 49 a9 86 db ....t....8..I... 00:20:33.850 00000010 af a5 bd ae 24 3b 11 bc 91 f3 d7 72 24 1a 68 73 ....$;.....r$.hs 00:20:33.850 00000020 56 44 77 0b 4d 46 63 10 7f b4 0f 44 de 1a 38 fd VDw.MFc....D..8. 00:20:33.850 00000030 98 71 f7 f1 42 52 51 f1 2f 55 49 de 7a 4a 95 1d .q..BRQ./UI.zJ.. 00:20:33.850 00000040 c8 28 ac b1 88 8d ce ec ba fb 73 a7 69 3c 8c 77 .(........s.i<.w 00:20:33.850 00000050 71 d4 16 03 59 2b f1 4a d2 d7 6d 71 4f 9c e7 f5 q...Y+.J..mqO... 00:20:33.850 00000060 aa 2b 57 bc 40 61 d3 0c 2f 06 40 e6 a0 a6 01 9b .+W.@a../.@..... 00:20:33.850 00000070 fb e4 54 cd f3 98 11 5a 0b cb 85 1b f2 df 7e 5a ..T....Z......~Z 00:20:33.850 00000080 f4 21 aa b8 b7 b1 3d b1 46 3d e7 f6 c8 88 b5 f2 .!....=.F=...... 00:20:33.850 00000090 48 40 92 2b a7 22 2b 62 92 55 48 ae 9d 49 b8 0f H@.+."+b.UH..I.. 00:20:33.850 000000a0 18 3c 33 a9 6b 2a 72 81 ec c6 52 4f a8 07 11 00 .<3.k*r...RO.... 00:20:33.850 000000b0 52 7b 76 71 59 5c e4 c6 0f 13 a5 1b 03 ed 7c 49 R{vqY\........|I 00:20:33.850 000000c0 21 06 ba ca f8 92 0f 2e bb 6d 4e c1 2f 10 71 41 !........mN./.qA 00:20:33.850 000000d0 41 12 5b a6 b5 b5 40 69 e4 6d f1 5c 7c 13 e7 67 A.[...@i.m.\|..g 00:20:33.850 000000e0 b9 53 3f 9c 53 c3 8f e0 e3 7f ac 4b ff ed cb 22 .S?.S......K..." 00:20:33.850 000000f0 52 82 6f ae c1 7b 97 d3 48 47 e3 63 23 4e cd f0 R.o..{..HG.c#N.. 
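The "ctrlr pubkey" / "host pubkey" / "dh secret" dumps above are the finite-field Diffie-Hellman exchange underlying DH-HMAC-CHAP: each side publishes g^x mod p and combines the peer's public value with its own private exponent to reach the same shared secret. The following toy Python sketch shows only that arithmetic; the modulus and generator are deliberately tiny placeholders, not the RFC 7919 ffdhe2048/ffdhe3072 parameters this test actually negotiates, and nothing here mirrors SPDK's implementation.

    import secrets

    # Toy parameters for illustration only. dhgroup 1/2 in the log correspond to
    # 2048/3072-bit RFC 7919 groups, whose real primes are not reproduced here.
    P = 0xFFFFFFFB   # 2**32 - 5, a small prime stand-in
    G = 2

    def keypair():
        priv = secrets.randbelow(P - 2) + 1   # private exponent, kept local
        pub = pow(G, priv, P)                 # the value dumped as "pubkey" above
        return priv, pub

    host_priv, host_pub = keypair()
    ctrlr_priv, ctrlr_pub = keypair()

    # Both sides raise the peer's public value to their own private exponent
    # and land on the same number -- the "dh secret" shown in the dumps.
    assert pow(ctrlr_pub, host_priv, P) == pow(host_pub, ctrlr_priv, P)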
00:20:33.850 host pubkey: 00:20:33.850 00000000 a0 d0 3a e4 43 2d 09 5d 26 a1 32 c5 3c 2f 13 9a ..:.C-.]&.2........d..5..G| 00:20:33.850 00000020 f7 66 a4 6d b4 40 85 8b af d0 bb a2 f5 f1 86 d4 .f.m.@.......... 00:20:33.850 00000030 1c 90 98 0d 15 e2 aa 7f f7 ed de 4d 9c c8 59 16 ...........M..Y. 00:20:33.850 00000040 b8 a7 35 f9 59 be 7d 12 91 61 3a c2 5c 9a dc bd ..5.Y.}..a:.\... 00:20:33.850 00000050 a4 2a 30 7f 19 3b cd d2 f1 aa ac 4f 05 8f e0 8c .*0..;.....O.... 00:20:33.850 00000060 29 a4 08 d1 1a 4d 69 0c 13 5a 93 ec 2d 87 51 78 )....Mi..Z..-.Qx 00:20:33.850 00000070 81 28 f6 0d fb 0c 4a d4 bd 48 46 0e a4 a9 5e 27 .(....J..HF...^' 00:20:33.850 00000080 15 cd a3 9e 1e 92 2e fd ae a3 36 bf 81 05 1e b9 ..........6..... 00:20:33.850 00000090 94 b3 2d b7 cb 46 c2 87 1e 15 ea 18 8d 4c 9a dd ..-..F.......L.. 00:20:33.850 000000a0 ff 7d c6 6c c4 9b db c3 cf a2 ae 1c 00 86 e8 28 .}.l...........( 00:20:33.850 000000b0 79 f1 96 17 fe 92 c0 c0 18 c5 37 56 64 4b f9 4f y.........7VdK.O 00:20:33.850 000000c0 cc 83 26 56 e9 7b e7 17 40 09 3e 60 19 bf 2e 03 ..&V.{..@.>`.... 00:20:33.850 000000d0 0c 58 b3 82 2f a0 8d 2d a7 36 02 45 16 f6 42 bc .X../..-.6.E..B. 00:20:33.850 000000e0 cf 93 39 1b 7b 71 f0 bf d0 ca e1 2f 03 56 25 62 ..9.{q...../.V%b 00:20:33.850 000000f0 a8 78 8e 60 9c 5e f3 40 a8 fa 46 9e b6 96 bb 44 .x.`.^.@..F....D 00:20:33.850 dh secret: 00:20:33.850 00000000 9c 07 8b 7c cf 98 95 ac a4 c2 2b 80 48 0a c7 50 ...|......+.H..P 00:20:33.850 00000010 41 75 40 d2 4c f3 70 88 92 49 64 00 e8 3a f9 8f Au@.L.p..Id..:.. 00:20:33.850 00000020 c2 bb b4 27 f8 79 f8 6c 2f 8b d0 40 3d dd 3d 92 ...'.y.l/..@=.=. 00:20:33.850 00000030 32 b2 92 cb 7d 3c 27 64 fc a3 ff fe 5c a7 20 6a 2...}<'d....\. j 00:20:33.850 00000040 59 6c 30 b9 82 d2 7f d8 09 de 41 3b 39 9f e6 61 Yl0.......A;9..a 00:20:33.850 00000050 e8 33 df d8 72 f0 78 94 a2 72 d7 2a 78 f5 f9 52 .3..r.x..r.*x..R 00:20:33.850 00000060 42 59 f4 5b ac f3 6b 1b e6 98 d3 b9 0d 23 63 3f BY.[..k......#c? 00:20:33.850 00000070 1c 22 5d 09 b9 6e 0e 39 2c 0b 7c e1 35 57 be 08 ."]..n.9,.|.5W.. 00:20:33.850 00000080 e3 52 91 a9 5c 31 87 a5 c3 cf 2d 04 93 29 b7 d4 .R..\1....-..).. 00:20:33.850 00000090 83 59 4f 0a 97 79 37 72 fe f2 43 d2 ae 11 eb 7f .YO..y7r..C..... 00:20:33.850 000000a0 49 d7 ae 30 db 82 bb b0 2c 2f e4 56 88 4b b1 c1 I..0....,/.V.K.. 00:20:33.850 000000b0 33 77 61 4f 1c ad 53 fc 3b b8 b1 cb c1 68 b5 2b 3waO..S.;....h.+ 00:20:33.850 000000c0 a1 1f 06 0b f4 9c 55 94 9f 1a c2 e0 a1 43 cd 8c ......U......C.. 00:20:33.850 000000d0 6c 29 71 e5 00 7d 35 18 ae 5f 01 b0 0c 2e e9 b3 l)q..}5.._...... 00:20:33.850 000000e0 5e 87 db 1b 65 99 ff e3 14 3e ee ca f3 32 25 94 ^...e....>...2%. 00:20:33.850 000000f0 30 5f f0 7f 4c e0 89 77 bd 13 24 a5 c6 3d a5 85 0_..L..w..$..=.. 
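The key material above is printed as offset, sixteen hex bytes, and an ASCII column with '.' for non-printable bytes. A small Python helper that reproduces that layout (the format is inferred from the dumps themselves, not taken from SPDK's dumping code) can be handy when comparing captured buffers against these logs:

    def hexdump(data, width=16):
        """Render bytes in the offset / hex / ASCII layout used by the dumps above."""
        lines = []
        for off in range(0, len(data), width):
            chunk = data[off:off + width]
            hex_part = ' '.join(f'{b:02x}' for b in chunk)
            ascii_part = ''.join(chr(b) if 32 <= b < 127 else '.' for b in chunk)
            lines.append(f'{off:08x} {hex_part:<{width * 3 - 1}} {ascii_part}')
        return '\n'.join(lines)

    print(hexdump(bytes(range(32))))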
00:20:33.850 [2024-09-27 15:25:05.077667] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=1, dhgroup=1, seq=3428451702, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.850 [2024-09-27 15:25:05.077760] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.850 [2024-09-27 15:25:05.087128] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.850 [2024-09-27 15:25:05.087200] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.850 [2024-09-27 15:25:05.087210] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.850 [2024-09-27 15:25:05.087250] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.850 [2024-09-27 15:25:05.238475] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.850 [2024-09-27 15:25:05.238494] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.850 [2024-09-27 15:25:05.238501] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.850 [2024-09-27 15:25:05.238518] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.850 [2024-09-27 15:25:05.242501] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.850 ctrlr pubkey: 00:20:33.850 00000000 2f 21 3b 95 70 e8 bb f9 b5 7e bb 44 3d 04 73 72 /!;.p....~.D=.sr 00:20:33.850 00000010 31 16 20 a2 8a b7 6c 48 34 1a ed fe be d0 13 cb 1. ...lH4....... 00:20:33.850 00000020 b9 09 c5 da 29 39 32 80 e7 e0 fd 19 77 be a7 93 ....)92.....w... 00:20:33.850 00000030 47 1b b1 1b e9 52 19 57 c6 d2 aa 2e 85 6c ba 82 G....R.W.....l.. 00:20:33.850 00000040 95 67 c9 70 49 8e cb 48 f8 97 bf 91 e2 48 31 aa .g.pI..H.....H1. 00:20:33.850 00000050 53 27 cb cf 79 63 e3 54 c4 dd 0e 6f 6b 22 0e 28 S'..yc.T...ok".( 00:20:33.850 00000060 38 f4 ee 8e ea 29 3b d5 64 15 dc c6 dd f5 5c 54 8....);.d.....\T 00:20:33.850 00000070 aa 09 64 e6 e1 61 39 ee f9 61 4a 90 4b 84 bc 7c ..d..a9..aJ.K..| 00:20:33.850 00000080 04 60 70 b4 1f ec ab 36 d6 f3 18 0d 3c 33 91 85 .`p....6....<3.. 00:20:33.850 00000090 63 41 74 95 3f e2 48 08 be 78 0f 58 2d 50 d1 78 cAt.?.H..x.X-P.x 00:20:33.850 000000a0 d0 99 f7 a9 04 d0 7b 68 4c 7f e0 64 66 23 91 51 ......{hL..df#.Q 00:20:33.850 000000b0 32 2a fe 37 1c 9f 04 c6 62 08 00 36 b5 2a bc f4 2*.7....b..6.*.. 00:20:33.850 000000c0 cd fc 45 5b 1f b1 c5 0a 9b d6 ad 6e 71 cf db 96 ..E[.......nq... 00:20:33.850 000000d0 d4 7c d7 0c 54 74 0d c5 7c 98 fc 55 b5 85 3e 97 .|..Tt..|..U..>. 00:20:33.850 000000e0 22 f1 c7 df 3c 39 2a 29 6a e3 cb aa 06 ed 62 43 "...<9*)j.....bC 00:20:33.850 000000f0 d6 47 11 1d 5d 56 d1 38 4a 41 e7 8c 8f 45 df 81 .G..]V.8JA...E.. 00:20:33.850 host pubkey: 00:20:33.850 00000000 fb 45 a9 52 43 89 e7 17 29 d5 32 79 c0 59 98 c8 .E.RC...).2y.Y.. 
00:20:33.850 00000010 e9 c2 02 00 3c ea 48 9c 69 d3 09 e9 be 5d ac 47 ....<.H.i....].G 00:20:33.850 00000020 e0 bd e8 4a 22 9d 61 82 8b 25 9d b5 b3 f6 b8 d4 ...J".a..%...... 00:20:33.850 00000030 8d ae 6b bf 4f 53 4b f2 29 48 8c 91 8c 9a a8 69 ..k.OSK.)H.....i 00:20:33.850 00000040 86 2a af 5a cd b6 2e 76 fa 95 47 9a a5 9d ce 81 .*.Z...v..G..... 00:20:33.850 00000050 2b 58 51 c6 77 c5 c0 7e e2 11 f7 a3 5b 26 04 90 +XQ.w..~....[&.. 00:20:33.850 00000060 22 e8 71 43 dc ba da 62 d2 8c 8f 12 1d 98 1f be ".qC...b........ 00:20:33.850 00000070 62 dd 3e c0 b3 64 62 6c d0 38 98 77 42 34 df 23 b.>..dbl.8.wB4.# 00:20:33.850 00000080 df 97 44 fc 8f 4e b0 e4 cf 4a 9f 61 f1 0e 7b 92 ..D..N...J.a..{. 00:20:33.850 00000090 ab a6 c2 8b 8f 83 34 6a 79 40 91 13 92 6f 34 c1 ......4jy@...o4. 00:20:33.850 000000a0 98 71 8f f5 61 79 91 e0 fa 9d a6 24 d2 e0 3c c8 .q..ay.....$..<. 00:20:33.850 000000b0 10 4b 42 02 b7 56 4e 7b 7c 99 46 bf 33 c4 c1 7a .KB..VN{|.F.3..z 00:20:33.850 000000c0 4e 6c e4 4d 05 18 62 cb 95 83 b4 84 a7 81 68 eb Nl.M..b.......h. 00:20:33.850 000000d0 37 b4 9b e9 e6 14 66 3e 46 4d 30 3c 8c 25 60 18 7.....f>FM0<.%`. 00:20:33.850 000000e0 30 96 08 89 0c 74 23 87 c3 0e 77 1b b9 79 6a c6 0....t#...w..yj. 00:20:33.850 000000f0 d9 22 1b 16 8f c0 06 6b 05 09 8b af 62 39 e3 5a .".....k....b9.Z 00:20:33.850 dh secret: 00:20:33.850 00000000 08 36 f1 5d 0b 35 7b 12 83 94 f1 4d e3 2b 80 2b .6.].5{....M.+.+ 00:20:33.850 00000010 81 c5 74 85 50 69 ce a7 0b 77 76 d9 d6 d6 39 f8 ..t.Pi...wv...9. 00:20:33.850 00000020 3e 66 2a 8d e8 03 55 ec c3 83 60 36 27 b4 43 d1 >f*...U...`6'.C. 00:20:33.850 00000030 f5 22 d8 62 b5 05 98 59 0c 5a 24 01 0e 23 2c fe .".b...Y.Z$..#,. 00:20:33.850 00000040 d6 58 c2 8d 2a 3d cb ef 2f ab 61 e0 a5 07 30 fd .X..*=../.a...0. 00:20:33.850 00000050 1b 39 fb e5 fd 30 d2 af 6a ff a2 71 12 c8 dd 96 .9...0..j..q.... 00:20:33.850 00000060 5f 28 85 e6 51 a8 47 ea 4f 87 e7 e8 60 9e ce e1 _(..Q.G.O...`... 00:20:33.850 00000070 68 87 fd 1f 13 5d 6c 2b 91 a0 76 a9 3c 0d 4a bb h....]l+..v.<.J. 00:20:33.850 00000080 c0 be 88 45 c2 5d e5 52 91 db 4f de 9a 7f 65 6e ...E.].R..O...en 00:20:33.850 00000090 e1 41 a3 7f d3 74 a7 be d5 db 5a 34 5d 7f 38 96 .A...t....Z4].8. 00:20:33.850 000000a0 84 c6 a3 b3 18 68 0b 29 29 71 85 a8 20 78 a3 5a .....h.))q.. 
x.Z 00:20:33.850 000000b0 e0 44 13 c1 cc d3 1e 3b e5 b6 ab 98 c0 7b d6 6a .D.....;.....{.j 00:20:33.850 000000c0 82 0f 66 2e 64 18 f5 75 c4 35 3f 5b 0f 0c 89 58 ..f.d..u.5?[...X 00:20:33.850 000000d0 9e 06 d4 bf 70 13 fb 68 44 d4 1b 7c 6c cb 86 30 ....p..hD..|l..0 00:20:33.850 000000e0 bf 64 a0 6e 36 35 bf 11 2e 6a cf 26 92 61 63 76 .d.n65...j.&.acv 00:20:33.850 000000f0 e2 24 cb 8c 4c e3 e6 89 e6 24 86 ab 12 8e 01 4a .$..L....$.....J 00:20:33.850 [2024-09-27 15:25:05.245107] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=1, dhgroup=1, seq=3428451703, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.850 [2024-09-27 15:25:05.247827] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.850 [2024-09-27 15:25:05.247853] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.850 [2024-09-27 15:25:05.247870] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.850 [2024-09-27 15:25:05.247876] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.850 [2024-09-27 15:25:05.353217] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.851 [2024-09-27 15:25:05.353263] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.851 [2024-09-27 15:25:05.353284] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.851 [2024-09-27 15:25:05.353294] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.851 [2024-09-27 15:25:05.353356] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.851 ctrlr pubkey: 00:20:33.851 00000000 2f 21 3b 95 70 e8 bb f9 b5 7e bb 44 3d 04 73 72 /!;.p....~.D=.sr 00:20:33.851 00000010 31 16 20 a2 8a b7 6c 48 34 1a ed fe be d0 13 cb 1. ...lH4....... 00:20:33.851 00000020 b9 09 c5 da 29 39 32 80 e7 e0 fd 19 77 be a7 93 ....)92.....w... 00:20:33.851 00000030 47 1b b1 1b e9 52 19 57 c6 d2 aa 2e 85 6c ba 82 G....R.W.....l.. 00:20:33.851 00000040 95 67 c9 70 49 8e cb 48 f8 97 bf 91 e2 48 31 aa .g.pI..H.....H1. 00:20:33.851 00000050 53 27 cb cf 79 63 e3 54 c4 dd 0e 6f 6b 22 0e 28 S'..yc.T...ok".( 00:20:33.851 00000060 38 f4 ee 8e ea 29 3b d5 64 15 dc c6 dd f5 5c 54 8....);.d.....\T 00:20:33.851 00000070 aa 09 64 e6 e1 61 39 ee f9 61 4a 90 4b 84 bc 7c ..d..a9..aJ.K..| 00:20:33.851 00000080 04 60 70 b4 1f ec ab 36 d6 f3 18 0d 3c 33 91 85 .`p....6....<3.. 00:20:33.851 00000090 63 41 74 95 3f e2 48 08 be 78 0f 58 2d 50 d1 78 cAt.?.H..x.X-P.x 00:20:33.851 000000a0 d0 99 f7 a9 04 d0 7b 68 4c 7f e0 64 66 23 91 51 ......{hL..df#.Q 00:20:33.851 000000b0 32 2a fe 37 1c 9f 04 c6 62 08 00 36 b5 2a bc f4 2*.7....b..6.*.. 00:20:33.851 000000c0 cd fc 45 5b 1f b1 c5 0a 9b d6 ad 6e 71 cf db 96 ..E[.......nq... 00:20:33.851 000000d0 d4 7c d7 0c 54 74 0d c5 7c 98 fc 55 b5 85 3e 97 .|..Tt..|..U..>. 
00:20:33.851 000000e0 22 f1 c7 df 3c 39 2a 29 6a e3 cb aa 06 ed 62 43 "...<9*)j.....bC 00:20:33.851 000000f0 d6 47 11 1d 5d 56 d1 38 4a 41 e7 8c 8f 45 df 81 .G..]V.8JA...E.. 00:20:33.851 host pubkey: 00:20:33.851 00000000 a7 b1 8c 0e 01 69 b5 1d b3 4c 47 74 fa bf 13 85 .....i...LGt.... 00:20:33.851 00000010 cd 84 21 fc 5c 10 75 68 42 e4 51 66 a9 02 61 78 ..!.\.uhB.Qf..ax 00:20:33.851 00000020 eb 46 d1 35 fe 7d 00 54 af dd e5 55 7f 07 37 0a .F.5.}.T...U..7. 00:20:33.851 00000030 5f 6f c2 e8 ba 35 7d c0 54 b6 09 66 19 fc 2c 76 _o...5}.T..f..,v 00:20:33.851 00000040 c3 50 74 f1 c3 86 1f 04 74 f9 e2 62 4c 47 18 6d .Pt.....t..bLG.m 00:20:33.851 00000050 1f 31 ac df 2b 8c f1 a8 d7 ed 9c cd bc 6c 2b 8f .1..+........l+. 00:20:33.851 00000060 63 91 03 b7 11 5e e3 39 1f c0 36 fb 8a 06 18 fe c....^.9..6..... 00:20:33.851 00000070 57 7a c6 bd 2d 87 8e b4 47 2f 8d 74 e5 ef a4 f0 Wz..-...G/.t.... 00:20:33.851 00000080 08 28 62 22 c9 73 a7 e5 58 08 77 c4 ce c6 5d f5 .(b".s..X.w...]. 00:20:33.851 00000090 c5 44 f7 04 79 ff e0 bf 93 a2 c5 36 7c 4c 01 d6 .D..y......6|L.. 00:20:33.851 000000a0 02 97 47 ff 5c b5 c4 a5 cb 8c e7 29 c3 42 d9 03 ..G.\......).B.. 00:20:33.851 000000b0 ba cf b9 6e 0d 7e 33 27 5d 48 c4 0b 10 9d 74 0d ...n.~3']H....t. 00:20:33.851 000000c0 ad 07 4e 40 c1 33 5c 25 9d 76 fb 0c ae 6f 2b 12 ..N@.3\%.v...o+. 00:20:33.851 000000d0 ec ae 72 e2 2b e3 b5 b4 51 ca 9b b5 4b b7 bd b8 ..r.+...Q...K... 00:20:33.851 000000e0 51 ad 72 7b 8f 6f 21 06 32 ea c7 21 3e 51 e2 39 Q.r{.o!.2..!>Q.9 00:20:33.851 000000f0 1a 88 d9 af d9 78 68 69 92 00 c0 f4 c1 63 e8 bd .....xhi.....c.. 00:20:33.851 dh secret: 00:20:33.851 00000000 29 d6 87 b6 aa a2 fc 87 d1 f1 af 3f 6e d9 56 2b )..........?n.V+ 00:20:33.851 00000010 de 67 3e ab 91 48 83 95 84 d1 d1 b6 e4 2f 51 d5 .g>..H......./Q. 00:20:33.851 00000020 db d4 e6 87 ec 2c f1 f4 81 82 f6 f1 be 60 58 93 .....,.......`X. 00:20:33.851 00000030 ec d5 dc 06 f2 46 17 49 21 a3 07 2f a7 e8 2e 3a .....F.I!../...: 00:20:33.851 00000040 55 7e 31 16 11 6f 20 f2 fe 67 52 66 e5 a3 d3 78 U~1..o ..gRf...x 00:20:33.851 00000050 c0 ef 46 19 b1 be 22 37 bd f9 d7 30 24 ce 7c 44 ..F..."7...0$.|D 00:20:33.851 00000060 74 e7 07 43 8d 29 aa 60 42 d1 58 83 f5 01 c6 ae t..C.).`B.X..... 00:20:33.851 00000070 55 c4 9e 96 72 f3 44 3e b1 25 4f aa 82 ff 58 98 U...r.D>.%O...X. 00:20:33.851 00000080 ec 5b 6c d9 6b 0e 66 75 bf e7 2f 95 a9 30 b9 57 .[l.k.fu../..0.W 00:20:33.851 00000090 69 2d 59 66 8b 53 db 8a fd 0b e6 d9 39 3a b5 da i-Yf.S......9:.. 00:20:33.851 000000a0 8c f0 17 ef 40 52 f9 86 73 e0 ea 70 89 8d c8 d4 ....@R..s..p.... 00:20:33.851 000000b0 e8 66 6d e4 e8 0c e6 0f b4 f9 a5 24 fe 58 52 57 .fm........$.XRW 00:20:33.851 000000c0 bb bc 3c 6e 87 de 7b 65 b7 35 d9 b5 5d 4b f3 c5 ..7..W..V..5.. 00:20:33.851 000000f0 76 17 42 a7 8b 02 e3 5f b8 b4 f2 0a 2d 1c 71 cb v.B...._....-.q. 
00:20:33.851 [2024-09-27 15:25:05.355983] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=1, dhgroup=1, seq=3428451704, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.851 [2024-09-27 15:25:05.356044] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.851 [2024-09-27 15:25:05.365346] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.851 [2024-09-27 15:25:05.365392] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.851 [2024-09-27 15:25:05.365399] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.851 [2024-09-27 15:25:05.522887] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.851 [2024-09-27 15:25:05.522907] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.851 [2024-09-27 15:25:05.522914] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.851 [2024-09-27 15:25:05.522959] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.851 [2024-09-27 15:25:05.522983] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.851 ctrlr pubkey: 00:20:33.851 00000000 c5 df 5b 74 5f 52 df 1f 64 b7 b1 83 eb 86 c6 2c ..[t_R..d......, 00:20:33.851 00000010 6c c6 1f a7 b6 ba f8 5e 76 b6 d2 a1 5f 68 e9 fb l......^v..._h.. 00:20:33.851 00000020 45 04 46 e2 13 c3 91 b4 fd af d7 6e f3 37 e1 76 E.F........n.7.v 00:20:33.851 00000030 1d 44 43 c0 c2 e4 1d 6e 86 9d 01 17 6c ba c1 aa .DC....n....l... 00:20:33.851 00000040 12 e9 3f 3a d0 00 15 c7 43 dd 15 f1 91 85 b6 ab ..?:....C....... 00:20:33.851 00000050 4c 64 b6 ec 9d 45 55 48 5a dc ec 6f b4 ab e2 d7 Ld...EUHZ..o.... 00:20:33.851 00000060 fe 11 93 76 ba 99 6d 75 38 88 2e ed 40 2f 74 01 ...v..mu8...@/t. 00:20:33.851 00000070 c9 3c 9b 04 09 8f e5 32 fb 50 f6 0b c5 7e b3 82 .<.....2.P...~.. 00:20:33.851 00000080 ee 27 4e eb 74 0f 76 4c 9a ee 58 e6 a1 74 2c f8 .'N.t.vL..X..t,. 00:20:33.851 00000090 97 7e 2c 5e 1c eb 16 9f a0 51 c4 26 55 d9 81 2d .~,^.....Q.&U..- 00:20:33.851 000000a0 44 6d 5b b5 18 be 54 88 c1 57 61 c6 b1 8a 6d 7b Dm[...T..Wa...m{ 00:20:33.851 000000b0 20 c6 fe 0c 44 57 71 95 46 4d 21 fd c9 c3 29 54 ...DWq.FM!...)T 00:20:33.851 000000c0 3c 88 4a 39 c4 58 1c de ed 54 c3 69 38 02 d2 a4 <.J9.X...T.i8... 00:20:33.851 000000d0 01 e8 53 b6 08 06 c1 d7 46 9d 33 ce 93 3f c0 40 ..S.....F.3..?.@ 00:20:33.851 000000e0 24 86 d8 5b f1 0d 78 9f cc 34 08 a4 c9 e8 8e b6 $..[..x..4...... 00:20:33.851 000000f0 21 64 6c a1 25 f0 5c f6 06 f8 2d d6 27 f5 04 8e !dl.%.\...-.'... 00:20:33.851 00000100 14 32 7a ac 15 8e 90 53 bf e4 71 56 1c 92 0d 63 .2z....S..qV...c 00:20:33.851 00000110 a0 1f c8 20 dd 72 3e d6 34 ad f1 3f 0d 1e 11 09 ... .r>.4..?.... 
00:20:33.851 00000120 5f df 2e 45 20 3e 88 61 f1 95 02 44 07 9f 89 7d _..E >.a...D...} 00:20:33.851 00000130 99 1c 76 48 35 3b e1 4e e3 83 a8 b5 7c 11 23 73 ..vH5;.N....|.#s 00:20:33.851 00000140 9c 4d dd 89 2c 80 65 61 4f 93 99 d1 f1 7c e6 94 .M..,.eaO....|.. 00:20:33.851 00000150 4e f2 fc 59 fe 85 f8 a9 40 f1 09 c2 93 fb 7b c7 N..Y....@.....{. 00:20:33.851 00000160 ec e5 b6 c2 49 66 82 0c d9 d3 d7 2d 91 87 0a bc ....If.....-.... 00:20:33.851 00000170 81 4a c2 86 5d 80 15 eb a2 ed a2 06 67 70 66 77 .J..].......gpfw 00:20:33.851 host pubkey: 00:20:33.851 00000000 c3 6f 5d fc a0 2b b8 0f 8e 46 ba 25 43 f7 63 ca .o]..+...F.%C.c. 00:20:33.851 00000010 50 66 ae 1d b4 df 67 d4 df 64 69 cc ac 7d dd cb Pf....g..di..}.. 00:20:33.851 00000020 db 4a 2d 8b 42 80 76 af 16 b2 18 04 9f 1c 20 48 .J-.B.v....... H 00:20:33.851 00000030 f9 b4 a0 25 b5 55 d1 79 05 0f 2f 62 56 a5 22 48 ...%.U.y../bV."H 00:20:33.851 00000040 fd 19 34 7b 93 29 60 7c 5d 36 89 8c 0b 93 11 78 ..4{.)`|]6.....x 00:20:33.851 00000050 62 b9 26 22 81 b6 77 a4 6f 43 ce 60 1d d7 ba 6d b.&"..w.oC.`...m 00:20:33.851 00000060 d9 c6 ad bc a0 2c 7c d6 d7 69 b0 83 90 bd d9 b0 .....,|..i...... 00:20:33.851 00000070 2c a5 0d 4e 9c d4 d3 d1 98 a3 21 80 72 ba ce f5 ,..N......!.r... 00:20:33.851 00000080 79 a2 c9 6b 12 8a 46 f2 dd b7 80 0b dc 75 0f de y..k..F......u.. 00:20:33.851 00000090 a2 ad 62 57 83 77 d1 ba 15 32 6d ee d0 71 6e b8 ..bW.w...2m..qn. 00:20:33.851 000000a0 0a 2b c1 02 fc f1 92 ca 92 2d 33 eb b0 18 51 a5 .+.......-3...Q. 00:20:33.851 000000b0 0e b2 72 2a 7d 14 8f 81 97 ba 0e a1 04 a4 7c 18 ..r*}.........|. 00:20:33.851 000000c0 30 c1 54 1b d3 84 03 96 cf 50 1b 40 a5 91 93 a7 0.T......P.@.... 00:20:33.851 000000d0 5c 99 12 e4 84 9d 11 87 60 96 44 1f 65 48 63 6c \.......`.D.eHcl 00:20:33.851 000000e0 68 cf 51 90 f4 9c 23 f2 cc 12 50 ed 81 73 ca 53 h.Q...#...P..s.S 00:20:33.851 000000f0 ea 9c 61 8c 61 a8 6c a7 c9 d4 b3 c6 b5 1a f4 7c ..a.a.l........| 00:20:33.851 00000100 06 a5 a8 4d 86 46 2c 54 05 63 99 30 fe f6 ac 8b ...M.F,T.c.0.... 00:20:33.851 00000110 3f d7 e4 a6 97 0a f4 cf c7 a4 40 ff c9 96 7a 0e ?.........@...z. 00:20:33.851 00000120 8e 6d 56 22 71 dc 71 ef 5c 3a 0d 4b 8f eb 79 9a .mV"q.q.\:.K..y. 00:20:33.852 00000130 b6 5d b7 2d f8 21 53 91 55 6d 78 b1 24 05 26 c4 .].-.!S.Umx.$.&. 00:20:33.852 00000140 c9 14 64 fe 5d ab ca 6e aa 80 62 8e 1b 4e 21 88 ..d.]..n..b..N!. 00:20:33.852 00000150 d7 ee 64 14 ec 92 2c b3 4c 43 df c1 02 ed 19 d2 ..d...,.LC...... 00:20:33.852 00000160 00 90 5c b5 b5 74 a9 3c a8 30 29 c0 65 a0 cd 1b ..\..t.<.0).e... 00:20:33.852 00000170 06 6b 3d cd 2f 2b 6c ee bb d9 f6 40 61 2c b4 8f .k=./+l....@a,.. 00:20:33.852 dh secret: 00:20:33.852 00000000 e7 26 0c d8 11 db 54 52 ff 65 c8 25 d7 01 8a e9 .&....TR.e.%.... 00:20:33.852 00000010 31 66 e5 4c 76 1e 1c e6 7e 09 a5 bd cf 2b 0b 36 1f.Lv...~....+.6 00:20:33.852 00000020 83 62 da 83 e1 fc d1 c6 de 62 62 8b 14 1d 10 4d .b.......bb....M 00:20:33.852 00000030 67 69 4b 33 71 58 3f eb c6 23 ff 31 8f 9b e2 87 giK3qX?..#.1.... 00:20:33.852 00000040 17 a9 94 b2 9d bf b6 3b 55 39 97 b3 0f 64 c1 9b .......;U9...d.. 00:20:33.852 00000050 e7 04 d2 81 01 ce 14 43 03 a2 3b b8 61 c1 36 88 .......C..;.a.6. 00:20:33.852 00000060 54 1d 26 d6 c7 84 1a e5 6a 2f 5a c5 44 0b 35 6f T.&.....j/Z.D.5o 00:20:33.852 00000070 d8 96 7e 7b bd a1 c2 98 1f 90 ce 15 fc 17 05 a1 ..~{............ 00:20:33.852 00000080 72 8a 18 61 24 e0 ec 8e af 2f 86 2a 74 62 0d 7f r..a$..../.*tb.. 00:20:33.852 00000090 9f 3a 2d 45 31 1b b2 81 df 1f 0e d3 2b 22 14 d5 .:-E1.......+".. 
00:20:33.852 000000a0 ad 1e a0 ce 46 cc f5 23 ca dd 7f 33 b9 3b 4f 22 ....F..#...3.;O" 00:20:33.852 000000b0 17 10 25 c8 cb b7 70 eb 9c f3 86 f3 88 4e e2 47 ..%...p......N.G 00:20:33.852 000000c0 90 f1 87 9b d0 c9 ab 23 9e 43 0b 4c 14 b0 f2 26 .......#.C.L...& 00:20:33.852 000000d0 7f e2 bd f9 89 3d cf 16 96 84 39 27 fd ce f9 0d .....=....9'.... 00:20:33.852 000000e0 86 28 bb d5 15 f3 ba 6e 32 cf 19 b6 be 0e 34 d9 .(.....n2.....4. 00:20:33.852 000000f0 8d a8 3c 42 2f 5c cd 2a 2c f1 b4 69 4a 1a c8 01 ...`.k.hI...... 00:20:33.852 00000170 db b5 e0 7d 1e e3 e6 d6 ba 68 8b af 08 59 16 c4 ...}.....h...Y.. 00:20:33.852 [2024-09-27 15:25:05.530116] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=1, dhgroup=2, seq=3428451705, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.852 [2024-09-27 15:25:05.535262] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.852 [2024-09-27 15:25:05.535303] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.852 [2024-09-27 15:25:05.535319] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.852 [2024-09-27 15:25:05.535334] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.852 [2024-09-27 15:25:05.535356] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.852 [2024-09-27 15:25:05.641437] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.852 [2024-09-27 15:25:05.641454] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.852 [2024-09-27 15:25:05.641465] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.852 [2024-09-27 15:25:05.641475] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.852 [2024-09-27 15:25:05.641529] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.852 ctrlr pubkey: 00:20:33.852 00000000 c5 df 5b 74 5f 52 df 1f 64 b7 b1 83 eb 86 c6 2c ..[t_R..d......, 00:20:33.852 00000010 6c c6 1f a7 b6 ba f8 5e 76 b6 d2 a1 5f 68 e9 fb l......^v..._h.. 00:20:33.852 00000020 45 04 46 e2 13 c3 91 b4 fd af d7 6e f3 37 e1 76 E.F........n.7.v 00:20:33.852 00000030 1d 44 43 c0 c2 e4 1d 6e 86 9d 01 17 6c ba c1 aa .DC....n....l... 00:20:33.852 00000040 12 e9 3f 3a d0 00 15 c7 43 dd 15 f1 91 85 b6 ab ..?:....C....... 00:20:33.852 00000050 4c 64 b6 ec 9d 45 55 48 5a dc ec 6f b4 ab e2 d7 Ld...EUHZ..o.... 00:20:33.852 00000060 fe 11 93 76 ba 99 6d 75 38 88 2e ed 40 2f 74 01 ...v..mu8...@/t. 00:20:33.852 00000070 c9 3c 9b 04 09 8f e5 32 fb 50 f6 0b c5 7e b3 82 .<.....2.P...~.. 00:20:33.852 00000080 ee 27 4e eb 74 0f 76 4c 9a ee 58 e6 a1 74 2c f8 .'N.t.vL..X..t,. 
00:20:33.852 00000090 97 7e 2c 5e 1c eb 16 9f a0 51 c4 26 55 d9 81 2d .~,^.....Q.&U..- 00:20:33.852 000000a0 44 6d 5b b5 18 be 54 88 c1 57 61 c6 b1 8a 6d 7b Dm[...T..Wa...m{ 00:20:33.852 000000b0 20 c6 fe 0c 44 57 71 95 46 4d 21 fd c9 c3 29 54 ...DWq.FM!...)T 00:20:33.852 000000c0 3c 88 4a 39 c4 58 1c de ed 54 c3 69 38 02 d2 a4 <.J9.X...T.i8... 00:20:33.852 000000d0 01 e8 53 b6 08 06 c1 d7 46 9d 33 ce 93 3f c0 40 ..S.....F.3..?.@ 00:20:33.852 000000e0 24 86 d8 5b f1 0d 78 9f cc 34 08 a4 c9 e8 8e b6 $..[..x..4...... 00:20:33.852 000000f0 21 64 6c a1 25 f0 5c f6 06 f8 2d d6 27 f5 04 8e !dl.%.\...-.'... 00:20:33.852 00000100 14 32 7a ac 15 8e 90 53 bf e4 71 56 1c 92 0d 63 .2z....S..qV...c 00:20:33.852 00000110 a0 1f c8 20 dd 72 3e d6 34 ad f1 3f 0d 1e 11 09 ... .r>.4..?.... 00:20:33.852 00000120 5f df 2e 45 20 3e 88 61 f1 95 02 44 07 9f 89 7d _..E >.a...D...} 00:20:33.852 00000130 99 1c 76 48 35 3b e1 4e e3 83 a8 b5 7c 11 23 73 ..vH5;.N....|.#s 00:20:33.852 00000140 9c 4d dd 89 2c 80 65 61 4f 93 99 d1 f1 7c e6 94 .M..,.eaO....|.. 00:20:33.852 00000150 4e f2 fc 59 fe 85 f8 a9 40 f1 09 c2 93 fb 7b c7 N..Y....@.....{. 00:20:33.852 00000160 ec e5 b6 c2 49 66 82 0c d9 d3 d7 2d 91 87 0a bc ....If.....-.... 00:20:33.852 00000170 81 4a c2 86 5d 80 15 eb a2 ed a2 06 67 70 66 77 .J..].......gpfw 00:20:33.852 host pubkey: 00:20:33.852 00000000 21 83 f6 f1 f6 27 7c 9e 47 dd 70 70 e8 68 61 ff !....'|.G.pp.ha. 00:20:33.852 00000010 a4 3c 15 a2 80 eb 7b 8b 81 0f 2d 97 a7 b1 3a 92 .<....{...-...:. 00:20:33.852 00000020 dc 55 93 6b 04 f6 9f ca db 1f 35 97 dd 98 d2 d7 .U.k......5..... 00:20:33.852 00000030 43 03 d2 1c f9 c4 f0 6e e4 5f 11 bb c1 39 c1 2c C......n._...9., 00:20:33.852 00000040 1b 81 63 08 b1 f3 ac 74 35 48 81 25 ee 22 3c a1 ..c....t5H.%."<. 00:20:33.852 00000050 ae d0 73 18 d9 89 6f 42 14 db 4c 1c 16 3a 99 c5 ..s...oB..L..:.. 00:20:33.852 00000060 ea 9a 47 37 7c 9c 34 b9 e5 a9 5b d4 ae 55 31 97 ..G7|.4...[..U1. 00:20:33.852 00000070 f8 59 e9 a1 1a 69 89 80 ba e1 c0 e5 67 00 81 f7 .Y...i......g... 00:20:33.852 00000080 11 17 0c e7 61 46 f7 dd 1b e1 94 a1 40 12 db b3 ....aF......@... 00:20:33.852 00000090 52 75 f8 8a b1 ec 72 3c 19 06 af 78 ff ee 4b 80 Ru....r<...x..K. 00:20:33.852 000000a0 1a c6 85 be 69 e6 94 00 70 f2 8b 97 0d d7 6b 5e ....i...p.....k^ 00:20:33.852 000000b0 f7 bc d9 b6 30 70 2f 59 91 95 d1 33 d6 f6 4d 45 ....0p/Y...3..ME 00:20:33.852 000000c0 a8 fe bc 7d 38 f6 28 4c c0 42 ab db 04 dd ce 3d ...}8.(L.B.....= 00:20:33.852 000000d0 1f 03 31 e9 1d ea 92 cd 8a f9 ad 7b 6b fc 41 1f ..1........{k.A. 00:20:33.852 000000e0 eb e6 3f 8f ed 56 64 7b 45 38 d1 d9 e8 af c3 12 ..?..Vd{E8...... 00:20:33.852 000000f0 4d fd 42 4f 5e a4 8b 1b 19 5e 9a af d1 d0 07 c5 M.BO^....^...... 00:20:33.852 00000100 ca 24 97 80 41 c4 98 79 45 8c 85 1f 23 d1 ea e4 .$..A..yE...#... 00:20:33.852 00000110 8a d5 fc 9f 30 42 46 22 dc cc fb 43 04 1f c8 95 ....0BF"...C.... 00:20:33.852 00000120 0c 1b 24 0c c7 4b 8b af 47 8d 48 6c 8f 34 58 cf ..$..K..G.Hl.4X. 00:20:33.852 00000130 ff 40 e7 bc 95 6e 1d 8e d9 5c 44 a0 a2 ed e6 3d .@...n...\D....= 00:20:33.852 00000140 89 f0 38 6d 8e b8 70 fa fd fa 3f 3f 41 b1 29 69 ..8m..p...??A.)i 00:20:33.852 00000150 cc 3b 9a 6e db a4 2a 9f 7b ab 39 36 5e 06 d9 7d .;.n..*.{.96^..} 00:20:33.852 00000160 8e 52 74 54 a4 52 2f 25 63 c7 6a 4f c0 7b a3 67 .RtT.R/%c.jO.{.g 00:20:33.852 00000170 88 9e f9 75 98 b9 53 41 3e ce 1f 6f 56 8f 13 c2 ...u..SA>..oV... 
00:20:33.852 dh secret: 00:20:33.852 00000000 68 92 03 cb 64 dd 13 b9 b0 1f 45 ae d1 f3 2e 32 h...d.....E....2 00:20:33.852 00000010 da e1 39 f6 f6 85 ae 3a 0b 48 81 0c 9e ff 51 4e ..9....:.H....QN 00:20:33.852 00000020 74 fe bb be de 3d 97 ce ad 95 c3 a5 09 92 61 b5 t....=........a. 00:20:33.852 00000030 c6 64 e2 05 b1 27 3b ef b0 a6 34 c8 66 5e 29 18 .d...';...4.f^). 00:20:33.852 00000040 24 a4 71 37 05 aa 4a b5 76 a0 fc 02 6e 0c ed 63 $.q7..J.v...n..c 00:20:33.852 00000050 9d 1b 02 ae 06 44 7a ae 97 51 41 7e b0 68 53 3a .....Dz..QA~.hS: 00:20:33.852 00000060 07 6f cd 96 4e 69 01 2c df 4f 64 6d e2 c9 72 5d .o..Ni.,.Odm..r] 00:20:33.852 00000070 a8 a2 83 51 4e e8 15 16 c6 de 1b 5c dd ba cf 8d ...QN......\.... 00:20:33.852 00000080 e5 64 06 87 cc b7 44 94 03 0a d5 12 dd 60 ec ae .d....D......`.. 00:20:33.852 00000090 47 11 e9 b4 db f2 36 66 3b fa c4 c6 21 da ea 0c G.....6f;...!... 00:20:33.852 000000a0 bb bb eb e3 14 2b a0 99 9c 5f 0f 9a e4 59 91 8e .....+..._...Y.. 00:20:33.852 000000b0 15 e9 86 bc 06 f0 dc c4 23 20 41 e8 a7 72 74 6a ........# A..rtj 00:20:33.852 000000c0 0a c5 91 62 10 8e bc 55 a2 ab 40 02 02 66 19 ae ...b...U..@..f.. 00:20:33.852 000000d0 df 62 28 f6 3f b0 2b c8 67 67 e3 31 5c 25 a0 0c .b(.?.+.gg.1\%.. 00:20:33.852 000000e0 c9 51 a4 0a 89 51 95 7c 13 85 82 d3 66 d9 9e 81 .Q...Q.|....f... 00:20:33.852 000000f0 41 42 dd 1e 1f 84 9d 39 af 83 f2 02 da 05 78 d0 AB.....9......x. 00:20:33.852 00000100 18 b3 95 d2 7a b7 df 81 08 d7 59 e3 a6 e9 51 5f ....z.....Y...Q_ 00:20:33.852 00000110 3f fd 5f af e2 93 65 f5 65 09 c7 a2 4a 40 10 54 ?._...e.e...J@.T 00:20:33.852 00000120 e5 6e 2c 0d 49 b6 fd af 50 2a 48 de 83 c3 8a fc .n,.I...P*H..... 00:20:33.852 00000130 84 ab 70 13 15 b5 6a 9e 37 a6 a4 7e fb 20 c3 e8 ..p...j.7..~. .. 00:20:33.852 00000140 0b b3 e7 ab 97 1e 0e d2 3f e4 ff 27 8c 90 9c 46 ........?..'...F 00:20:33.853 00000150 07 d7 63 b4 af 63 bc b7 bc cc c5 cb ff a1 84 a8 ..c..c.......... 00:20:33.853 00000160 64 62 77 f6 c7 24 18 3b bb ed f9 1e 99 50 a2 72 dbw..$.;.....P.r 00:20:33.853 00000170 28 8a 4e 77 f3 35 26 5a f7 fb ee 75 1e 5e e7 b6 (.Nw.5&Z...u.^.. 
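For reference, the nvme_auth_send_negotiate lines in this run only ever print digest 1 (sha256) and dhgroup 1 (ffdhe2048) or 2 (ffdhe3072). The lookup below reflects exactly those identifier/name pairs as printed in the log; the NVMe-oF authentication spec defines further values, but they do not occur here and are not listed.

    # Only the identifiers actually printed by nvme_auth_send_negotiate in this log.
    DIGESTS = {1: "sha256"}
    DH_GROUPS = {1: "ffdhe2048", 2: "ffdhe3072"}

    def describe(digest_id, dhgroup_id):
        return (f"digest: {digest_id} ({DIGESTS.get(digest_id, 'unknown')}), "
                f"dhgroup: {dhgroup_id} ({DH_GROUPS.get(dhgroup_id, 'unknown')})")

    print(describe(1, 2))   # digest: 1 (sha256), dhgroup: 2 (ffdhe3072)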
00:20:33.853 [2024-09-27 15:25:05.648714] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=1, dhgroup=2, seq=3428451706, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.853 [2024-09-27 15:25:05.648808] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.853 [2024-09-27 15:25:05.664680] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.853 [2024-09-27 15:25:05.664743] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.853 [2024-09-27 15:25:05.664753] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.853 [2024-09-27 15:25:05.664786] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.853 [2024-09-27 15:25:05.821459] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.853 [2024-09-27 15:25:05.821478] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.853 [2024-09-27 15:25:05.821485] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.853 [2024-09-27 15:25:05.821530] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.853 [2024-09-27 15:25:05.821552] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.853 ctrlr pubkey: 00:20:33.853 00000000 e7 df f9 23 16 fd a5 90 7d ca 3d 88 95 42 1d c9 ...#....}.=..B.. 00:20:33.853 00000010 c1 ec 5d 63 13 ff 48 03 0a 92 a3 a8 7c 99 52 a1 ..]c..H.....|.R. 00:20:33.853 00000020 4c f2 03 66 2a 99 77 15 b2 f7 9f b9 a3 ad 55 cf L..f*.w.......U. 00:20:33.853 00000030 4f d1 ed 65 37 f4 87 51 a0 30 03 6a e1 b3 0d e7 O..e7..Q.0.j.... 00:20:33.853 00000040 39 08 a7 fe 61 8c d2 d4 4b 05 36 d8 2f 60 b8 79 9...a...K.6./`.y 00:20:33.853 00000050 ba bd ec 37 3d 97 7f 47 6b 56 58 e2 20 67 0f 0d ...7=..GkVX. g.. 00:20:33.853 00000060 4c de 8e a2 8e 91 f2 47 a2 96 ca e9 d5 8a db df L......G........ 00:20:33.853 00000070 a1 4f d6 08 df 44 d3 ff 4c 5f a6 18 6c 74 c7 4d .O...D..L_..lt.M 00:20:33.853 00000080 10 60 90 25 f9 19 57 17 8e b0 f0 9a 52 3c c0 f1 .`.%..W.....R<.. 00:20:33.853 00000090 95 e3 84 43 21 f5 e9 8a f0 5f 36 a3 28 5d 2c da ...C!...._6.(],. 00:20:33.853 000000a0 77 bf 20 33 8a 0e 3d 08 10 83 f4 a6 4f 37 ca 81 w. 3..=.....O7.. 00:20:33.853 000000b0 6f dd ca a7 a5 ac ab b7 15 95 21 52 71 92 15 5d o.........!Rq..] 00:20:33.853 000000c0 18 d2 d9 49 5d d0 3d af 8d ca a7 89 f9 24 bb 04 ...I].=......$.. 00:20:33.853 000000d0 e2 da 51 cc 90 f6 e6 a8 71 ed 9c 53 fb 55 a7 a9 ..Q.....q..S.U.. 00:20:33.853 000000e0 07 63 e5 7d 11 9e 4d 50 d0 96 b8 84 58 87 18 f2 .c.}..MP....X... 00:20:33.853 000000f0 ef cd 37 ec f8 f0 a8 c0 81 9e 41 90 f5 f8 5b 5c ..7.......A...[\ 00:20:33.853 00000100 3b ed 2a 77 9a 63 38 2d 47 a1 bd c4 a0 b3 5f 86 ;.*w.c8-G....._. 
00:20:33.853 00000110 ea 0a 8b 5e 44 06 8d b2 10 f4 4d 28 e6 a3 5f fc ...^D.....M(.._. 00:20:33.853 00000120 1b c8 86 ca 95 ca 97 c5 c9 fd 73 6e 55 5f 93 8b ..........snU_.. 00:20:33.853 00000130 ae 5f f8 f4 ba e3 a1 6d 12 ee 95 97 93 4b 64 2a ._.....m.....Kd* 00:20:33.853 00000140 ae 1c 06 34 48 c4 2c c3 1b c3 a4 78 4e ad 99 c5 ...4H.,....xN... 00:20:33.853 00000150 68 80 0b 42 84 fc c5 d4 7c 65 a0 fe de 2e a8 b9 h..B....|e...... 00:20:33.853 00000160 9d f1 a2 68 87 d8 14 11 8a 88 2a 89 03 f0 3b 63 ...h......*...;c 00:20:33.853 00000170 14 ad 3b f0 ff 9c 63 77 cf 35 28 2b b9 d8 a0 a7 ..;...cw.5(+.... 00:20:33.853 host pubkey: 00:20:33.853 00000000 b1 86 a0 38 3b ad 83 49 7d 8f f5 95 c3 2f 1f 41 ...8;..I}..../.A 00:20:33.853 00000010 13 93 40 ec 3e 27 bd e3 9b 15 ac c4 70 3e 27 b5 ..@.>'......p>'. 00:20:33.853 00000020 4d 4c b5 87 c0 47 5f 48 f5 7a 41 80 5f 8d 48 f5 ML...G_H.zA._.H. 00:20:33.853 00000030 b2 8b 9b d4 26 67 cb 08 ec 9d 5c 24 45 e8 e6 24 ....&g....\$E..$ 00:20:33.853 00000040 e2 eb 7e cf 03 53 1f b0 19 36 3f bb 3b 9a c9 1d ..~..S...6?.;... 00:20:33.853 00000050 bb 86 f8 2a ae 9e 06 5d ab e2 36 9f a4 53 42 5b ...*...]..6..SB[ 00:20:33.853 00000060 4a 1b 84 ab 9a a2 70 29 53 4d 85 44 60 43 6a a3 J.....p)SM.D`Cj. 00:20:33.853 00000070 24 47 f1 4c 9f dc a0 c8 28 41 70 da d5 1d f1 58 $G.L....(Ap....X 00:20:33.853 00000080 c2 fc da 91 3e 17 f3 d1 ce 94 33 2b c2 4b 55 05 ....>.....3+.KU. 00:20:33.853 00000090 bd bf 8a d6 b7 e0 dc cf cd 59 3d 9f 23 fd 68 c8 .........Y=.#.h. 00:20:33.853 000000a0 11 2b 6c a2 29 94 96 5b 09 2c cd 45 05 75 02 61 .+l.)..[.,.E.u.a 00:20:33.853 000000b0 83 2e 91 c6 a0 d3 92 90 a9 2f 63 cf 22 24 1c 45 ........./c."$.E 00:20:33.853 000000c0 25 7d 91 56 21 5a 34 7d 2b be b5 a1 f1 09 d9 b8 %}.V!Z4}+....... 00:20:33.853 000000d0 b5 90 9c 23 7e 95 7c d8 67 f0 12 9a 8e 2a a3 39 ...#~.|.g....*.9 00:20:33.853 000000e0 72 1c b4 d4 ec ae fc 73 7d 22 cb c2 44 8a c0 c9 r......s}"..D... 00:20:33.853 000000f0 a7 1b 47 c7 78 66 10 87 e4 99 b3 71 5e 8c dd 75 ..G.xf.....q^..u 00:20:33.853 00000100 2d 2a e2 7d 17 b5 3d b2 83 71 54 58 d2 9f a8 35 -*.}..=..qTX...5 00:20:33.853 00000110 48 0a d5 fd 0f 84 15 8e 2d 12 4a 7f 50 da dd 58 H.......-.J.P..X 00:20:33.853 00000120 ba 92 43 a2 c0 cf ec 1e 75 f9 28 41 7a 71 ce 52 ..C.....u.(Azq.R 00:20:33.853 00000130 13 7a 82 88 22 f8 23 7e b8 7e 65 64 15 cc 77 1b .z..".#~.~ed..w. 00:20:33.853 00000140 60 ca 59 be b5 ad d9 99 fd 44 1a 8d 31 08 a4 87 `.Y......D..1... 00:20:33.853 00000150 b7 b5 d3 17 cc 3d 36 b4 d5 89 45 6a 12 4f 89 f0 .....=6...Ej.O.. 00:20:33.853 00000160 00 0f 6d bb 76 aa 02 3f 27 b3 7a 19 29 d8 12 4b ..m.v..?'.z.)..K 00:20:33.853 00000170 3d 92 a6 e7 69 2a ce a0 31 57 2a ae c6 28 c7 a8 =...i*..1W*..(.. 00:20:33.853 dh secret: 00:20:33.853 00000000 01 47 c1 a4 13 2c 8d e2 f2 49 49 44 a4 47 7b ed .G...,...IID.G{. 00:20:33.853 00000010 b1 8f d2 a5 57 f2 69 8c cb 8a 44 12 bc 80 57 a7 ....W.i...D...W. 00:20:33.853 00000020 f7 99 67 1b f2 59 b7 aa bc 02 8c a1 db 47 ec c1 ..g..Y.......G.. 00:20:33.853 00000030 f1 08 3c 00 02 e3 a4 25 b2 74 55 46 e6 f2 c7 60 ..<....%.tUF...` 00:20:33.853 00000040 82 75 55 f6 8e 1a 04 1f 80 4f b4 eb b4 24 ac d4 .uU......O...$.. 00:20:33.853 00000050 72 27 9f fd 8b 3c 30 b4 99 22 f4 11 ef 82 ba 84 r'...<0.."...... 00:20:33.853 00000060 03 56 f3 57 57 d8 80 fa f2 88 8c 75 51 23 95 59 .V.WW......uQ#.Y 00:20:33.853 00000070 be 0b a7 2b 43 57 58 f1 9e 0a 50 de ca 0a fb 05 ...+CWX...P..... 00:20:33.853 00000080 70 34 cb 75 aa b3 1e 83 81 e0 82 76 5c f1 98 e9 p4.u.......v\... 
00:20:33.853 00000090 14 37 5f 38 63 a4 98 28 d9 0f 40 91 51 8a 52 31 .7_8c..(..@.Q.R1 00:20:33.853 000000a0 29 01 ce 20 99 c8 8a 68 d0 a6 a0 a9 76 87 17 6d ).. ...h....v..m 00:20:33.853 000000b0 d9 5e 75 be 68 e3 28 47 62 05 64 b0 9d 20 dc 17 .^u.h.(Gb.d.. .. 00:20:33.853 000000c0 51 c3 a3 30 da 07 bd 16 8f 0f 04 fd 26 e9 3c 7e Q..0........&.<~ 00:20:33.853 000000d0 f2 56 f5 93 02 99 a7 0f 28 91 a6 3f f9 70 80 d7 .V......(..?.p.. 00:20:33.853 000000e0 85 99 b4 a5 ae be e4 9c cd a2 64 a4 38 67 d1 15 ..........d.8g.. 00:20:33.853 000000f0 74 0f ba 04 7b ea 5b e5 65 a7 ae 83 f9 0c 13 60 t...{.[.e......` 00:20:33.853 00000100 16 47 18 fd 06 6f d0 d1 4b f0 8c 80 08 3c f2 77 .G...o..K....<.w 00:20:33.853 00000110 7e f3 d4 b9 d9 31 a5 a3 80 7b 89 13 63 6b f8 c8 ~....1...{..ck.. 00:20:33.853 00000120 95 22 de a7 30 4c 0d 85 ef 38 7e 1d cc 13 cf 1a ."..0L...8~..... 00:20:33.853 00000130 5a 0a fa bb 46 dc d1 e7 67 50 7d 32 b5 98 c1 92 Z...F...gP}2.... 00:20:33.853 00000140 c4 50 46 7a c5 ff 24 f9 19 0d 6c 93 28 ce b5 c3 .PFz..$...l.(... 00:20:33.853 00000150 ec 2f 5d 7d 31 1e 60 75 ed ac 63 5b f9 56 b1 9a ./]}1.`u..c[.V.. 00:20:33.853 00000160 50 67 fc ae ae 02 6a d6 a1 79 7a e3 af 34 63 f6 Pg....j..yz..4c. 00:20:33.853 00000170 f7 22 da dd dc c5 04 ea d4 f8 af 92 c9 d0 ac 37 .".............7 00:20:33.853 [2024-09-27 15:25:05.828922] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=2, seq=3428451707, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.853 [2024-09-27 15:25:05.834070] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.853 [2024-09-27 15:25:05.834111] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.853 [2024-09-27 15:25:05.834128] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.853 [2024-09-27 15:25:05.834147] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.853 [2024-09-27 15:25:05.834162] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.853 [2024-09-27 15:25:05.940284] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.853 [2024-09-27 15:25:05.940303] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.853 [2024-09-27 15:25:05.940310] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.854 [2024-09-27 15:25:05.940320] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.854 [2024-09-27 15:25:05.940388] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.854 ctrlr pubkey: 00:20:33.854 00000000 e7 df f9 23 16 fd a5 90 7d ca 3d 88 95 42 1d c9 ...#....}.=..B.. 00:20:33.854 00000010 c1 ec 5d 63 13 ff 48 03 0a 92 a3 a8 7c 99 52 a1 ..]c..H.....|.R. 
00:20:33.854 00000020 4c f2 03 66 2a 99 77 15 b2 f7 9f b9 a3 ad 55 cf L..f*.w.......U. 00:20:33.854 00000030 4f d1 ed 65 37 f4 87 51 a0 30 03 6a e1 b3 0d e7 O..e7..Q.0.j.... 00:20:33.854 00000040 39 08 a7 fe 61 8c d2 d4 4b 05 36 d8 2f 60 b8 79 9...a...K.6./`.y 00:20:33.854 00000050 ba bd ec 37 3d 97 7f 47 6b 56 58 e2 20 67 0f 0d ...7=..GkVX. g.. 00:20:33.854 00000060 4c de 8e a2 8e 91 f2 47 a2 96 ca e9 d5 8a db df L......G........ 00:20:33.854 00000070 a1 4f d6 08 df 44 d3 ff 4c 5f a6 18 6c 74 c7 4d .O...D..L_..lt.M 00:20:33.854 00000080 10 60 90 25 f9 19 57 17 8e b0 f0 9a 52 3c c0 f1 .`.%..W.....R<.. 00:20:33.854 00000090 95 e3 84 43 21 f5 e9 8a f0 5f 36 a3 28 5d 2c da ...C!...._6.(],. 00:20:33.854 000000a0 77 bf 20 33 8a 0e 3d 08 10 83 f4 a6 4f 37 ca 81 w. 3..=.....O7.. 00:20:33.854 000000b0 6f dd ca a7 a5 ac ab b7 15 95 21 52 71 92 15 5d o.........!Rq..] 00:20:33.854 000000c0 18 d2 d9 49 5d d0 3d af 8d ca a7 89 f9 24 bb 04 ...I].=......$.. 00:20:33.854 000000d0 e2 da 51 cc 90 f6 e6 a8 71 ed 9c 53 fb 55 a7 a9 ..Q.....q..S.U.. 00:20:33.854 000000e0 07 63 e5 7d 11 9e 4d 50 d0 96 b8 84 58 87 18 f2 .c.}..MP....X... 00:20:33.854 000000f0 ef cd 37 ec f8 f0 a8 c0 81 9e 41 90 f5 f8 5b 5c ..7.......A...[\ 00:20:33.854 00000100 3b ed 2a 77 9a 63 38 2d 47 a1 bd c4 a0 b3 5f 86 ;.*w.c8-G....._. 00:20:33.854 00000110 ea 0a 8b 5e 44 06 8d b2 10 f4 4d 28 e6 a3 5f fc ...^D.....M(.._. 00:20:33.854 00000120 1b c8 86 ca 95 ca 97 c5 c9 fd 73 6e 55 5f 93 8b ..........snU_.. 00:20:33.854 00000130 ae 5f f8 f4 ba e3 a1 6d 12 ee 95 97 93 4b 64 2a ._.....m.....Kd* 00:20:33.854 00000140 ae 1c 06 34 48 c4 2c c3 1b c3 a4 78 4e ad 99 c5 ...4H.,....xN... 00:20:33.854 00000150 68 80 0b 42 84 fc c5 d4 7c 65 a0 fe de 2e a8 b9 h..B....|e...... 00:20:33.854 00000160 9d f1 a2 68 87 d8 14 11 8a 88 2a 89 03 f0 3b 63 ...h......*...;c 00:20:33.854 00000170 14 ad 3b f0 ff 9c 63 77 cf 35 28 2b b9 d8 a0 a7 ..;...cw.5(+.... 00:20:33.854 host pubkey: 00:20:33.854 00000000 29 58 1a 5a 09 98 6f 4d b0 db a1 17 92 4c b0 60 )X.Z..oM.....L.` 00:20:33.854 00000010 cc 51 76 ed 0e 48 b7 74 e5 7d 9e ca 1d b0 94 b9 .Qv..H.t.}...... 00:20:33.854 00000020 82 56 7c f9 93 5c 41 b4 7a d9 0e 2d 22 d4 41 3d .V|..\A.z..-".A= 00:20:33.854 00000030 f1 3d bf 24 e3 0b f2 5d c3 5f fa 2a c2 a9 87 4b .=.$...]._.*...K 00:20:33.854 00000040 15 7d 37 59 66 98 7c 68 cb 98 97 ad 4a 11 a5 01 .}7Yf.|h....J... 00:20:33.854 00000050 29 ed 35 b4 b0 bf 2c 75 4d 2d b0 bf 65 3d e0 59 ).5...,uM-..e=.Y 00:20:33.854 00000060 c0 ee 28 a0 b7 87 d7 77 b3 f6 53 24 35 9a 21 18 ..(....w..S$5.!. 00:20:33.854 00000070 b4 f6 cc 2d 73 bd c6 84 5f b5 23 01 1e 32 c4 9c ...-s..._.#..2.. 00:20:33.854 00000080 92 a6 3f 4f b9 32 47 39 c2 f5 0e 56 a9 cc 9f 9f ..?O.2G9...V.... 00:20:33.854 00000090 ed a6 aa 60 73 e6 bb ec 03 73 6f d3 fa df 32 9d ...`s....so...2. 00:20:33.854 000000a0 86 ac 1c d2 c9 32 3a 34 0b cf 71 5d 5c 90 ce b7 .....2:4..q]\... 00:20:33.854 000000b0 38 a2 a4 61 e1 54 22 b3 55 3a 28 9e 2e cd 9c fa 8..a.T".U:(..... 00:20:33.854 000000c0 9d 55 c3 a2 cc 4b b7 42 06 5c b6 ff 43 49 2a 33 .U...K.B.\..CI*3 00:20:33.854 000000d0 d0 d3 2c 57 20 a5 eb 66 aa 71 69 e1 ce be 1c dc ..,W ..f.qi..... 00:20:33.854 000000e0 56 65 89 ce bd 7a a5 d6 0a c0 7e ba 64 cf a3 86 Ve...z....~.d... 00:20:33.854 000000f0 4b 9c 7f de 43 8a 7b 70 d9 8e c6 67 1d cd a2 80 K...C.{p...g.... 
00:20:33.854 00000100 45 5f f1 a9 c2 24 93 78 da a0 41 bd dc 4f 20 4b E_...$.x..A..O K 00:20:33.854 00000110 e1 2b 62 6f 98 42 47 63 79 e1 ee 62 85 28 31 69 .+bo.BGcy..b.(1i 00:20:33.854 00000120 8d b5 28 d2 72 72 78 fe 9c 80 23 f3 b8 26 31 f9 ..(.rrx...#..&1. 00:20:33.854 00000130 a1 18 ea 24 9d f9 fc 5f 53 e9 18 e6 97 52 f9 14 ...$..._S....R.. 00:20:33.854 00000140 58 f8 2e 5e f5 4c d0 62 d0 8f fc b6 7b 15 9d aa X..^.L.b....{... 00:20:33.854 00000150 99 a8 0f b9 6b b1 a3 16 c6 c3 09 cb d8 3e aa e3 ....k........>.. 00:20:33.854 00000160 06 9c 43 a6 91 e5 bb a9 53 73 62 10 88 95 2a 8f ..C.....Ssb...*. 00:20:33.854 00000170 fc d0 8a 2a 52 13 3b ac 49 1c 3a 84 01 59 5f a8 ...*R.;.I.:..Y_. 00:20:33.854 dh secret: 00:20:33.854 00000000 6f 49 6b f5 a5 55 91 0f 18 2e 10 be 41 85 a8 b5 oIk..U......A... 00:20:33.854 00000010 27 32 75 96 d0 08 da 75 0e f2 dc 5c 06 f4 be 62 '2u....u...\...b 00:20:33.854 00000020 95 c5 57 95 a3 c8 fa 57 58 ec 88 2d 1e bd 85 94 ..W....WX..-.... 00:20:33.854 00000030 b7 b4 c5 fd 6a 6f 4e 9b 1a 15 68 f8 73 1f 90 bd ....joN...h.s... 00:20:33.854 00000040 b6 40 23 78 17 6c cd 03 4e 1a 03 39 32 31 f5 12 .@#x.l..N..921.. 00:20:33.854 00000050 b5 05 4a 78 10 f5 d3 8c c9 43 81 78 fa 2f b6 95 ..Jx.....C.x./.. 00:20:33.854 00000060 eb 10 fe 89 b9 e2 05 e6 62 95 fb 37 16 0b 73 cb ........b..7..s. 00:20:33.854 00000070 9e 07 e1 01 ac 51 40 74 d7 b3 97 10 96 6e 53 9f .....Q@t.....nS. 00:20:33.854 00000080 ca 62 6e 35 bf 7a 93 98 31 b4 a2 be 7c c9 55 83 .bn5.z..1...|.U. 00:20:33.854 00000090 3c d4 39 a1 4a 82 89 4f ff 3e 7f e0 b6 e9 11 96 <.9.J..O.>...... 00:20:33.854 000000a0 39 77 0b 25 27 79 9e f3 02 68 22 2f 5a 7d 50 1e 9w.%'y...h"/Z}P. 00:20:33.854 000000b0 5b 09 70 c9 32 ee db 19 13 26 a5 6f b2 c8 d6 90 [.p.2....&.o.... 00:20:33.854 000000c0 57 c6 95 a6 5c f1 87 7c fc 24 a0 94 1b 6f bd 89 W...\..|.$...o.. 00:20:33.854 000000d0 40 33 fa f5 35 ba 4b fe e8 7c 3d f8 7b c1 28 f4 @3..5.K..|=.{.(. 00:20:33.854 000000e0 05 59 36 91 e7 96 ea 62 3e a6 b1 a8 7c 3b 97 33 .Y6....b>...|;.3 00:20:33.854 000000f0 2c b4 ca 48 70 38 e5 f5 74 d8 01 86 8a 3d 9a bd ,..Hp8..t....=.. 00:20:33.854 00000100 f3 aa b7 bc 0e 84 4e 38 1c a6 da 54 9d 75 1e a0 ......N8...T.u.. 00:20:33.854 00000110 e1 00 3a 6f c4 12 3b f6 f7 6c 5b 95 d6 f1 5f 39 ..:o..;..l[..._9 00:20:33.854 00000120 65 ac d6 20 d8 29 d4 51 1a db fd 37 80 f8 77 ec e.. .).Q...7..w. 00:20:33.854 00000130 01 77 94 0d 9e b3 d6 a6 32 67 ce 44 f1 4b 56 16 .w......2g.D.KV. 00:20:33.854 00000140 f8 4a 92 c0 b6 2a ef 9c 75 3e 58 62 98 d7 a0 bc .J...*..u>Xb.... 00:20:33.854 00000150 ac ce dc 5d 89 21 88 c5 c1 81 fe 7b 67 2e e1 39 ...].!.....{g..9 00:20:33.854 00000160 40 29 e1 76 dc d4 5c a7 a9 1f 29 01 36 c3 03 c9 @).v..\...).6... 00:20:33.854 00000170 63 f4 5f 1b 25 b0 94 7c a8 b4 88 38 e9 5c 51 18 c._.%..|...8.\Q. 
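The nvme_auth_send_reply lines carry the per-exchange parameters: key id, hash, dhgroup, sequence number, transaction id, subsystem and host NQNs, and response length. A purely illustrative Python parser for lines shaped like the ones in this log follows; the regex and field names are inferred from the log text, not from any SPDK interface.

    import re

    # Field layout taken from the nvme_auth_send_reply debug lines in this log.
    REPLY_RE = re.compile(
        r'key=(?P<key>\w+), hash=(?P<hash>\d+), dhgroup=(?P<dhgroup>\d+), '
        r'seq=(?P<seq>\d+), tid=(?P<tid>\d+), subnqn=(?P<subnqn>[^,]+), '
        r'hostnqn=(?P<hostnqn>[^,]+), len=(?P<len>\d+)'
    )

    def parse_reply(line):
        m = REPLY_RE.search(line)
        return m.groupdict() if m else None

    # Sample values copied from one of the reply lines above.
    sample = ('key=key1, hash=1, dhgroup=2, seq=3428451707, tid=0, '
              'subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32')
    print(parse_reply(sample))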
00:20:33.854 [2024-09-27 15:25:05.948376] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=1, dhgroup=2, seq=3428451708, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.854 [2024-09-27 15:25:05.948480] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.854 [2024-09-27 15:25:05.965319] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.854 [2024-09-27 15:25:05.965389] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.854 [2024-09-27 15:25:05.965400] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.854 [2024-09-27 15:25:05.965437] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.854 [2024-09-27 15:25:06.119504] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.854 [2024-09-27 15:25:06.119524] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.854 [2024-09-27 15:25:06.119531] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.854 [2024-09-27 15:25:06.119576] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.854 [2024-09-27 15:25:06.119598] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.854 ctrlr pubkey: 00:20:33.854 00000000 1d 9b 92 dc 0c 3e 0f c3 27 a3 2d 39 6c b6 3c a6 .....>..'.-9l.<. 00:20:33.854 00000010 4c bc 2b cb 6d e0 2e 0b 8d 5c 28 22 c7 0f 3f 0b L.+.m....\("..?. 00:20:33.854 00000020 5d 37 ab bd 30 09 ea 71 5c f0 64 8e b3 1d 45 bc ]7..0..q\.d...E. 00:20:33.854 00000030 ba 5f 74 d0 da 9d db 6b 30 b6 72 c1 5a 5a fc e0 ._t....k0.r.ZZ.. 00:20:33.854 00000040 00 b3 52 ed 9b 47 2b f7 e0 db b2 3f df 13 7c c3 ..R..G+....?..|. 00:20:33.854 00000050 79 3b 4a 2d f8 58 ce 7d b0 4d a9 f1 41 4d 58 26 y;J-.X.}.M..AMX& 00:20:33.854 00000060 69 d6 37 81 fc 03 47 79 93 f5 4c 3e 24 e8 84 27 i.7...Gy..L>$..' 00:20:33.854 00000070 e0 cb ca a4 25 7a aa 41 a5 cf 79 1a de 2a 1a 69 ....%z.A..y..*.i 00:20:33.854 00000080 a3 fb c3 7e b0 64 69 d4 7a 08 2d f5 b2 aa bd 3c ...~.di.z.-....< 00:20:33.854 00000090 77 34 fa fb de ee 8b 30 ad db f3 32 cc aa ef 3d w4.....0...2...= 00:20:33.854 000000a0 5f 67 81 3e e7 b0 33 8f 7a 02 1a a1 59 0f fc 6e _g.>..3.z...Y..n 00:20:33.854 000000b0 f5 b5 ce e6 6c f3 99 31 04 8c 83 6f 15 39 c2 3f ....l..1...o.9.? 
00:20:33.854 000000c0 52 9a 5f 45 c7 83 c0 eb 5a 55 b7 12 ba 47 b0 7e R._E....ZU...G.~ 00:20:33.854 000000d0 39 ee 7c d4 b0 71 73 13 d5 5f 3c ff b2 c7 c7 33 9.|..qs.._<....3 00:20:33.854 000000e0 66 bd ad e8 6e bd 5f 6e 53 54 73 6e c5 32 41 65 f...n._nSTsn.2Ae 00:20:33.854 000000f0 3e b4 ad f0 34 7c 6c e5 76 cc f7 c2 6b 2c 90 55 >...4|l.v...k,.U 00:20:33.854 00000100 13 e9 77 b8 d7 db 65 6d 34 f8 c1 ac 45 12 44 2f ..w...em4...E.D/ 00:20:33.854 00000110 bb 6b 3d 12 a9 d9 ef 82 96 08 8f 32 42 98 a4 9b .k=........2B... 00:20:33.854 00000120 b1 ca 57 bf bb ec 72 e6 38 b9 4c 5f 1b ff 62 fc ..W...r.8.L_..b. 00:20:33.854 00000130 b0 bf 27 85 76 be 74 e8 5d e9 38 a6 e3 cc ce 91 ..'.v.t.].8..... 00:20:33.854 00000140 5f 96 88 f8 26 b1 ae d3 f5 6a b9 c9 7b 46 88 a2 _...&....j..{F.. 00:20:33.854 00000150 71 a5 83 f6 b0 d7 60 df 27 d2 07 f8 86 06 de 30 q.....`.'......0 00:20:33.854 00000160 dd cf 25 4b 01 9f 5b b8 57 0b 5d a9 83 41 c6 d5 ..%K..[.W.]..A.. 00:20:33.854 00000170 79 71 ff 24 6b 45 1c 7a fd 4d e9 d4 db 16 17 ba yq.$kE.z.M...... 00:20:33.854 host pubkey: 00:20:33.854 00000000 64 27 d8 04 8f ed 0a 49 60 78 b8 2e fa f7 e0 90 d'.....I`x...... 00:20:33.854 00000010 69 e0 ae 26 20 9e f9 5a 53 6d 50 e5 76 94 15 b3 i..& ..ZSmP.v... 00:20:33.854 00000020 f9 1f 35 d5 bf df 53 46 88 6e 19 7a 49 dc 5e c1 ..5...SF.n.zI.^. 00:20:33.854 00000030 9e 55 91 82 76 ef 0a f4 f6 ad e8 c1 d1 35 03 4f .U..v........5.O 00:20:33.854 00000040 65 00 8e 96 14 e1 c3 ea 0f da fc d7 84 60 d0 62 e............`.b 00:20:33.854 00000050 9f 30 a6 dd bb 4e 91 f6 00 5b a2 69 43 49 53 ff .0...N...[.iCIS. 00:20:33.854 00000060 af 13 e7 a9 19 c7 88 a6 63 e3 99 b7 6d 20 42 15 ........c...m B. 00:20:33.854 00000070 e9 a8 72 93 eb 11 3c 0b a6 76 b5 86 3c 36 b3 6b ..r...<..v..<6.k 00:20:33.855 00000080 f5 da 71 d9 0f 5a 31 b9 a4 fd 51 34 fc 90 b5 14 ..q..Z1...Q4.... 00:20:33.855 00000090 0d 18 8e b5 6c 01 77 ca e0 ac 4a eb c1 57 59 65 ....l.w...J..WYe 00:20:33.855 000000a0 70 9c 6d 0b 01 c5 92 e2 9c 54 8b 4c 7a 9a 10 ee p.m......T.Lz... 00:20:33.855 000000b0 7a 96 d1 c6 9d c7 6b 3e d4 75 c5 2c 1b bc 4d 26 z.....k>.u.,..M& 00:20:33.855 000000c0 a6 aa df 29 c5 03 5f b0 4c 98 7c a3 65 9a e7 80 ...).._.L.|.e... 00:20:33.855 000000d0 40 84 86 23 b7 04 3f 0b 8e 8b 26 4d 91 d8 2f d7 @..#..?...&M../. 00:20:33.855 000000e0 a6 03 ea fe 6f af c1 86 2c ed e9 36 7e 47 18 75 ....o...,..6~G.u 00:20:33.855 000000f0 74 cf 9b b1 51 d2 4e f4 43 2a 35 0d ea 94 bc cc t...Q.N.C*5..... 00:20:33.855 00000100 86 9e a0 91 75 0a 57 fd 00 cf 52 9b 83 49 b5 de ....u.W...R..I.. 00:20:33.855 00000110 2a db 47 28 69 ab 2f 8f f6 01 2c df a0 c3 dd b8 *.G(i./...,..... 00:20:33.855 00000120 15 82 50 91 d0 8a 10 2a e2 96 5e 11 11 1f a8 fc ..P....*..^..... 00:20:33.855 00000130 7c 06 f9 6d 06 f5 34 ed f6 1e c0 dd 71 78 f3 75 |..m..4.....qx.u 00:20:33.855 00000140 f5 2f b1 0a 41 41 e0 bd 95 97 b3 dc 3b 4b 50 6e ./..AA......;KPn 00:20:33.855 00000150 cb c4 89 9f e4 2e 93 6b d0 09 df d7 67 f4 c2 e6 .......k....g... 00:20:33.855 00000160 b2 8b 29 b1 33 9c 25 83 34 d8 fb 72 d0 30 3b 8c ..).3.%.4..r.0;. 00:20:33.855 00000170 42 07 f5 52 f0 e2 f5 5f a0 4a 45 36 fa 0e f6 b9 B..R..._.JE6.... 00:20:33.855 dh secret: 00:20:33.855 00000000 0f e5 63 30 31 e0 ed e1 e3 81 24 10 5a ad df e5 ..c01.....$.Z... 00:20:33.855 00000010 45 c2 07 a2 5a 14 67 32 53 85 f1 81 38 e9 a4 a3 E...Z.g2S...8... 00:20:33.855 00000020 ed eb f9 6e da 7f 0b 48 1b 4b 4a f2 ff 5a 0a eb ...n...H.KJ..Z.. 00:20:33.855 00000030 cb 2e 4f ed f2 51 20 60 de e5 e2 78 77 97 9c f7 ..O..Q `...xw... 
00:20:33.855 00000040 82 3c 7f 71 b1 7b 6f e6 8d a9 30 52 18 56 d8 70 .<.q.{o...0R.V.p 00:20:33.855 00000050 78 c2 c1 85 33 ae 0f e7 62 a6 5e 84 2c c9 02 35 x...3...b.^.,..5 00:20:33.855 00000060 19 31 06 f4 a6 4c 54 d4 c5 66 3b 98 c1 aa b1 27 .1...LT..f;....' 00:20:33.855 00000070 39 5e e8 db 30 c9 2f 2f da 42 cc 64 5a 0e b7 eb 9^..0.//.B.dZ... 00:20:33.855 00000080 15 c1 ab a6 ce e2 fd a1 f8 61 c2 43 ec 0c c0 4a .........a.C...J 00:20:33.855 00000090 5b 94 73 22 c0 84 90 f3 32 6e 80 d1 09 6e ec e4 [.s"....2n...n.. 00:20:33.855 000000a0 b7 de 67 65 32 03 bf 7c f7 30 3c 26 a8 4c 99 cf ..ge2..|.0<&.L.. 00:20:33.855 000000b0 ca 37 47 ff 7f 75 b0 47 5f 33 df 0c 3e 57 ba 63 .7G..u.G_3..>W.c 00:20:33.855 000000c0 49 fd 3e 3e 9b 78 f3 07 2c 03 0c 33 7c 2a b2 05 I.>>.x..,..3|*.. 00:20:33.855 000000d0 93 04 3b 70 b8 5d 5b 07 7c bc 39 16 76 32 3b 80 ..;p.][.|.9.v2;. 00:20:33.855 000000e0 26 7b 24 76 d1 0b e7 04 88 9b 2c a8 a1 bd 23 28 &{$v......,...#( 00:20:33.855 000000f0 eb af 60 77 7f f9 9c 13 7a 37 af da 54 a9 9a 96 ..`w....z7..T... 00:20:33.855 00000100 56 56 d9 80 18 dc 27 6c fc 82 a6 eb aa f4 ae e9 VV....'l........ 00:20:33.855 00000110 f9 0f 63 35 e0 80 82 c7 ce 30 68 17 7e 30 72 8a ..c5.....0h.~0r. 00:20:33.855 00000120 7d 10 97 e3 f6 1b 69 e2 b5 bb 99 45 ed 3d 05 f3 }.....i....E.=.. 00:20:33.855 00000130 e2 bb c9 b2 df d4 1a a0 e2 57 0e 2d cc a6 30 14 .........W.-..0. 00:20:33.855 00000140 86 f8 a0 b6 bc 43 2e fe be 07 9c c5 fb 9b 48 ea .....C........H. 00:20:33.855 00000150 f1 b3 41 7f b2 56 66 ab f0 62 d0 97 75 a3 16 7d ..A..Vf..b..u..} 00:20:33.855 00000160 c4 15 a4 4d 62 9c 83 77 0e 9d 03 df e9 64 68 8c ...Mb..w.....dh. 00:20:33.855 00000170 99 2d 43 22 d4 69 1c 16 b6 d2 15 a5 10 c8 83 f1 .-C".i.......... 00:20:33.855 [2024-09-27 15:25:06.126857] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=1, dhgroup=2, seq=3428451709, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.855 [2024-09-27 15:25:06.132143] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.855 [2024-09-27 15:25:06.132181] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.855 [2024-09-27 15:25:06.132198] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.855 [2024-09-27 15:25:06.132214] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.855 [2024-09-27 15:25:06.132233] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.855 [2024-09-27 15:25:06.238491] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.855 [2024-09-27 15:25:06.238510] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.855 [2024-09-27 15:25:06.238518] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.855 [2024-09-27 15:25:06.238528] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] 
auth state: await-negotiate 00:20:33.855 [2024-09-27 15:25:06.238582] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.855 ctrlr pubkey: 00:20:33.855 00000000 1d 9b 92 dc 0c 3e 0f c3 27 a3 2d 39 6c b6 3c a6 .....>..'.-9l.<. 00:20:33.855 00000010 4c bc 2b cb 6d e0 2e 0b 8d 5c 28 22 c7 0f 3f 0b L.+.m....\("..?. 00:20:33.855 00000020 5d 37 ab bd 30 09 ea 71 5c f0 64 8e b3 1d 45 bc ]7..0..q\.d...E. 00:20:33.855 00000030 ba 5f 74 d0 da 9d db 6b 30 b6 72 c1 5a 5a fc e0 ._t....k0.r.ZZ.. 00:20:33.855 00000040 00 b3 52 ed 9b 47 2b f7 e0 db b2 3f df 13 7c c3 ..R..G+....?..|. 00:20:33.855 00000050 79 3b 4a 2d f8 58 ce 7d b0 4d a9 f1 41 4d 58 26 y;J-.X.}.M..AMX& 00:20:33.855 00000060 69 d6 37 81 fc 03 47 79 93 f5 4c 3e 24 e8 84 27 i.7...Gy..L>$..' 00:20:33.855 00000070 e0 cb ca a4 25 7a aa 41 a5 cf 79 1a de 2a 1a 69 ....%z.A..y..*.i 00:20:33.855 00000080 a3 fb c3 7e b0 64 69 d4 7a 08 2d f5 b2 aa bd 3c ...~.di.z.-....< 00:20:33.855 00000090 77 34 fa fb de ee 8b 30 ad db f3 32 cc aa ef 3d w4.....0...2...= 00:20:33.855 000000a0 5f 67 81 3e e7 b0 33 8f 7a 02 1a a1 59 0f fc 6e _g.>..3.z...Y..n 00:20:33.855 000000b0 f5 b5 ce e6 6c f3 99 31 04 8c 83 6f 15 39 c2 3f ....l..1...o.9.? 00:20:33.855 000000c0 52 9a 5f 45 c7 83 c0 eb 5a 55 b7 12 ba 47 b0 7e R._E....ZU...G.~ 00:20:33.855 000000d0 39 ee 7c d4 b0 71 73 13 d5 5f 3c ff b2 c7 c7 33 9.|..qs.._<....3 00:20:33.855 000000e0 66 bd ad e8 6e bd 5f 6e 53 54 73 6e c5 32 41 65 f...n._nSTsn.2Ae 00:20:33.855 000000f0 3e b4 ad f0 34 7c 6c e5 76 cc f7 c2 6b 2c 90 55 >...4|l.v...k,.U 00:20:33.855 00000100 13 e9 77 b8 d7 db 65 6d 34 f8 c1 ac 45 12 44 2f ..w...em4...E.D/ 00:20:33.855 00000110 bb 6b 3d 12 a9 d9 ef 82 96 08 8f 32 42 98 a4 9b .k=........2B... 00:20:33.855 00000120 b1 ca 57 bf bb ec 72 e6 38 b9 4c 5f 1b ff 62 fc ..W...r.8.L_..b. 00:20:33.855 00000130 b0 bf 27 85 76 be 74 e8 5d e9 38 a6 e3 cc ce 91 ..'.v.t.].8..... 00:20:33.855 00000140 5f 96 88 f8 26 b1 ae d3 f5 6a b9 c9 7b 46 88 a2 _...&....j..{F.. 00:20:33.855 00000150 71 a5 83 f6 b0 d7 60 df 27 d2 07 f8 86 06 de 30 q.....`.'......0 00:20:33.855 00000160 dd cf 25 4b 01 9f 5b b8 57 0b 5d a9 83 41 c6 d5 ..%K..[.W.]..A.. 00:20:33.855 00000170 79 71 ff 24 6b 45 1c 7a fd 4d e9 d4 db 16 17 ba yq.$kE.z.M...... 00:20:33.855 host pubkey: 00:20:33.855 00000000 36 c3 9c 62 82 2c c5 7d 36 f3 b3 58 58 ea 7a 2c 6..b.,.}6..XX.z, 00:20:33.855 00000010 e2 8e d9 7e 11 33 9b f3 dd 35 bb 07 6c 84 da 61 ...~.3...5..l..a 00:20:33.855 00000020 37 d6 93 1b 06 39 4e 16 a5 da e2 14 17 e8 05 21 7....9N........! 00:20:33.855 00000030 1f 14 56 a7 f0 70 66 cd 07 66 12 58 96 71 8b 67 ..V..pf..f.X.q.g 00:20:33.855 00000040 7e d6 c8 ca 93 cc be 4a 72 70 6b 63 73 81 f0 63 ~......Jrpkcs..c 00:20:33.855 00000050 83 b9 05 8e 28 43 30 6b 7b 7f fa 10 f2 1c c3 eb ....(C0k{....... 00:20:33.855 00000060 cf 4c 75 0b f3 7d ff b3 71 aa 10 f2 7b 85 cd 64 .Lu..}..q...{..d 00:20:33.855 00000070 0a c2 8b f4 ba db fb 62 ff 7f 1f e2 67 3f ff 84 .......b....g?.. 00:20:33.855 00000080 1e 84 8b 68 b5 72 e4 f4 5c 21 fa d8 06 2e 91 b5 ...h.r..\!...... 00:20:33.855 00000090 53 83 08 6e cf d9 98 a8 63 b1 f5 aa 7b 7f 52 01 S..n....c...{.R. 00:20:33.855 000000a0 ef fc 4e e3 e9 c7 cb 93 7c b3 7e 42 a3 50 d3 74 ..N.....|.~B.P.t 00:20:33.855 000000b0 e2 63 c1 7e 78 4b cb c5 9a 4b ce 85 99 b8 1c aa .c.~xK...K...... 00:20:33.855 000000c0 f5 c1 fc 45 87 84 03 73 17 eb de 6c 4a 4d 16 d8 ...E...s...lJM.. 
00:20:33.855 000000d0 d8 5e a7 9c 56 ee 93 e0 8e b4 b7 e3 de 4d 6a 61 .^..V........Mja 00:20:33.855 000000e0 b7 79 a5 e7 0c 83 fa 27 bc ff 37 2a cd f6 d4 a7 .y.....'..7*.... 00:20:33.855 000000f0 64 1a ab be 98 63 7b 6c c9 63 c0 b8 ae a1 63 cb d....c{l.c....c. 00:20:33.855 00000100 7a 02 b4 d0 f4 5c 5d 30 db a8 02 15 1e 92 e4 cd z....\]0........ 00:20:33.855 00000110 33 9f 12 de da b0 59 5d 70 cc 4a f1 75 9c 6f cc 3.....Y]p.J.u.o. 00:20:33.855 00000120 eb f4 2d 26 3d 2e 00 67 ba bf cd 00 a1 eb 67 7f ..-&=..g......g. 00:20:33.855 00000130 e5 58 40 81 0a d9 db 78 c0 99 f2 d7 43 8f bf 41 .X@....x....C..A 00:20:33.855 00000140 e1 8e bf 94 a7 91 3c b9 e0 2e 5a 4f f1 92 e2 d5 ......<...ZO.... 00:20:33.855 00000150 fc c6 cb 01 6d 3e 18 9d 32 0b eb 18 1e 2f f6 c2 ....m>..2..../.. 00:20:33.855 00000160 c0 da 7e 04 8d 43 1e 6a 14 a3 5b 37 e8 f7 a3 ce ..~..C.j..[7.... 00:20:33.855 00000170 87 30 1e 1e 81 e4 05 ce 62 5c 09 f7 61 c5 c0 86 .0......b\..a... 00:20:33.855 dh secret: 00:20:33.855 00000000 e3 ea 78 21 33 1e 9f c3 22 88 a7 79 4f 0f 48 3c ..x!3..."..yO.H< 00:20:33.855 00000010 35 ec b5 33 ce 8a 7b 76 4c 1a 77 10 29 d7 fd e8 5..3..{vL.w.)... 00:20:33.855 00000020 2d fe e6 1f ad 59 96 0e 40 cb 68 00 54 1b 78 43 -....Y..@.h.T.xC 00:20:33.855 00000030 f9 2d ff 96 41 64 ee 46 c2 ac 7b 58 14 f0 ab af .-..Ad.F..{X.... 00:20:33.855 00000040 6c 97 10 94 54 e0 ae 40 96 a3 a1 98 91 9f f1 6e l...T..@.......n 00:20:33.855 00000050 5a c9 aa 7f 1f 93 3f 10 b0 f2 9c b0 8c 05 71 22 Z.....?.......q" 00:20:33.855 00000060 79 ed 13 34 b5 9b e6 47 bb 78 5f e7 cb ba a9 f8 y..4...G.x_..... 00:20:33.855 00000070 4f 7c aa 35 a6 c9 21 29 98 84 b9 d8 6a 92 23 bb O|.5..!)....j.#. 00:20:33.855 00000080 17 e6 37 31 ca b2 42 df 0e 3d 87 a9 95 58 07 06 ..71..B..=...X.. 00:20:33.855 00000090 40 64 cb 63 8b 09 88 6f 4a 9d b7 12 b5 72 40 23 @d.c...oJ....r@# 00:20:33.855 000000a0 02 b8 09 5a 54 65 94 af 3a 7d e4 3d db 03 f7 86 ...ZTe..:}.=.... 00:20:33.855 000000b0 9c 02 1c 97 95 a8 21 ce 34 b6 06 31 33 fd 59 cc ......!.4..13.Y. 00:20:33.855 000000c0 93 e7 4c 06 22 c9 17 ca 35 d2 ad 3d 3d b4 25 61 ..L."...5..==.%a 00:20:33.855 000000d0 f0 48 d6 81 59 49 dc 77 14 86 ff 8e 2f f2 f4 ba .H..YI.w..../... 00:20:33.855 000000e0 aa 02 4e 15 fb 8f 25 1c a0 49 49 bb 2e 83 53 4f ..N...%..II...SO 00:20:33.855 000000f0 a0 55 4f 80 18 6b 9e a5 c4 8e 6a 98 e1 58 3e 05 .UO..k....j..X>. 00:20:33.855 00000100 c1 05 a8 8c d2 b0 ed 2c 2b a1 6d f2 9e 0d 8d 29 .......,+.m....) 00:20:33.855 00000110 a0 db 00 5a a3 87 23 a9 ae 97 69 00 93 99 6d 07 ...Z..#...i...m. 00:20:33.855 00000120 56 b6 db 3e 87 88 23 40 e6 51 11 73 2b 2b bb 25 V..>..#@.Q.s++.% 00:20:33.855 00000130 5f fa b7 39 54 62 52 d6 95 06 d7 10 d7 4c 79 1a _..9TbR......Ly. 00:20:33.855 00000140 0b 03 8b 38 2a b0 d7 a8 83 9b 85 30 6e bd 03 4b ...8*......0n..K 00:20:33.855 00000150 40 e1 d3 85 87 67 26 c0 d1 cf 74 ca e0 69 2f 7a @....g&...t..i/z 00:20:33.856 00000160 6f 76 17 75 11 c1 ef 4a 98 29 a4 aa ad 77 4f ce ov.u...J.)...wO. 00:20:33.856 00000170 e5 d6 a8 6e 37 ae 73 5e 70 94 4a 90 e5 e1 9b af ...n7.s^p.J..... 
00:20:33.856 [2024-09-27 15:25:06.245911] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=1, dhgroup=2, seq=3428451710, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.856 [2024-09-27 15:25:06.246010] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.856 [2024-09-27 15:25:06.263726] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.856 [2024-09-27 15:25:06.263796] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.856 [2024-09-27 15:25:06.263806] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.856 [2024-09-27 15:25:06.263845] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.856 [2024-09-27 15:25:06.419479] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.856 [2024-09-27 15:25:06.419500] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.856 [2024-09-27 15:25:06.419507] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.856 [2024-09-27 15:25:06.419549] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.856 [2024-09-27 15:25:06.419571] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.856 ctrlr pubkey: 00:20:33.856 00000000 c7 60 c4 51 ea 80 1b aa a2 5c 67 ed 17 19 02 90 .`.Q.....\g..... 00:20:33.856 00000010 4d 1f 08 3c 90 d8 b9 52 e2 d9 d1 6e 1e b5 e6 d4 M..<...R...n.... 00:20:33.856 00000020 22 06 a8 b7 8b fd b3 29 4a e5 5b c1 62 05 fd cf "......)J.[.b... 00:20:33.856 00000030 d3 1e 7d 0c 78 b0 d2 9a 50 cf 4c d2 6f 5b b4 81 ..}.x...P.L.o[.. 00:20:33.856 00000040 1e b8 1d b7 c7 5f a3 2e 8e 38 63 98 d8 8b c3 b3 ....._...8c..... 00:20:33.856 00000050 19 ee 5d 1f 5f 62 21 ed 55 69 3e 0e b3 88 75 a3 ..]._b!.Ui>...u. 00:20:33.856 00000060 c0 03 7a 8c 76 13 af 5a 3c 26 a1 45 a8 73 0a b1 ..z.v..Z<&.E.s.. 00:20:33.856 00000070 b5 88 60 21 4f 54 e4 84 e4 d9 fb 35 d8 1e 5b c4 ..`!OT.....5..[. 00:20:33.856 00000080 ea 97 38 df 30 ee 17 ce 0c d6 f4 6a 6c 84 f6 12 ..8.0......jl... 00:20:33.856 00000090 0a 08 b5 75 7e 87 03 a1 3f 89 69 09 cf 0f 76 78 ...u~...?.i...vx 00:20:33.856 000000a0 85 7b 40 29 2f d9 bc 89 d5 f3 db 49 e9 3c ae 85 .{@)/......I.<.. 00:20:33.856 000000b0 92 8f 60 c5 8f 41 16 0e 7e ce b6 fc 47 21 70 41 ..`..A..~...G!pA 00:20:33.856 000000c0 f9 29 15 be c3 6b 73 d9 27 75 d1 c3 4e f1 63 ff .)...ks.'u..N.c. 00:20:33.856 000000d0 e2 35 52 78 2c 46 54 c7 cc 82 b4 e3 8b 6c 41 c1 .5Rx,FT......lA. 
00:20:33.856 000000e0 f2 7f a8 4d 1d 78 b7 28 55 3e 8c 75 f6 cb de 2c ...M.x.(U>.u..., 00:20:33.856 000000f0 7a 40 b5 a9 73 08 68 b5 7c e8 e2 6f a7 8b 2f 4b z@..s.h.|..o../K 00:20:33.856 00000100 77 38 b8 57 ee a9 37 97 22 f0 60 3b 60 ea a7 7e w8.W..7.".`;`..~ 00:20:33.856 00000110 6e 36 ca 38 c6 59 04 5e 90 d5 d9 bb 41 d1 ff 9e n6.8.Y.^....A... 00:20:33.856 00000120 94 ed bd b2 05 bc 40 9b 78 59 e3 a7 16 8d c1 88 ......@.xY...... 00:20:33.856 00000130 b0 9b 2b be 3f 0e 5b 70 73 de d1 54 11 bb 5b 42 ..+.?.[ps..T..[B 00:20:33.856 00000140 78 7b ec b6 fe 64 09 d1 c3 e3 bc d4 ba bb 0e 9c x{...d.......... 00:20:33.856 00000150 84 24 0b 81 e0 3d 80 80 5c 59 c1 35 f7 0d cc c8 .$...=..\Y.5.... 00:20:33.856 00000160 87 3e 8b 87 0d 55 74 f6 69 51 c8 04 b3 c7 59 b2 .>...Ut.iQ....Y. 00:20:33.856 00000170 47 42 fa 20 cb 80 3c 35 ab a4 11 f2 8b 0e c3 d8 GB. ..<5........ 00:20:33.856 host pubkey: 00:20:33.856 00000000 da 02 e4 3a e2 10 c2 20 e5 de 4e b5 f8 f5 7b be ...:... ..N...{. 00:20:33.856 00000010 14 21 62 ac 6b 82 f2 3b e9 6d 58 a5 12 9d ff d0 .!b.k..;.mX..... 00:20:33.856 00000020 fd d9 18 b1 5c 99 91 7e f8 52 12 8c d6 2d 8d f1 ....\..~.R...-.. 00:20:33.856 00000030 a9 ed e2 7a 07 3f 78 9d 57 31 d7 7e c0 0c 15 93 ...z.?x.W1.~.... 00:20:33.856 00000040 7a 7a a4 b8 85 23 05 51 5a a2 2c ac 3d f2 1a d1 zz...#.QZ.,.=... 00:20:33.856 00000050 39 68 b6 c2 94 48 0c f2 18 e9 52 e5 ac f6 ea d5 9h...H....R..... 00:20:33.856 00000060 ac 45 30 79 f3 ee a4 1d c3 1d 80 7a df ee fb 9a .E0y.......z.... 00:20:33.856 00000070 10 b2 b9 b1 7c a6 fc 32 ec eb b2 9e f0 e0 1c fc ....|..2........ 00:20:33.856 00000080 5d 7f 00 21 01 ee b9 b9 0d b8 32 4a 24 b3 71 90 ]..!......2J$.q. 00:20:33.856 00000090 1f 7a d6 4e ef 2a 59 a6 ec af fe 3d d1 77 72 8b .z.N.*Y....=.wr. 00:20:33.856 000000a0 66 8a 68 0d c1 7d 32 03 65 e3 ad 18 7c c5 cd 2a f.h..}2.e...|..* 00:20:33.856 000000b0 d8 e5 9b bf 78 81 41 0f 5b bc 01 64 45 bd 07 79 ....x.A.[..dE..y 00:20:33.856 000000c0 79 e7 1b d1 91 93 52 b4 c9 98 5c 19 13 03 38 40 y.....R...\...8@ 00:20:33.856 000000d0 11 49 52 14 3b 5f 0c 70 95 5f 54 8e 0b a9 f5 d4 .IR.;_.p._T..... 00:20:33.856 000000e0 7f bf 8f 2a 61 7c 1a 3f 87 82 fa 94 e1 c7 59 b5 ...*a|.?......Y. 00:20:33.856 000000f0 e8 53 59 c7 48 a1 be f8 e5 d7 3f 93 14 07 29 56 .SY.H.....?...)V 00:20:33.856 00000100 f3 87 b3 8d c2 99 36 0b d9 79 02 df 17 1c f2 3d ......6..y.....= 00:20:33.856 00000110 b7 44 3e 95 03 37 75 48 a8 01 f5 02 13 c6 14 04 .D>..7uH........ 00:20:33.856 00000120 d9 fe 0e b8 5b ce 9a a3 7d f7 87 a0 97 27 89 98 ....[...}....'.. 00:20:33.856 00000130 d0 8e 77 8e e7 b0 c0 a1 34 7c ad 4d 9c dc aa 0a ..w.....4|.M.... 00:20:33.856 00000140 75 fd 97 1d 24 ef fa 60 3e c5 23 83 67 65 5f 62 u...$..`>.#.ge_b 00:20:33.856 00000150 80 fb 4a 91 60 5c 10 bc fe 28 18 04 70 eb a3 68 ..J.`\...(..p..h 00:20:33.856 00000160 90 8d 93 85 b5 61 b7 ab 48 2a b9 dc aa 90 d3 79 .....a..H*.....y 00:20:33.856 00000170 de 42 9c 9f 8d 46 79 f9 df 1a 46 ce e2 9d 7c 33 .B...Fy...F...|3 00:20:33.856 dh secret: 00:20:33.856 00000000 19 fe 20 44 b4 a8 ff 8e a9 61 71 b6 03 db 52 0d .. D.....aq...R. 00:20:33.856 00000010 ef 97 5c e2 91 a2 28 a7 dd 71 b0 c6 43 1d e6 19 ..\...(..q..C... 00:20:33.856 00000020 9f 71 b7 cd 9b 74 bc 54 03 01 e1 cf 25 2a b6 dc .q...t.T....%*.. 00:20:33.856 00000030 34 8a 5c 64 42 05 dd b0 67 0f d2 27 22 d4 19 93 4.\dB...g..'"... 00:20:33.856 00000040 a9 cc 97 a1 55 6f 2a c9 36 9f bc 1a c7 8c 3d cd ....Uo*.6.....=. 00:20:33.856 00000050 d0 db 7a 93 06 0d 5b d2 23 98 f6 c1 25 20 4c a5 ..z...[.#...% L. 
00:20:33.856 00000060 78 ee f7 8f c5 7c a2 8f 7d c1 50 c0 df ed 55 06 x....|..}.P...U. 00:20:33.856 00000070 ce 3a 91 58 b3 d3 41 51 a6 e7 c9 f8 86 d4 fb 55 .:.X..AQ.......U 00:20:33.856 00000080 f5 28 69 c1 5e ae 05 c2 a6 61 28 38 00 14 b4 aa .(i.^....a(8.... 00:20:33.856 00000090 61 b3 9a b7 9e 7a d7 cc 83 61 d3 55 99 b0 25 aa a....z...a.U..%. 00:20:33.856 000000a0 ba b6 cd 99 d3 6e 72 ff 67 c0 bc 45 60 6b 6d 4f .....nr.g..E`kmO 00:20:33.856 000000b0 6d 45 dc 4d 1a 3d 66 fd 1c 03 91 88 6d 07 b0 01 mE.M.=f.....m... 00:20:33.856 000000c0 df 5a dd 3e 06 57 3b 92 60 5e f1 a2 92 f4 88 0b .Z.>.W;.`^...... 00:20:33.856 000000d0 65 38 b5 0b ef d8 98 f1 27 40 a1 4e 2a 5a 23 8e e8......'@.N*Z#. 00:20:33.856 000000e0 4d 80 07 78 39 70 b1 b5 d6 a1 ee b4 71 f5 ef 93 M..x9p......q... 00:20:33.856 000000f0 46 4c 33 a5 dc e7 1f a9 d6 b4 04 12 66 bf 2b d0 FL3.........f.+. 00:20:33.856 00000100 c7 5e 81 15 fa a2 d6 1c e4 41 22 e2 4d 4e f5 dd .^.......A".MN.. 00:20:33.856 00000110 4f 65 3b b8 be 45 e3 40 68 ac 05 85 96 64 56 f8 Oe;..E.@h....dV. 00:20:33.856 00000120 ef 7b f8 2a 1b a6 a3 37 1d bf e7 86 62 dc 0d 12 .{.*...7....b... 00:20:33.856 00000130 62 3a b8 66 64 8e a4 0a 2f 6a be e3 5f fe 81 63 b:.fd.../j.._..c 00:20:33.856 00000140 0f 74 98 76 07 d1 cc 6b 7e ab 53 a1 c2 cb 0d 4d .t.v...k~.S....M 00:20:33.856 00000150 df d2 e8 a1 50 7a 16 ce 20 4e 29 ee d1 cd 70 ae ....Pz.. N)...p. 00:20:33.856 00000160 5f b5 5d cf 4b 64 43 17 a6 26 a5 69 a7 97 c7 2d _.].KdC..&.i...- 00:20:33.856 00000170 dd 24 44 2e f3 68 37 f8 5e 19 b9 c5 72 5b 5b ac .$D..h7.^...r[[. 00:20:33.856 [2024-09-27 15:25:06.426811] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=1, dhgroup=2, seq=3428451711, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.856 [2024-09-27 15:25:06.431921] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.856 [2024-09-27 15:25:06.431958] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.856 [2024-09-27 15:25:06.431974] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.856 [2024-09-27 15:25:06.431994] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.856 [2024-09-27 15:25:06.432008] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.856 [2024-09-27 15:25:06.538307] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.856 [2024-09-27 15:25:06.538325] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.856 [2024-09-27 15:25:06.538332] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.856 [2024-09-27 15:25:06.538347] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.856 [2024-09-27 15:25:06.538408] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] 
auth state: await-challenge 00:20:33.856 ctrlr pubkey: 00:20:33.856 00000000 c7 60 c4 51 ea 80 1b aa a2 5c 67 ed 17 19 02 90 .`.Q.....\g..... 00:20:33.856 00000010 4d 1f 08 3c 90 d8 b9 52 e2 d9 d1 6e 1e b5 e6 d4 M..<...R...n.... 00:20:33.856 00000020 22 06 a8 b7 8b fd b3 29 4a e5 5b c1 62 05 fd cf "......)J.[.b... 00:20:33.856 00000030 d3 1e 7d 0c 78 b0 d2 9a 50 cf 4c d2 6f 5b b4 81 ..}.x...P.L.o[.. 00:20:33.856 00000040 1e b8 1d b7 c7 5f a3 2e 8e 38 63 98 d8 8b c3 b3 ....._...8c..... 00:20:33.856 00000050 19 ee 5d 1f 5f 62 21 ed 55 69 3e 0e b3 88 75 a3 ..]._b!.Ui>...u. 00:20:33.856 00000060 c0 03 7a 8c 76 13 af 5a 3c 26 a1 45 a8 73 0a b1 ..z.v..Z<&.E.s.. 00:20:33.856 00000070 b5 88 60 21 4f 54 e4 84 e4 d9 fb 35 d8 1e 5b c4 ..`!OT.....5..[. 00:20:33.856 00000080 ea 97 38 df 30 ee 17 ce 0c d6 f4 6a 6c 84 f6 12 ..8.0......jl... 00:20:33.856 00000090 0a 08 b5 75 7e 87 03 a1 3f 89 69 09 cf 0f 76 78 ...u~...?.i...vx 00:20:33.856 000000a0 85 7b 40 29 2f d9 bc 89 d5 f3 db 49 e9 3c ae 85 .{@)/......I.<.. 00:20:33.856 000000b0 92 8f 60 c5 8f 41 16 0e 7e ce b6 fc 47 21 70 41 ..`..A..~...G!pA 00:20:33.856 000000c0 f9 29 15 be c3 6b 73 d9 27 75 d1 c3 4e f1 63 ff .)...ks.'u..N.c. 00:20:33.856 000000d0 e2 35 52 78 2c 46 54 c7 cc 82 b4 e3 8b 6c 41 c1 .5Rx,FT......lA. 00:20:33.857 000000e0 f2 7f a8 4d 1d 78 b7 28 55 3e 8c 75 f6 cb de 2c ...M.x.(U>.u..., 00:20:33.857 000000f0 7a 40 b5 a9 73 08 68 b5 7c e8 e2 6f a7 8b 2f 4b z@..s.h.|..o../K 00:20:33.857 00000100 77 38 b8 57 ee a9 37 97 22 f0 60 3b 60 ea a7 7e w8.W..7.".`;`..~ 00:20:33.857 00000110 6e 36 ca 38 c6 59 04 5e 90 d5 d9 bb 41 d1 ff 9e n6.8.Y.^....A... 00:20:33.857 00000120 94 ed bd b2 05 bc 40 9b 78 59 e3 a7 16 8d c1 88 ......@.xY...... 00:20:33.857 00000130 b0 9b 2b be 3f 0e 5b 70 73 de d1 54 11 bb 5b 42 ..+.?.[ps..T..[B 00:20:33.857 00000140 78 7b ec b6 fe 64 09 d1 c3 e3 bc d4 ba bb 0e 9c x{...d.......... 00:20:33.857 00000150 84 24 0b 81 e0 3d 80 80 5c 59 c1 35 f7 0d cc c8 .$...=..\Y.5.... 00:20:33.857 00000160 87 3e 8b 87 0d 55 74 f6 69 51 c8 04 b3 c7 59 b2 .>...Ut.iQ....Y. 00:20:33.857 00000170 47 42 fa 20 cb 80 3c 35 ab a4 11 f2 8b 0e c3 d8 GB. ..<5........ 00:20:33.857 host pubkey: 00:20:33.857 00000000 84 6b d2 c5 34 da c7 cb cd bf cb f8 f8 14 b7 cf .k..4........... 00:20:33.857 00000010 a5 5d 36 ac a6 e3 d6 5b fa 57 b9 82 e0 14 a5 e5 .]6....[.W...... 00:20:33.857 00000020 00 7d 91 4f 41 ba a9 0e cb 51 27 da 14 06 49 ea .}.OA....Q'...I. 00:20:33.857 00000030 ec 40 0a b4 fe a4 fe ef 55 6b f5 98 40 36 6d ce .@......Uk..@6m. 00:20:33.857 00000040 a7 0e de 83 eb 77 eb 59 d8 ee e8 3c 30 17 f5 37 .....w.Y...<0..7 00:20:33.857 00000050 f8 de 51 58 b6 bc 9d fe ef d1 41 2e 87 da c4 00 ..QX......A..... 00:20:33.857 00000060 11 8b d3 d2 13 07 5b c5 9d 5b 5f 60 b4 38 c0 b2 ......[..[_`.8.. 00:20:33.857 00000070 a4 c4 16 8d 9f 7a 1b 91 bb 1e 43 24 da 24 d1 2a .....z....C$.$.* 00:20:33.857 00000080 ea 04 00 9e 1d 87 a9 c9 04 c8 f0 fa 7a 0d 06 76 ............z..v 00:20:33.857 00000090 40 69 92 83 a0 48 be 98 00 f9 c3 6b e6 7c c8 93 @i...H.....k.|.. 00:20:33.857 000000a0 5f 19 a7 99 ad b8 f9 10 55 08 39 de 01 f6 be eb _.......U.9..... 00:20:33.857 000000b0 a9 19 0c 1d 17 69 32 61 0d 4e 05 bc 1f 52 1f 1a .....i2a.N...R.. 00:20:33.857 000000c0 1a d1 7b f2 77 87 6f b8 5e 30 7c 61 6e 6c 10 28 ..{.w.o.^0|anl.( 00:20:33.857 000000d0 54 88 f6 4b d2 b1 82 eb 38 fe 85 39 df 25 99 c0 T..K....8..9.%.. 00:20:33.857 000000e0 42 02 97 f4 86 01 0b 8a 94 8f 6a 63 23 98 02 b4 B.........jc#... 
00:20:33.857 000000f0 80 29 f6 52 18 a5 9b a1 e7 cc 6b 32 f7 dd ff 98 .).R......k2.... 00:20:33.857 00000100 17 2b 80 90 c2 df 3d 1d aa df ad 1f be 60 87 f8 .+....=......`.. 00:20:33.857 00000110 9f 18 83 e7 1d ac e7 ca a5 b0 e0 55 1c dd 3a 0c ...........U..:. 00:20:33.857 00000120 7c 8d df 54 46 7b 43 71 d9 8a b4 92 cb 90 56 8c |..TF{Cq......V. 00:20:33.857 00000130 9f 43 6a 42 74 fe af 45 26 25 9e a1 93 10 63 96 .CjBt..E&%....c. 00:20:33.857 00000140 66 24 40 5e 7b 83 c0 8e 58 91 9f 75 7d ce 7b 08 f$@^{...X..u}.{. 00:20:33.857 00000150 f5 19 ad a7 d2 33 0d f2 b3 83 5e 64 12 df b8 ad .....3....^d.... 00:20:33.857 00000160 3d 2a d9 68 5c 96 94 5e ec 30 65 ef d3 2f 87 03 =*.h\..^.0e../.. 00:20:33.857 00000170 70 24 97 c9 8b fb 5c a8 08 65 4f 33 10 74 d3 39 p$....\..eO3.t.9 00:20:33.857 dh secret: 00:20:33.857 00000000 65 71 d3 8f bd e6 ee ab 3e db b4 e5 cb 1f e4 df eq......>....... 00:20:33.857 00000010 a4 e5 2b 83 ff f0 61 61 d8 5b e8 ce f7 ae bc 65 ..+...aa.[.....e 00:20:33.857 00000020 f4 58 5c 46 d1 0a 86 55 ba 59 4a 08 75 29 9e ae .X\F...U.YJ.u).. 00:20:33.857 00000030 53 e2 91 50 92 b9 24 38 fd 8e dc 97 67 d1 35 d8 S..P..$8....g.5. 00:20:33.857 00000040 65 d2 b7 21 65 e3 ce ea a2 56 3d 56 54 40 4b 0b e..!e....V=VT@K. 00:20:33.857 00000050 86 ed c9 0d 08 0e b0 93 34 01 c4 28 d3 ef a8 c7 ........4..(.... 00:20:33.857 00000060 8e aa f7 69 0b a6 0f 68 be b4 f5 54 77 4f e3 e4 ...i...h...TwO.. 00:20:33.857 00000070 c3 03 25 86 42 47 7a c8 23 52 5f 0e 1c d5 b7 23 ..%.BGz.#R_....# 00:20:33.857 00000080 ae e1 53 fa b7 d0 7f 0b c1 c9 1a 92 85 33 3c dd ..S..........3<. 00:20:33.857 00000090 66 e0 e4 ca 47 6c b6 55 2a 1c 2f 81 4c b7 41 c2 f...Gl.U*./.L.A. 00:20:33.857 000000a0 09 ac 42 f3 74 43 1e 05 fe ff 78 98 03 08 42 ff ..B.tC....x...B. 00:20:33.857 000000b0 4b 9b cc 11 33 34 d4 59 a0 9a b7 d9 ea 38 d4 f5 K...34.Y.....8.. 00:20:33.857 000000c0 e5 d2 8a 93 49 3e da 65 e3 9a 91 3a 60 12 9f 58 ....I>.e...:`..X 00:20:33.857 000000d0 fc 77 07 2e b6 42 af 85 6c b7 40 9e a6 49 1e 18 .w...B..l.@..I.. 00:20:33.857 000000e0 5b 63 38 a3 3f 3a 7d cc a7 d6 79 3c 9a b5 fa 28 [c8.?:}...y<...( 00:20:33.857 000000f0 0c 8f ee ca 70 87 04 3c fd ab 7b a3 25 93 3b 0c ....p..<..{.%.;. 00:20:33.857 00000100 c6 15 34 97 0d 54 de a3 6e 74 60 86 a5 17 21 a6 ..4..T..nt`...!. 00:20:33.857 00000110 e5 ba 34 7e 42 8c 5d 25 19 7a 18 7d 14 2b 71 d3 ..4~B.]%.z.}.+q. 00:20:33.857 00000120 2b 7e a1 52 ca 95 85 59 20 dc b5 e5 e9 ad 55 aa +~.R...Y .....U. 00:20:33.857 00000130 05 53 2d c4 19 42 70 d8 5b d7 9b 2f ae d2 d3 d5 .S-..Bp.[../.... 00:20:33.857 00000140 ce 9a 60 88 4a 22 49 07 8a 77 1a a8 58 37 9e f2 ..`.J"I..w..X7.. 00:20:33.857 00000150 f7 bd e4 ec 40 0d 4f 23 d2 a9 ef 4c 6e 44 de 08 ....@.O#...LnD.. 00:20:33.857 00000160 e9 0c be 39 76 9b 2a 82 56 26 81 69 ff be 28 e7 ...9v.*.V&.i..(. 00:20:33.857 00000170 87 02 c3 df c5 7d 5c 06 bd 46 18 1f 72 77 a3 a4 .....}\..F..rw.. 
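[editor's note] The repeated *DEBUG* lines from nvme_auth.c trace the host-side authentication state machine per qpair: negotiate, await-negotiate, await-challenge, then nvme_auth_send_reply followed by await-reply, await-success1, await-success2 (skipped in some rounds in this log), and finally done, at which point nvme_fabric_qpair_authenticate_poll reports "authentication completed successfully". The sketch below only restates that observed ordering as a checkable list; it is not SPDK's implementation.

```python
# Illustrative only: the per-qpair auth-state ordering observed in the DEBUG
# lines of this log (state names copied verbatim). Not SPDK's state machine.
AUTH_STATE_ORDER = [
    "negotiate",
    "await-negotiate",
    "await-challenge",
    "await-reply",      # entered right after nvme_auth_send_reply
    "await-success1",
    "await-success2",   # some rounds in this log go straight to "done"
    "done",
]

def is_monotonic(states):
    """True if a qpair's logged states only move forward through the list."""
    indices = [AUTH_STATE_ORDER.index(s) for s in states]
    return all(a <= b for a, b in zip(indices, indices[1:]))

# e.g. one qid:0 round logged above:
assert is_monotonic(["negotiate", "await-negotiate", "await-challenge",
                     "await-reply", "await-success1", "await-success2", "done"])
```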
00:20:33.857 [2024-09-27 15:25:06.545820] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=1, dhgroup=2, seq=3428451712, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.857 [2024-09-27 15:25:06.545913] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.857 [2024-09-27 15:25:06.561733] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.857 [2024-09-27 15:25:06.561784] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.857 [2024-09-27 15:25:06.561794] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.857 [2024-09-27 15:25:06.561829] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.857 [2024-09-27 15:25:06.715348] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.857 [2024-09-27 15:25:06.715372] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.857 [2024-09-27 15:25:06.715379] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.857 [2024-09-27 15:25:06.715430] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.857 [2024-09-27 15:25:06.715454] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.857 ctrlr pubkey: 00:20:33.857 00000000 4e 3a 91 bc 08 d8 29 61 7b c8 14 d0 da bd 02 ac N:....)a{....... 00:20:33.857 00000010 7b 25 76 6a c6 42 9d 13 2b 18 98 6f 73 8d ee 81 {%vj.B..+..os... 00:20:33.857 00000020 ad 89 cd 21 41 29 d5 5b 4a b5 bf 4f c2 a0 89 91 ...!A).[J..O.... 00:20:33.857 00000030 9e d8 cf 00 84 7c 26 cc cb 10 23 46 93 9e 35 66 .....|&...#F..5f 00:20:33.857 00000040 da 5b b2 0f e0 a1 72 b4 be 78 9f e4 46 47 97 28 .[....r..x..FG.( 00:20:33.857 00000050 48 97 0c 06 cc 2d cb b5 64 e3 b3 1b 72 06 98 76 H....-..d...r..v 00:20:33.857 00000060 f3 dd 3d 96 41 f4 f9 da 10 bd 50 02 99 2e 87 af ..=.A.....P..... 00:20:33.857 00000070 f6 9b 4a 5d 02 dd 13 da e4 ee a2 08 3f e0 94 a7 ..J]........?... 00:20:33.857 00000080 67 69 3e b0 be ce 0f 8d 27 2f 68 a6 80 96 a7 19 gi>.....'/h..... 00:20:33.857 00000090 1d 8a b6 0c 81 43 df 2e a4 32 2b 12 3d 6d da 09 .....C...2+.=m.. 00:20:33.857 000000a0 2f b0 0e fa d9 63 d9 98 e1 9a ac 31 21 30 ee 96 /....c.....1!0.. 00:20:33.857 000000b0 03 93 eb 5d 70 39 74 60 0c f7 ca 72 25 a2 2c 26 ...]p9t`...r%.,& 00:20:33.857 000000c0 80 27 d7 5f 91 51 c3 d7 1b 7d 89 2c 5d 8c 81 d7 .'._.Q...}.,]... 00:20:33.857 000000d0 ba 68 c3 e0 eb 14 e8 5f eb e2 3e c2 b6 58 55 c0 .h....._..>..XU. 00:20:33.857 000000e0 5f 7d 8c e2 3c bb 38 31 f8 a0 04 f9 41 05 9d 26 _}..<.81....A..& 00:20:33.857 000000f0 71 f9 d0 0d 4d 3c 8b 24 83 5a 1a f8 56 6b 91 17 q...M<.$.Z..Vk.. 
00:20:33.857 00000100 ea af 4e 29 78 c7 fc 98 e1 aa 24 80 b7 29 0b 26 ..N)x.....$..).& 00:20:33.857 00000110 d7 fd 0e 7e 67 6a 65 e8 42 14 3b d9 58 b3 23 8e ...~gje.B.;.X.#. 00:20:33.857 00000120 0a 5f 72 61 ba 1b 10 36 46 08 3d ae cb df e5 0f ._ra...6F.=..... 00:20:33.857 00000130 67 86 fc 4a e6 52 14 91 d6 dd 4f c6 99 69 00 36 g..J.R....O..i.6 00:20:33.857 00000140 16 d7 74 d8 01 f7 8f 16 43 86 cb d9 1f 4b bc f1 ..t.....C....K.. 00:20:33.857 00000150 2a a2 76 94 87 78 a3 d0 dd fc 81 ca 45 75 c9 4d *.v..x......Eu.M 00:20:33.857 00000160 38 0c 97 10 f4 d3 39 94 95 4b f6 4e 76 f9 bc 35 8.....9..K.Nv..5 00:20:33.857 00000170 79 87 02 75 3f 81 6d 03 e7 ae fe a5 81 99 98 1e y..u?.m......... 00:20:33.857 host pubkey: 00:20:33.857 00000000 c6 62 f4 75 c2 b9 d9 a9 57 dd b8 2f 15 be cc 56 .b.u....W../...V 00:20:33.857 00000010 38 54 a3 b2 cc 8d fe 86 7a c0 cc 23 dd 3a 2a 46 8T......z..#.:*F 00:20:33.857 00000020 fa 32 99 1f 7a 08 0b 91 80 ba d2 74 12 e9 ab 76 .2..z......t...v 00:20:33.857 00000030 c4 68 c3 83 d4 ca ec 00 33 24 d4 2a 54 13 ed b9 .h......3$.*T... 00:20:33.857 00000040 93 9e 74 87 58 87 9f 00 c8 c8 88 99 a9 b2 37 83 ..t.X.........7. 00:20:33.857 00000050 ff 56 4f 67 3c bb 6f 93 61 fa 3e d8 6a 42 ca 2e .VOg<.o.a.>.jB.. 00:20:33.857 00000060 73 fc c9 a8 d2 88 31 77 d9 67 f7 e5 75 f1 89 01 s.....1w.g..u... 00:20:33.857 00000070 1b 46 0b c3 c7 af 0d c8 2e f2 18 fa 26 f5 cd 32 .F..........&..2 00:20:33.857 00000080 98 7e a2 ca 17 f7 b2 34 d5 16 c9 b3 08 79 1a 9c .~.....4.....y.. 00:20:33.857 00000090 7a 1c 20 cb ed 99 e7 49 97 fd a6 18 90 51 b0 3a z. ....I.....Q.: 00:20:33.857 000000a0 3a d8 f2 bd 38 fc 85 ce 2f 68 44 43 c4 01 a6 d3 :...8.../hDC.... 00:20:33.857 000000b0 c2 2b 05 11 27 27 30 4c 5a 56 ba 79 f3 c1 24 11 .+..''0LZV.y..$. 00:20:33.857 000000c0 97 62 30 4b 1b 63 a9 27 1c 4d 4d 9c 3c ee e5 e4 .b0K.c.'.MM.<... 00:20:33.857 000000d0 40 fd 24 f6 4b af 30 cf 5b c7 4f e6 40 67 94 fa @.$.K.0.[.O.@g.. 00:20:33.857 000000e0 78 f0 68 ba f8 12 97 c5 35 ae a7 b2 95 75 a6 27 x.h.....5....u.' 00:20:33.857 000000f0 51 92 b3 df ca 81 7c 34 c0 71 8a fb 95 12 35 0e Q.....|4.q....5. 00:20:33.857 00000100 de 7a fd 4b a3 4f 0d 73 6e 33 3f 1d bc 1d 80 1e .z.K.O.sn3?..... 00:20:33.857 00000110 b0 ec 5c e5 c2 83 f3 5a 94 35 b1 3f 84 6f 7c 5a ..\....Z.5.?.o|Z 00:20:33.857 00000120 c9 10 fe 3a 30 17 c5 67 17 21 07 31 1b f0 b9 31 ...:0..g.!.1...1 00:20:33.857 00000130 31 e2 76 81 55 15 c8 3e f6 ad 2c 90 51 7b 6f bb 1.v.U..>..,.Q{o. 00:20:33.857 00000140 e7 45 a3 24 e1 bc 85 68 c5 e5 82 05 3f 81 e0 f7 .E.$...h....?... 00:20:33.857 00000150 8f 0f f2 2c 1c ac f4 e4 05 b0 07 05 46 ae b2 63 ...,........F..c 00:20:33.857 00000160 fc 9a 39 da d8 1e fa d2 2f e0 c3 51 58 ac 8d 75 ..9...../..QX..u 00:20:33.857 00000170 ef 13 00 5a 54 05 66 3f 8d c0 28 ca fa f6 99 b8 ...ZT.f?..(..... 00:20:33.857 dh secret: 00:20:33.858 00000000 08 c2 af 77 c3 67 3c 27 bd 3d 74 9b ee 56 db 93 ...w.g<'.=t..V.. 00:20:33.858 00000010 cd 0d 65 31 e8 c7 5c 3d 01 99 4e 94 d4 dc 96 6b ..e1..\=..N....k 00:20:33.858 00000020 fb 59 d8 01 8c 1d 4a cc 85 bb 4f 02 a4 d1 d9 f0 .Y....J...O..... 00:20:33.858 00000030 15 83 3c 0d 8b f1 35 5c 04 89 6d 6e 20 90 92 eb ..<...5\..mn ... 00:20:33.858 00000040 32 e3 f1 c3 3d 61 fe 78 7b d5 bf ec 91 6d bf 5b 2...=a.x{....m.[ 00:20:33.858 00000050 92 6c b9 c1 2e b1 f3 0d 55 10 b9 57 70 15 37 a2 .l......U..Wp.7. 00:20:33.858 00000060 2b 45 37 87 3a 19 78 b2 e3 7b 51 88 68 e2 3f 16 +E7.:.x..{Q.h.?. 00:20:33.858 00000070 be 25 34 44 10 8e 91 83 09 a6 9b ea 21 95 8a 08 .%4D........!... 
00:20:33.858 00000080 1a 25 f4 db ef 0f c3 67 80 34 da fe 5f 5b e8 3d .%.....g.4.._[.= 00:20:33.858 00000090 c7 fc 2a 6a 68 7c b9 38 f3 eb c2 82 00 15 77 5b ..*jh|.8......w[ 00:20:33.858 000000a0 a3 bd ca aa 0e ab 4a 41 28 21 2c 3e 65 5b 26 b0 ......JA(!,>e[&. 00:20:33.858 000000b0 f9 a9 f9 45 72 5b b0 55 36 85 bd 40 d3 f4 92 bf ...Er[.U6..@.... 00:20:33.858 000000c0 7e 39 fc ee 7b d2 15 9d 78 5f 71 11 39 a8 f7 92 ~9..{...x_q.9... 00:20:33.858 000000d0 07 7a ab fe ca 75 c9 61 cf c2 ef a8 11 77 dd 54 .z...u.a.....w.T 00:20:33.858 000000e0 c2 d1 5b 18 fd c9 33 82 83 6a 71 59 fd 28 a1 bf ..[...3..jqY.(.. 00:20:33.858 000000f0 04 91 b3 e0 82 b6 4b 18 63 b2 3b ce 4d 0e d8 8e ......K.c.;.M... 00:20:33.858 00000100 f0 55 fa e0 1f d2 3b b6 1e 5c 3b e9 9e f2 41 c8 .U....;..\;...A. 00:20:33.858 00000110 10 fe b4 1b 6e 2f ae b8 24 73 d9 f9 69 c9 1d 16 ....n/..$s..i... 00:20:33.858 00000120 e5 ae c5 95 62 7e 10 c1 6b 31 ff bf bc 8e a7 70 ....b~..k1.....p 00:20:33.858 00000130 4c 6d 58 37 77 fb e9 ed 7e 87 81 a6 bb 0d ff 7a LmX7w...~......z 00:20:33.858 00000140 b6 4e c2 1f 13 7a 05 47 51 bb e7 10 cf 40 43 f3 .N...z.GQ....@C. 00:20:33.858 00000150 1c e2 f7 c3 e7 65 65 c9 7a 87 ce 99 a7 e1 f1 7b .....ee.z......{ 00:20:33.858 00000160 47 93 26 68 4c 9a 88 4f 7b 3d 03 f1 35 03 b1 93 G.&hL..O{=..5... 00:20:33.858 00000170 8d 24 91 6b b6 54 78 50 ca 60 d7 7a 57 c9 7e fa .$.k.TxP.`.zW.~. 00:20:33.858 [2024-09-27 15:25:06.722826] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=1, dhgroup=2, seq=3428451713, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.858 [2024-09-27 15:25:06.728053] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.858 [2024-09-27 15:25:06.728082] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.858 [2024-09-27 15:25:06.728098] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.858 [2024-09-27 15:25:06.728104] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.858 [2024-09-27 15:25:06.834267] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.858 [2024-09-27 15:25:06.834288] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.858 [2024-09-27 15:25:06.834296] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.858 [2024-09-27 15:25:06.834307] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.858 [2024-09-27 15:25:06.834370] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.858 ctrlr pubkey: 00:20:33.858 00000000 4e 3a 91 bc 08 d8 29 61 7b c8 14 d0 da bd 02 ac N:....)a{....... 00:20:33.858 00000010 7b 25 76 6a c6 42 9d 13 2b 18 98 6f 73 8d ee 81 {%vj.B..+..os... 00:20:33.858 00000020 ad 89 cd 21 41 29 d5 5b 4a b5 bf 4f c2 a0 89 91 ...!A).[J..O.... 
00:20:33.858 00000030 9e d8 cf 00 84 7c 26 cc cb 10 23 46 93 9e 35 66 .....|&...#F..5f 00:20:33.858 00000040 da 5b b2 0f e0 a1 72 b4 be 78 9f e4 46 47 97 28 .[....r..x..FG.( 00:20:33.858 00000050 48 97 0c 06 cc 2d cb b5 64 e3 b3 1b 72 06 98 76 H....-..d...r..v 00:20:33.858 00000060 f3 dd 3d 96 41 f4 f9 da 10 bd 50 02 99 2e 87 af ..=.A.....P..... 00:20:33.858 00000070 f6 9b 4a 5d 02 dd 13 da e4 ee a2 08 3f e0 94 a7 ..J]........?... 00:20:33.858 00000080 67 69 3e b0 be ce 0f 8d 27 2f 68 a6 80 96 a7 19 gi>.....'/h..... 00:20:33.858 00000090 1d 8a b6 0c 81 43 df 2e a4 32 2b 12 3d 6d da 09 .....C...2+.=m.. 00:20:33.858 000000a0 2f b0 0e fa d9 63 d9 98 e1 9a ac 31 21 30 ee 96 /....c.....1!0.. 00:20:33.858 000000b0 03 93 eb 5d 70 39 74 60 0c f7 ca 72 25 a2 2c 26 ...]p9t`...r%.,& 00:20:33.858 000000c0 80 27 d7 5f 91 51 c3 d7 1b 7d 89 2c 5d 8c 81 d7 .'._.Q...}.,]... 00:20:33.858 000000d0 ba 68 c3 e0 eb 14 e8 5f eb e2 3e c2 b6 58 55 c0 .h....._..>..XU. 00:20:33.858 000000e0 5f 7d 8c e2 3c bb 38 31 f8 a0 04 f9 41 05 9d 26 _}..<.81....A..& 00:20:33.858 000000f0 71 f9 d0 0d 4d 3c 8b 24 83 5a 1a f8 56 6b 91 17 q...M<.$.Z..Vk.. 00:20:33.858 00000100 ea af 4e 29 78 c7 fc 98 e1 aa 24 80 b7 29 0b 26 ..N)x.....$..).& 00:20:33.858 00000110 d7 fd 0e 7e 67 6a 65 e8 42 14 3b d9 58 b3 23 8e ...~gje.B.;.X.#. 00:20:33.858 00000120 0a 5f 72 61 ba 1b 10 36 46 08 3d ae cb df e5 0f ._ra...6F.=..... 00:20:33.858 00000130 67 86 fc 4a e6 52 14 91 d6 dd 4f c6 99 69 00 36 g..J.R....O..i.6 00:20:33.858 00000140 16 d7 74 d8 01 f7 8f 16 43 86 cb d9 1f 4b bc f1 ..t.....C....K.. 00:20:33.858 00000150 2a a2 76 94 87 78 a3 d0 dd fc 81 ca 45 75 c9 4d *.v..x......Eu.M 00:20:33.858 00000160 38 0c 97 10 f4 d3 39 94 95 4b f6 4e 76 f9 bc 35 8.....9..K.Nv..5 00:20:33.858 00000170 79 87 02 75 3f 81 6d 03 e7 ae fe a5 81 99 98 1e y..u?.m......... 00:20:33.858 host pubkey: 00:20:33.858 00000000 94 6a f4 84 15 b1 42 a2 c6 27 9a 16 07 94 a0 02 .j....B..'...... 00:20:33.858 00000010 9d d6 37 12 eb 39 bc 03 11 a7 c0 47 fb 51 86 0b ..7..9.....G.Q.. 00:20:33.858 00000020 0d bd 6b 13 4c 04 84 fa 05 fc 2b 0f 43 2b e9 e0 ..k.L.....+.C+.. 00:20:33.858 00000030 f1 c2 13 e3 4c 4b 1c c5 16 ed b0 d8 99 9f fd f7 ....LK.......... 00:20:33.858 00000040 8b 1f b2 81 83 c5 d8 b4 2d 09 c5 09 6d ea ee 34 ........-...m..4 00:20:33.858 00000050 35 21 52 9a 0d 8c db 8b 7a 98 f3 a2 85 32 e0 fa 5!R.....z....2.. 00:20:33.858 00000060 42 0a 02 7f ca 9a 1d 70 02 34 2c 26 02 4e c8 d5 B......p.4,&.N.. 00:20:33.858 00000070 29 5d 82 51 88 7b 2b 20 b9 b2 72 23 c3 68 7e 67 )].Q.{+ ..r#.h~g 00:20:33.858 00000080 d7 c8 1c 2e 7f 4d 7d e3 1b 3c 53 5c 8c eb 9b 11 .....M}..%.7S.V..... 00:20:33.858 00000040 d4 e7 67 72 50 15 c6 d9 8f fa 07 74 7a 73 63 57 ..grP......tzscW 00:20:33.858 00000050 5e ab 38 ce a3 84 02 2c 38 a0 69 aa 1f f4 1e 42 ^.8....,8.i....B 00:20:33.858 00000060 19 dd e4 7c cb f2 f2 e3 f1 67 92 88 8c 0e b4 64 ...|.....g.....d 00:20:33.858 00000070 ce 3e 1a 78 09 11 fb 86 e9 c0 41 d4 7c b1 c9 fc .>.x......A.|... 00:20:33.858 00000080 56 25 77 93 14 12 8d b6 1f b6 42 c4 c4 45 e0 64 V%w.......B..E.d 00:20:33.858 00000090 86 d1 6e a8 13 b8 03 e0 81 ea 17 c2 1a 7d b2 0e ..n..........}.. 00:20:33.858 000000a0 ed a3 2b af be dd 02 dc 1c 2d 9e 76 8f 6b 18 47 ..+......-.v.k.G 00:20:33.858 000000b0 21 d5 7c 74 19 97 75 d5 65 05 6b 49 e9 50 c1 84 !.|t..u.e.kI.P.. 00:20:33.858 000000c0 fa 55 b7 63 2a f4 80 ec 82 e0 0d b1 96 a0 6f 15 .U.c*.........o. 
00:20:33.859 000000d0 8d d5 ad f6 0d d3 9c 12 2d ef f4 b9 f6 2a 62 57 ........-....*bW 00:20:33.859 000000e0 3b 3c 16 40 ec 90 78 e7 31 e4 41 c9 93 59 e3 63 ;<.@..x.1.A..Y.c 00:20:33.859 000000f0 ca e4 2e c7 25 30 87 e3 d8 4a b7 cf 63 d4 d5 20 ....%0...J..c.. 00:20:33.859 00000100 56 d4 16 08 a9 e5 8e 41 bd 2c 92 9d e2 ba 96 31 V......A.,.....1 00:20:33.859 00000110 d6 d7 46 12 22 87 dc 52 d6 2e f5 95 9d 0e 84 4a ..F."..R.......J 00:20:33.859 00000120 2a c8 36 fb a6 83 cf 89 cc af dc f1 a8 e0 f3 2e *.6............. 00:20:33.859 00000130 78 6e 7f e0 33 e5 3a 71 4d 4f a7 d4 c4 84 fa 94 xn..3.:qMO...... 00:20:33.859 00000140 36 fc 0b 37 81 91 84 39 d2 0d 4d 2e 5e f5 f9 49 6..7...9..M.^..I 00:20:33.859 00000150 7d 46 e1 80 ad 80 cf c8 7e 4c 97 66 e2 ae 83 31 }F......~L.f...1 00:20:33.859 00000160 ce 0d 80 62 6f fd fb a9 b1 c9 81 3e b5 8b 20 fa ...bo......>.. . 00:20:33.859 00000170 8b 7e 86 8b e2 29 af 1e d2 98 b5 c3 06 b6 91 45 .~...).........E 00:20:33.859 [2024-09-27 15:25:06.842003] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=1, dhgroup=2, seq=3428451714, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.859 [2024-09-27 15:25:06.842079] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.859 [2024-09-27 15:25:06.859846] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.859 [2024-09-27 15:25:06.859876] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.859 [2024-09-27 15:25:06.859883] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.859 [2024-09-27 15:25:07.025919] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.859 [2024-09-27 15:25:07.025943] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.859 [2024-09-27 15:25:07.025951] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.859 [2024-09-27 15:25:07.025999] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.859 [2024-09-27 15:25:07.026024] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.859 ctrlr pubkey: 00:20:33.859 00000000 c2 8d 76 86 f3 6d 50 76 96 17 26 4f 95 df 34 bd ..v..mPv..&O..4. 00:20:33.859 00000010 ec de 6d 5a 66 59 29 c7 ad 40 62 99 50 cb e1 97 ..mZfY)..@b.P... 00:20:33.859 00000020 71 f1 35 e3 50 dc a4 05 5e 9d 1c 45 7f 8b fd 5f q.5.P...^..E..._ 00:20:33.859 00000030 ae 73 86 60 31 ea 8c bf 24 82 ba ab e0 58 c8 73 .s.`1...$....X.s 00:20:33.859 00000040 f2 89 94 a7 00 00 f7 27 14 a8 a3 a0 75 e8 cd b2 .......'....u... 00:20:33.859 00000050 3a 12 b6 d8 76 42 7d ad b5 2b 25 0b 4a 00 67 02 :...vB}..+%.J.g. 00:20:33.859 00000060 22 e3 16 b9 b9 85 23 19 3d 16 9a bc fd 7d 72 f8 ".....#.=....}r. 00:20:33.859 00000070 6c b3 a3 6e c7 af 7c 72 57 1c 0c 1a d3 c1 2a b3 l..n..|rW.....*. 
00:20:33.859 00000080 54 78 c1 d1 48 18 54 12 7b 1a f2 27 22 ad 96 e0 Tx..H.T.{..'"... 00:20:33.859 00000090 29 5d 4c 64 70 83 45 cc 2b 93 95 1d 25 90 fd 90 )]Ldp.E.+...%... 00:20:33.859 000000a0 23 85 1d 3c b7 e2 0e 5a f3 8c 76 82 37 9a b1 73 #..<...Z..v.7..s 00:20:33.859 000000b0 39 20 ff 73 57 f3 52 cb 98 b1 92 11 2d 05 41 2e 9 .sW.R.....-.A. 00:20:33.859 000000c0 1f fa dd 05 09 d3 2b 83 8a 9b ef f4 36 99 5e 99 ......+.....6.^. 00:20:33.859 000000d0 94 ad 10 95 e2 f7 c7 aa 0f af 96 16 f3 96 f6 4b ...............K 00:20:33.859 000000e0 a6 4f e2 9f 8b af f7 d7 d2 d3 3d d7 1e 03 69 66 .O........=...if 00:20:33.859 000000f0 b6 53 69 26 60 a3 70 fd 50 1e 4c 3a e5 e0 a7 f8 .Si&`.p.P.L:.... 00:20:33.859 00000100 34 20 e1 0c 36 49 dd 7f 27 46 b2 84 52 cc 4d fd 4 ..6I..'F..R.M. 00:20:33.859 00000110 9a dd a9 f7 1c 72 47 f6 9d 96 7d 68 fb e8 fb 30 .....rG...}h...0 00:20:33.859 00000120 e1 c3 41 c9 7e 45 5d 23 7d 31 90 25 11 ce b6 db ..A.~E]#}1.%.... 00:20:33.859 00000130 47 c3 46 70 43 c4 df 03 d6 09 eb 41 f9 57 91 7b G.FpC......A.W.{ 00:20:33.859 00000140 4d ce 8d a3 92 ff fb 96 fd 38 89 98 6f 1c 80 7d M........8..o..} 00:20:33.859 00000150 ce d2 e9 8d ac c4 80 41 c7 50 83 e0 01 b5 fc 08 .......A.P...... 00:20:33.859 00000160 84 ba da 67 0e 94 30 55 89 ba 32 b8 67 f0 83 46 ...g..0U..2.g..F 00:20:33.859 00000170 f5 45 91 dd 94 b1 2b 69 31 78 4a 61 4b a3 c0 eb .E....+i1xJaK... 00:20:33.859 00000180 c1 2f b1 72 be 64 5c c4 90 b8 39 42 50 05 e1 83 ./.r.d\...9BP... 00:20:33.859 00000190 20 81 86 a1 9a b2 34 e4 b0 36 15 d8 c2 8e b5 4c .....4..6.....L 00:20:33.859 000001a0 d2 a2 a4 b3 33 f3 9e 6f 34 92 5a 22 5e 67 65 5b ....3..o4.Z"^ge[ 00:20:33.859 000001b0 b7 fb 7e d9 dc 42 d6 b5 14 e9 1d 67 43 b0 b5 96 ..~..B.....gC... 00:20:33.859 000001c0 a3 b7 d7 78 bf 43 7c 94 49 d3 07 0d 71 ae c1 14 ...x.C|.I...q... 00:20:33.859 000001d0 1e d8 45 d1 3e fd ce b9 0b 22 74 97 48 ae 13 de ..E.>...."t.H... 00:20:33.859 000001e0 d8 f6 bd 17 46 7f 2a f9 cd 93 08 71 16 1f 02 12 ....F.*....q.... 00:20:33.859 000001f0 e0 5b 4e d3 0b 67 ce 30 65 83 7a 10 10 79 46 55 .[N..g.0e.z..yFU 00:20:33.859 host pubkey: 00:20:33.859 00000000 84 ff c7 1b ca fd 15 54 60 b2 ab 8f 54 4c 03 ed .......T`...TL.. 00:20:33.859 00000010 32 d6 13 4d f8 92 c3 04 90 44 82 2f a2 94 f9 9f 2..M.....D./.... 00:20:33.859 00000020 84 d1 7b d6 d8 0d 68 75 83 df 1b d9 28 59 b4 f0 ..{...hu....(Y.. 00:20:33.859 00000030 0b 8e ed c0 05 d3 cf 64 1c 44 91 4d cf 0d d7 6c .......d.D.M...l 00:20:33.859 00000040 d6 d2 2e 0a 21 51 f6 20 cb 9f 6b f5 1f f1 a3 36 ....!Q. ..k....6 00:20:33.859 00000050 a5 eb 25 dd e1 98 a0 b9 d4 9f 08 33 51 fe 95 10 ..%........3Q... 00:20:33.859 00000060 bf 22 80 14 74 90 43 55 2a 6e 6b 26 b8 03 00 3b ."..t.CU*nk&...; 00:20:33.859 00000070 b4 2e 5c d3 ae 0c 44 48 5f 6a 1f 95 78 9c 28 96 ..\...DH_j..x.(. 00:20:33.859 00000080 33 52 d3 1f 4a df b2 2c da 61 c5 49 74 d3 b0 d1 3R..J..,.a.It... 00:20:33.859 00000090 b8 97 f5 09 da 9c 26 4b 68 f4 39 7d b9 b7 09 20 ......&Kh.9}... 00:20:33.859 000000a0 2b 2f 6a 1a d4 44 5a 4e 00 8e 5d c2 48 35 5d 0a +/j..DZN..].H5]. 00:20:33.859 000000b0 95 ed d2 82 36 66 66 f5 fb 22 92 86 40 ba 02 73 ....6ff.."..@..s 00:20:33.859 000000c0 44 4a 06 37 68 78 25 f8 e2 66 09 8d 08 ab da ae DJ.7hx%..f...... 00:20:33.859 000000d0 c6 2f 77 cf ed 46 18 6c 3e a4 af f9 72 f0 67 b4 ./w..F.l>...r.g. 00:20:33.859 000000e0 eb e9 aa 38 a4 93 52 ae b3 06 58 12 55 ca f3 8b ...8..R...X.U... 00:20:33.859 000000f0 8e 4f 50 b6 c3 af bf 19 a6 c6 75 33 42 75 bd e0 .OP.......u3Bu.. 
00:20:33.859 00000100 61 aa df 08 73 81 b5 68 68 1d 86 23 a6 a3 a5 53 a...s..hh..#...S 00:20:33.859 00000110 2c c1 e7 4e 41 ca 22 f3 24 ad f3 40 eb ef 72 01 ,..NA.".$..@..r. 00:20:33.859 00000120 bd 3a ed 45 7e c7 93 92 e5 ce 4a 08 e1 d8 45 74 .:.E~.....J...Et 00:20:33.859 00000130 cb 4d b0 f0 8d 88 f3 59 be 42 33 66 f0 72 50 56 .M.....Y.B3f.rPV 00:20:33.859 00000140 92 13 66 ee 6c a1 26 19 e8 7f 88 9f 91 91 12 da ..f.l.&......... 00:20:33.859 00000150 0f f4 da 47 10 85 2c fd 01 43 8a 89 75 eb b6 eb ...G..,..C..u... 00:20:33.859 00000160 d8 45 5a aa 88 54 4f 29 f7 d2 81 6c 07 a6 7e 05 .EZ..TO)...l..~. 00:20:33.859 00000170 86 b0 d9 f2 bb b4 22 8f 51 37 c5 56 20 0a dc 96 ......".Q7.V ... 00:20:33.859 00000180 5c 76 b1 19 4e e3 3c a7 6d f6 e4 bd 8f 4b 47 7a \v..N.<.m....KGz 00:20:33.859 00000190 f4 74 fc bf 0d d1 ad ca 55 e0 31 83 10 dd 24 87 .t......U.1...$. 00:20:33.859 000001a0 87 21 37 9b 4e 51 11 fc 3c 70 80 61 f0 69 87 bf .!7.NQ....."{...y 00:20:33.859 00000010 97 fd f7 e6 40 63 50 d4 e6 0b cc 85 f1 77 ca f2 ....@cP......w.. 00:20:33.859 00000020 59 c9 e7 bc c3 9a f8 1a 91 3d 16 55 97 3c 08 94 Y........=.U.<.. 00:20:33.859 00000030 05 9e 04 88 e7 10 1d 61 92 24 52 3e b2 ed 62 14 .......a.$R>..b. 00:20:33.859 00000040 1b 37 d8 65 d0 f3 9c f0 ee 41 5e 75 26 36 08 ed .7.e.....A^u&6.. 00:20:33.859 00000050 59 f7 1c 36 7d 61 ee b1 16 35 ba 28 c5 3e 75 68 Y..6}a...5.(.>uh 00:20:33.859 00000060 34 ee ac 63 7a 7f ac 33 e5 11 88 ab e9 09 67 c3 4..cz..3......g. 00:20:33.859 00000070 48 0a bf 32 80 2b b6 f3 2f 4d 7a 77 7b 8f 7c 28 H..2.+../Mzw{.|( 00:20:33.859 00000080 77 0e 43 59 43 69 9f ff f3 e2 fc 21 b1 3a bf ad w.CYCi.....!.:.. 00:20:33.859 00000090 fa c4 94 b9 4f 43 94 49 a8 31 88 bd 2e 46 c0 7c ....OC.I.1...F.| 00:20:33.859 000000a0 0c 9c e8 20 d3 52 fb eb 88 34 6e 25 98 7c 0f 8d ... .R...4n%.|.. 00:20:33.859 000000b0 08 09 a5 3d 21 4f e8 e6 63 d2 99 a5 fb d1 82 d4 ...=!O..c....... 00:20:33.859 000000c0 0b c1 d2 10 39 ab d8 53 a2 a5 4b 7c 77 66 68 9d ....9..S..K|wfh. 00:20:33.859 000000d0 6a 71 9b a1 6a 28 db 90 ea 4a 54 17 0f 27 1f 44 jq..j(...JT..'.D 00:20:33.859 000000e0 00 8b 65 40 4c 04 ee 0d c3 0c ac ee c4 fb 14 5d ..e@L..........] 00:20:33.859 000000f0 ab 69 ad d6 d5 5c 96 6b f8 a9 c9 79 43 3e 47 cc .i...\.k...yC>G. 00:20:33.859 00000100 56 1a 1c 3b e3 e5 a1 7a 04 f2 ed 32 45 8b 58 83 V..;...z...2E.X. 00:20:33.859 00000110 47 6f 7b ba a4 fe 2b 47 44 a7 8d 60 96 3a b7 e1 Go{...+GD..`.:.. 00:20:33.859 00000120 cb 7f 82 cf 0c ab f2 64 85 c9 cd 9c af f8 d8 31 .......d.......1 00:20:33.859 00000130 85 fb 37 1c b6 30 2c 37 8a 4e 94 6d d6 64 9d 2f ..7..0,7.N.m.d./ 00:20:33.859 00000140 34 50 2b 5c 8c 5a 19 51 c0 ae 2c 71 02 f2 7e 24 4P+\.Z.Q..,q..~$ 00:20:33.859 00000150 2f 8a 60 f0 89 20 e6 fa 2a dc ab 33 0d 03 39 9b /.`.. ..*..3..9. 00:20:33.859 00000160 b7 c9 f0 c8 64 0a a2 b1 85 11 4e 8a e0 dc 27 7a ....d.....N...'z 00:20:33.859 00000170 72 67 69 46 1e 44 bd b9 c4 11 e3 2e 3e 21 28 09 rgiF.D......>!(. 00:20:33.859 00000180 14 f8 0b 7b 52 63 40 55 3c 3d 7c ea ca 8e 30 33 ...{Rc@U<=|...03 00:20:33.859 00000190 a8 ef 25 ff 37 4c 07 b4 72 bb 8c b9 91 5f 9e bb ..%.7L..r...._.. 00:20:33.859 000001a0 17 51 4c 2f 2f e8 65 fe 24 45 dc ae a6 b2 28 c2 .QL//.e.$E....(. 00:20:33.860 000001b0 48 2b 46 f8 3c 40 cc b4 92 74 b6 45 38 0c 86 fa H+F.<@...t.E8... 00:20:33.860 000001c0 db d7 ab e7 db 11 ad f0 bf 43 eb 22 56 13 93 69 .........C."V..i 00:20:33.860 000001d0 0c 2b 88 ac 77 7d 26 4d 1e 35 b7 d2 2d 41 8a 1d .+..w}&M.5..-A.. 
00:20:33.860 000001e0 f7 17 92 cb 40 04 f8 b5 e7 a1 7c fc 67 4d f9 5b ....@.....|.gM.[ 00:20:33.860 000001f0 c5 89 d0 e8 f9 7f 30 eb 20 72 97 83 d5 ec 6a 55 ......0. r....jU 00:20:33.860 [2024-09-27 15:25:07.042442] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=1, dhgroup=3, seq=3428451715, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.860 [2024-09-27 15:25:07.060092] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.860 [2024-09-27 15:25:07.060140] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.860 [2024-09-27 15:25:07.060158] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.860 [2024-09-27 15:25:07.060183] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.860 [2024-09-27 15:25:07.060194] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.860 [2024-09-27 15:25:07.166614] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.860 [2024-09-27 15:25:07.166635] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.860 [2024-09-27 15:25:07.166643] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.860 [2024-09-27 15:25:07.166654] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.860 [2024-09-27 15:25:07.166708] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.860 ctrlr pubkey: 00:20:33.860 00000000 c2 8d 76 86 f3 6d 50 76 96 17 26 4f 95 df 34 bd ..v..mPv..&O..4. 00:20:33.860 00000010 ec de 6d 5a 66 59 29 c7 ad 40 62 99 50 cb e1 97 ..mZfY)..@b.P... 00:20:33.860 00000020 71 f1 35 e3 50 dc a4 05 5e 9d 1c 45 7f 8b fd 5f q.5.P...^..E..._ 00:20:33.860 00000030 ae 73 86 60 31 ea 8c bf 24 82 ba ab e0 58 c8 73 .s.`1...$....X.s 00:20:33.860 00000040 f2 89 94 a7 00 00 f7 27 14 a8 a3 a0 75 e8 cd b2 .......'....u... 00:20:33.860 00000050 3a 12 b6 d8 76 42 7d ad b5 2b 25 0b 4a 00 67 02 :...vB}..+%.J.g. 00:20:33.860 00000060 22 e3 16 b9 b9 85 23 19 3d 16 9a bc fd 7d 72 f8 ".....#.=....}r. 00:20:33.860 00000070 6c b3 a3 6e c7 af 7c 72 57 1c 0c 1a d3 c1 2a b3 l..n..|rW.....*. 00:20:33.860 00000080 54 78 c1 d1 48 18 54 12 7b 1a f2 27 22 ad 96 e0 Tx..H.T.{..'"... 00:20:33.860 00000090 29 5d 4c 64 70 83 45 cc 2b 93 95 1d 25 90 fd 90 )]Ldp.E.+...%... 00:20:33.860 000000a0 23 85 1d 3c b7 e2 0e 5a f3 8c 76 82 37 9a b1 73 #..<...Z..v.7..s 00:20:33.860 000000b0 39 20 ff 73 57 f3 52 cb 98 b1 92 11 2d 05 41 2e 9 .sW.R.....-.A. 00:20:33.860 000000c0 1f fa dd 05 09 d3 2b 83 8a 9b ef f4 36 99 5e 99 ......+.....6.^. 
00:20:33.860 000000d0 94 ad 10 95 e2 f7 c7 aa 0f af 96 16 f3 96 f6 4b ...............K 00:20:33.860 000000e0 a6 4f e2 9f 8b af f7 d7 d2 d3 3d d7 1e 03 69 66 .O........=...if 00:20:33.860 000000f0 b6 53 69 26 60 a3 70 fd 50 1e 4c 3a e5 e0 a7 f8 .Si&`.p.P.L:.... 00:20:33.860 00000100 34 20 e1 0c 36 49 dd 7f 27 46 b2 84 52 cc 4d fd 4 ..6I..'F..R.M. 00:20:33.860 00000110 9a dd a9 f7 1c 72 47 f6 9d 96 7d 68 fb e8 fb 30 .....rG...}h...0 00:20:33.860 00000120 e1 c3 41 c9 7e 45 5d 23 7d 31 90 25 11 ce b6 db ..A.~E]#}1.%.... 00:20:33.860 00000130 47 c3 46 70 43 c4 df 03 d6 09 eb 41 f9 57 91 7b G.FpC......A.W.{ 00:20:33.860 00000140 4d ce 8d a3 92 ff fb 96 fd 38 89 98 6f 1c 80 7d M........8..o..} 00:20:33.860 00000150 ce d2 e9 8d ac c4 80 41 c7 50 83 e0 01 b5 fc 08 .......A.P...... 00:20:33.860 00000160 84 ba da 67 0e 94 30 55 89 ba 32 b8 67 f0 83 46 ...g..0U..2.g..F 00:20:33.860 00000170 f5 45 91 dd 94 b1 2b 69 31 78 4a 61 4b a3 c0 eb .E....+i1xJaK... 00:20:33.860 00000180 c1 2f b1 72 be 64 5c c4 90 b8 39 42 50 05 e1 83 ./.r.d\...9BP... 00:20:33.860 00000190 20 81 86 a1 9a b2 34 e4 b0 36 15 d8 c2 8e b5 4c .....4..6.....L 00:20:33.860 000001a0 d2 a2 a4 b3 33 f3 9e 6f 34 92 5a 22 5e 67 65 5b ....3..o4.Z"^ge[ 00:20:33.860 000001b0 b7 fb 7e d9 dc 42 d6 b5 14 e9 1d 67 43 b0 b5 96 ..~..B.....gC... 00:20:33.860 000001c0 a3 b7 d7 78 bf 43 7c 94 49 d3 07 0d 71 ae c1 14 ...x.C|.I...q... 00:20:33.860 000001d0 1e d8 45 d1 3e fd ce b9 0b 22 74 97 48 ae 13 de ..E.>...."t.H... 00:20:33.860 000001e0 d8 f6 bd 17 46 7f 2a f9 cd 93 08 71 16 1f 02 12 ....F.*....q.... 00:20:33.860 000001f0 e0 5b 4e d3 0b 67 ce 30 65 83 7a 10 10 79 46 55 .[N..g.0e.z..yFU 00:20:33.860 host pubkey: 00:20:33.860 00000000 23 3d 3e 99 ab b2 76 4e fa e7 bd 87 e9 4a 73 97 #=>...vN.....Js. 00:20:33.860 00000010 8a 8a 07 e7 93 c2 c6 f3 96 e1 91 5a 39 13 21 c5 ...........Z9.!. 00:20:33.860 00000020 3f 29 f2 4c 84 57 38 cf 33 55 cf f1 f7 23 3a 6b ?).L.W8.3U...#:k 00:20:33.860 00000030 b9 38 59 99 a4 df e7 cb 23 7d 31 a8 92 d6 f1 08 .8Y.....#}1..... 00:20:33.860 00000040 74 58 c6 ec d2 b9 8a 29 03 11 53 e8 81 63 05 64 tX.....)..S..c.d 00:20:33.860 00000050 f0 db a9 63 29 d6 15 5f 10 97 35 9a f6 8d b6 fe ...c).._..5..... 00:20:33.860 00000060 76 e9 95 bb cc 22 d2 3b f7 2f 25 70 8e f5 8b fa v....".;./%p.... 00:20:33.860 00000070 06 30 e1 de 23 70 80 9a 9c 15 8d b3 8b a3 a0 fa .0..#p.......... 00:20:33.860 00000080 53 37 57 b5 92 34 8c ea 01 c2 15 ce b3 9d 17 a1 S7W..4.......... 00:20:33.860 00000090 9e a3 bc 14 bd ff 8a ba 99 b1 41 db b7 e7 68 35 ..........A...h5 00:20:33.860 000000a0 70 b4 dc 3c 5d 5a ea a6 7e 63 49 8f 34 26 d6 ae p..<]Z..~cI.4&.. 00:20:33.860 000000b0 9a 22 0a aa e2 87 84 65 60 b2 6a a8 f8 90 5a b0 .".....e`.j...Z. 00:20:33.860 000000c0 8d c7 d8 22 7a f5 02 5b dd 81 71 c0 fd 39 bb db ..."z..[..q..9.. 00:20:33.860 000000d0 ac 59 7a ee 22 77 8d d2 70 7f 17 72 71 51 a9 72 .Yz."w..p..rqQ.r 00:20:33.860 000000e0 00 13 74 57 c3 c7 bb c4 8f 43 d3 08 06 11 57 13 ..tW.....C....W. 00:20:33.860 000000f0 95 50 a3 b4 5d 8b 17 f9 6d 1d 00 17 a6 12 7b bc .P..]...m.....{. 00:20:33.860 00000100 90 0b eb 7d 5a 13 6d 7b e0 f4 59 69 cc 05 78 42 ...}Z.m{..Yi..xB 00:20:33.860 00000110 2f 22 78 6b 66 b4 09 f3 14 8a 6c 72 4a 23 d4 3c /"xkf.....lrJ#.< 00:20:33.860 00000120 74 91 fe 2c 63 06 8a 1e 5c 24 ac ca 91 0e f5 7c t..,c...\$.....| 00:20:33.860 00000130 4a 85 20 bf 3c 70 40 1a e8 f7 29 5a 7d c7 28 29 J. .s. 
00:20:33.860 000001c0 20 fc 35 57 f8 24 79 15 b5 b8 d3 4c 86 ac 0f 58 .5W.$y....L...X 00:20:33.860 000001d0 34 03 fd ee 4c da 7b 87 c8 0b e5 58 e1 9d 1b de 4...L.{....X.... 00:20:33.860 000001e0 66 4a 5e d8 ad 20 d8 0e 3d 8c 39 e0 01 9c 6e b6 fJ^.. ..=.9...n. 00:20:33.860 000001f0 7f 4d 72 ee 9c 01 b7 68 0d a0 39 3c 45 b9 9b 27 .Mr....h..9... 00:20:33.860 00000040 5f 40 93 8b c4 68 96 ad 19 ed ee dc 55 12 ab 8c _@...h......U... 00:20:33.860 00000050 8e 69 27 3a 60 51 e3 6d ea 49 18 98 60 cd 78 24 .i':`Q.m.I..`.x$ 00:20:33.860 00000060 7a 56 3a e9 30 2f 6d 38 17 20 6b e6 03 43 90 63 zV:.0/m8. k..C.c 00:20:33.860 00000070 25 9c e3 7c b7 86 f5 20 c7 34 9e a4 95 85 68 06 %..|... .4....h. 00:20:33.860 00000080 b1 20 db 67 5c 15 a5 d4 5d e3 ed c0 4b 9a 65 36 . .g\...]...K.e6 00:20:33.860 00000090 5d 24 0f 33 ad 42 56 2c 61 58 2e 8a 50 4d 2a 2a ]$.3.BV,aX..PM** 00:20:33.860 000000a0 93 e1 f9 c3 c4 15 3f ef 8e 0c 21 ef 56 e0 d0 50 ......?...!.V..P 00:20:33.860 000000b0 97 fc c3 3b b8 3d 65 cb fb 8e 03 69 c0 f4 2b d4 ...;.=e....i..+. 00:20:33.860 000000c0 f4 e1 b6 b9 16 e8 bf 51 ca 14 0d ff dd c3 e4 3a .......Q.......: 00:20:33.860 000000d0 9f 8b 9d f0 92 78 4a 82 38 3e 8a 42 ea a4 b1 59 .....xJ.8>.B...Y 00:20:33.860 000000e0 e0 72 24 b7 10 f5 9e 0e 89 93 e8 93 03 de 78 c6 .r$...........x. 00:20:33.860 000000f0 ba 14 06 7d b5 ac 82 99 5f 67 27 a3 df a4 2e 1c ...}...._g'..... 00:20:33.860 00000100 8c 6f c9 e5 2c a8 ca 82 d7 fc 7a b2 64 b9 7b d9 .o..,.....z.d.{. 00:20:33.860 00000110 f0 5b af 2f 3e 6b c7 76 42 56 c0 38 b4 d8 f3 63 .[./>k.vBV.8...c 00:20:33.860 00000120 7a c1 70 8a b8 fb 2e 27 48 0a c0 de 57 6d f1 c5 z.p....'H...Wm.. 00:20:33.860 00000130 4a a6 26 09 99 7d de 46 86 58 7a b6 a8 3f 81 a6 J.&..}.F.Xz..?.. 00:20:33.860 00000140 5a e9 38 c1 2b d0 79 61 4e 76 36 f6 1d 8c 3e d1 Z.8.+.yaNv6...>. 00:20:33.860 00000150 3c 5d 52 98 cf 54 61 ee e7 7d 61 bd bb ed 81 1b <]R..Ta..}a..... 00:20:33.860 00000160 f0 57 02 68 d6 f9 1b 0e 62 4c 99 7c 4e 56 94 d3 .W.h....bL.|NV.. 00:20:33.860 00000170 f1 2a 1a 87 11 32 7d 80 e6 00 c0 e8 72 cc 4e 7f .*...2}.....r.N. 00:20:33.860 00000180 f5 c9 99 e4 39 72 ae db 87 b8 5d e5 11 ce 63 23 ....9r....]...c# 00:20:33.860 00000190 3c 65 80 86 95 79 17 cd 5c a7 d5 60 0e 1e d9 23 ....0! 
00:20:33.860 [2024-09-27 15:25:07.183332] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=1, dhgroup=3, seq=3428451716, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.861 [2024-09-27 15:25:07.183449] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.861 [2024-09-27 15:25:07.220844] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.861 [2024-09-27 15:25:07.220892] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.861 [2024-09-27 15:25:07.220902] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.861 [2024-09-27 15:25:07.220932] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.861 [2024-09-27 15:25:07.385657] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.861 [2024-09-27 15:25:07.385680] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.861 [2024-09-27 15:25:07.385688] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.861 [2024-09-27 15:25:07.385739] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.861 [2024-09-27 15:25:07.385765] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.861 ctrlr pubkey: 00:20:33.861 00000000 20 82 3f d7 17 af ad 00 cb cc 69 24 9a a2 66 3e .?.......i$..f> 00:20:33.861 00000010 32 58 b9 73 51 c8 58 2c 4d 02 a3 a8 38 de fc 8b 2X.sQ.X,M...8... 00:20:33.861 00000020 d0 7a e8 fe 66 03 fa 57 28 d5 9b 1d 54 7a 48 28 .z..f..W(...TzH( 00:20:33.861 00000030 5d d0 0c d8 2c 81 8f 8d 4c 6f 01 d4 ba b3 87 6f ]...,...Lo.....o 00:20:33.861 00000040 b0 20 5b 2b 0d 88 e7 38 58 f9 96 9d b6 02 84 97 . [+...8X....... 00:20:33.861 00000050 1d 79 ae 72 24 16 20 97 05 04 1b 97 f3 eb b2 e3 .y.r$. ......... 00:20:33.861 00000060 4e b2 1f 28 8f df e2 be 8a 0d 0a 35 e8 81 90 44 N..(.......5...D 00:20:33.861 00000070 ca e7 39 30 ae 1f 19 53 53 a3 b1 0c 57 89 01 d6 ..90...SS...W... 00:20:33.861 00000080 9a bd a6 84 9b 85 f9 3c bd 51 c7 64 ab 4e 30 fa .......<.Q.d.N0. 00:20:33.861 00000090 56 15 1f 44 6c cb 86 83 32 68 23 16 dc 7e f7 97 V..Dl...2h#..~.. 00:20:33.861 000000a0 59 57 88 5c 22 3e 1b 19 93 8f d9 e5 b6 84 ea e6 YW.\">.......... 00:20:33.861 000000b0 1c 1b aa 0a d4 a0 bc 68 17 49 b7 14 d2 8c 70 5d .......h.I....p] 00:20:33.861 000000c0 53 c6 60 3a e7 22 21 7e ab 02 68 ff 93 b3 7d f7 S.`:."!~..h...}. 00:20:33.861 000000d0 9d 44 03 46 05 24 5b 06 50 aa e9 43 61 3c 80 c4 .D.F.$[.P..Ca<.. 00:20:33.861 000000e0 b1 71 a9 53 77 44 70 6b d4 3d 29 62 87 34 44 8f .q.SwDpk.=)b.4D. 00:20:33.861 000000f0 80 a2 3a db 60 b1 ab 26 11 80 a2 40 b6 10 1a e4 ..:.`..&...@.... 00:20:33.861 00000100 3b 8e 57 9a 3b 9f ec fe 95 ba 1e cb 72 ca 2f 94 ;.W.;.......r./. 
00:20:33.861 00000110 33 33 7e fd c2 a9 b4 55 e8 71 28 f0 3e aa f4 bf 33~....U.q(.>... 00:20:33.861 00000120 35 da 98 22 20 60 69 3b c0 c7 66 54 b3 31 0b 38 5.." `i;..fT.1.8 00:20:33.861 00000130 b8 5d e4 68 75 a9 f8 51 58 1d cf bb 82 9d 7a bd .].hu..QX.....z. 00:20:33.861 00000140 9b f9 d8 d5 2a 3c 3c 82 46 34 8f 52 d2 f5 cf c9 ....*<<.F4.R.... 00:20:33.861 00000150 b9 fd 17 a8 ce 45 c1 84 ee 4b e3 18 ce f4 11 f5 .....E...K...... 00:20:33.861 00000160 57 ea 61 c1 aa 18 63 5f c8 7d f9 19 51 51 30 2a W.a...c_.}..QQ0* 00:20:33.861 00000170 bc 9a fa 92 05 df 3b 75 6b b6 18 d8 a3 ca 9b f9 ......;uk....... 00:20:33.861 00000180 a5 7c 99 61 ea dd 77 51 d7 55 31 cc b8 97 85 5f .|.a..wQ.U1...._ 00:20:33.861 00000190 cb d2 ef 24 ee 48 9b b6 bc f3 be 94 e5 be 45 7a ...$.H........Ez 00:20:33.861 000001a0 b0 3a 07 c7 06 fa 2c 2e c8 99 bd dc 1a 74 58 22 .:....,......tX" 00:20:33.861 000001b0 d9 98 94 c0 38 f6 d7 1f 2a c2 15 11 55 7b 2a 11 ....8...*...U{*. 00:20:33.861 000001c0 8f fc b1 97 3b 6b 3a 52 a3 fe 93 cc 7e 02 5a 22 ....;k:R....~.Z" 00:20:33.861 000001d0 30 ac 33 5d a7 d4 50 98 03 d7 eb 77 f0 59 41 03 0.3]..P....w.YA. 00:20:33.861 000001e0 f6 1c 8e 5f a4 ec f5 64 40 61 8f 0d 1c 07 f4 bb ..._...d@a...... 00:20:33.861 000001f0 ad 1a 00 bf b8 07 56 1a 3d 65 dc d1 39 31 c4 53 ......V.=e..91.S 00:20:33.861 host pubkey: 00:20:33.861 00000000 ce 5e 77 3b 79 1c 2f 7e 23 5b 56 19 3a 5b cf 82 .^w;y./~#[V.:[.. 00:20:33.861 00000010 23 70 62 c6 0b fe 92 6a 29 e0 c1 a7 6f c5 85 10 #pb....j)...o... 00:20:33.861 00000020 da 02 fb ca 8a 79 df ce 23 42 e4 ef f6 7b 4d 3a .....y..#B...{M: 00:20:33.861 00000030 45 9c 3f 66 9d eb d5 83 d7 60 e7 d1 c5 e6 02 4e E.?f.....`.....N 00:20:33.861 00000040 09 7b bf 06 4e 65 9a ce 67 75 b3 90 1c b1 9b 51 .{..Ne..gu.....Q 00:20:33.861 00000050 0d 88 ff 6a fe 99 a0 63 91 1a 6c b0 2f d3 44 24 ...j...c..l./.D$ 00:20:33.861 00000060 e8 91 2c 43 62 55 33 b2 8b 4b 1d 91 88 fd 4e 7f ..,CbU3..K....N. 00:20:33.861 00000070 76 ac ce c6 66 e1 a2 9a d3 70 dc 85 7f 1f 3f e6 v...f....p....?. 00:20:33.861 00000080 b3 e7 65 08 c4 6c bc ab 96 61 36 cc 6e 8f 72 62 ..e..l...a6.n.rb 00:20:33.861 00000090 9f f8 37 fb 47 c7 41 f4 d4 5b d1 47 54 81 59 a2 ..7.G.A..[.GT.Y. 00:20:33.861 000000a0 7e 20 1a 66 3d fe a0 91 ce b3 8a 47 1f 4f 2a ed ~ .f=......G.O*. 00:20:33.861 000000b0 0e c1 12 5e 11 95 31 9a d7 22 70 6c d0 f8 40 73 ...^..1.."pl..@s 00:20:33.861 000000c0 e8 08 06 49 80 44 16 82 d0 76 14 1d 69 8d 5b 2e ...I.D...v..i.[. 00:20:33.861 000000d0 30 eb bc 95 fe b2 48 9e 93 c3 dc dc 21 fb 67 c3 0.....H.....!.g. 00:20:33.861 000000e0 ae aa 60 bd 25 28 17 3e 26 01 a4 87 17 1b ff 24 ..`.%(.>&......$ 00:20:33.861 000000f0 d4 eb 2b e1 f2 61 2d 2b 0e 8e 30 c2 d1 05 fc 5c ..+..a-+..0....\ 00:20:33.861 00000100 b4 c6 00 ab 71 1f 11 7a 38 46 c6 83 29 c0 b9 03 ....q..z8F..)... 00:20:33.861 00000110 01 30 e7 07 26 d2 bf 7b 27 26 14 04 8b cc 4b 9f .0..&..{'&....K. 00:20:33.861 00000120 24 4f 9b 41 08 90 3d d8 d8 a9 29 4e 8d ff b1 ea $O.A..=...)N.... 00:20:33.861 00000130 f5 dc 65 2b 73 b5 7c d8 dc 34 cd 4b b6 1e 16 ff ..e+s.|..4.K.... 00:20:33.861 00000140 60 04 06 fd 93 e4 55 24 3c 96 0f 94 81 25 fc 7d `.....U$<....%.} 00:20:33.861 00000150 c3 bc d8 e4 06 fa 58 bc 4f 23 65 3c 7e c1 f8 2f ......X.O#e<~../ 00:20:33.861 00000160 72 b3 cd fa bd a1 a4 4d 05 01 30 68 0a 00 6a 14 r......M..0h..j. 00:20:33.861 00000170 b7 a3 77 1e ad 89 f6 c2 cc 0d 22 f4 61 76 85 b4 ..w.......".av.. 00:20:33.861 00000180 6a 0b f2 b1 77 b3 a3 a3 43 60 20 7b 52 b5 28 a7 j...w...C` {R.(. 
00:20:33.861 00000190 17 3c b2 d3 46 1c ba b4 ea f8 e4 70 e2 5a 71 e4 .<..F......p.Zq. 00:20:33.861 000001a0 81 2e 94 d9 32 63 32 17 4b 53 52 3f fb 4e fb 90 ....2c2.KSR?.N.. 00:20:33.861 000001b0 d9 be e9 ef 91 a3 87 a8 82 d5 7a 30 e6 1f 3f 07 ..........z0..?. 00:20:33.861 000001c0 b3 77 73 0d 65 16 d6 81 37 07 1f 10 2f c2 7d 54 .ws.e...7.../.}T 00:20:33.861 000001d0 90 c8 76 48 83 6e a6 25 19 cb 48 e2 f5 98 d7 a1 ..vH.n.%..H..... 00:20:33.861 000001e0 df 5a cb 78 d9 a4 f1 91 1f fb 61 fb 50 ae 32 5c .Z.x......a.P.2\ 00:20:33.861 000001f0 73 c0 50 61 51 97 e9 15 73 ed 14 5d 07 42 2d c6 s.PaQ...s..].B-. 00:20:33.861 dh secret: 00:20:33.861 00000000 24 e4 ad 26 b2 8b b1 49 17 f0 4f 54 ee 24 9d ec $..&...I..OT.$.. 00:20:33.861 00000010 3a 3b 89 77 31 d3 20 e3 eb 70 d2 2e 8c 48 46 54 :;.w1. ..p...HFT 00:20:33.861 00000020 9d 58 b6 2c 56 5b 34 01 d7 10 78 05 b6 38 fb a0 .X.,V[4...x..8.. 00:20:33.861 00000030 10 6c c2 6c a2 cd 7f 53 ba 99 ad 0d 4c 0d fe 43 .l.l...S....L..C 00:20:33.861 00000040 eb cb f3 cf 89 65 ff 49 c1 32 29 8e ca fb a4 35 .....e.I.2)....5 00:20:33.861 00000050 a2 92 b7 d6 fd a8 18 74 c6 e2 d1 a2 0a 71 8d b0 .......t.....q.. 00:20:33.861 00000060 c8 ff 4b 10 22 1a ac 34 e8 7e f7 17 19 28 44 4c ..K."..4.~...(DL 00:20:33.861 00000070 db 29 25 cc 54 5a 8d fb c0 81 dc dc b6 ed d8 8f .)%.TZ.......... 00:20:33.861 00000080 0d bd 47 c0 b9 6b 26 3d 7a 88 b3 90 1d 0a 8a 20 ..G..k&=z...... 00:20:33.861 00000090 ca 93 d9 a7 7a ab 0d 9b 29 93 95 20 de ff 4d db ....z...).. ..M. 00:20:33.861 000000a0 df 5c b9 ab 4f 69 55 c1 ed 49 53 e1 71 cd 82 5c .\..OiU..IS.q..\ 00:20:33.861 000000b0 15 62 d3 3e 4c 9e 6b 73 49 2a 09 e4 f3 87 a1 7d .b.>L.ksI*.....} 00:20:33.861 000000c0 8d 2a c6 33 1d 30 fd f1 b2 2b 5f ad 0d 81 4c fe .*.3.0...+_...L. 00:20:33.861 000000d0 13 73 bc 06 d0 0b 82 7a 65 5c 3d 33 7e 0a 43 68 .s.....ze\=3~.Ch 00:20:33.861 000000e0 39 3b ea 74 9d 5f eb 02 2a a7 d9 73 82 dd f5 41 9;.t._..*..s...A 00:20:33.861 000000f0 29 fb cf e5 fa c9 6f 8a 85 55 42 ec 0b ce 6f 29 ).....o..UB...o) 00:20:33.861 00000100 0d ab 2b e5 35 58 dd 62 9d 9c bb 9a 8d 0b 59 89 ..+.5X.b......Y. 00:20:33.861 00000110 62 77 96 9a 1b 8e ff c1 8e 87 20 37 0c 8c 39 5b bw........ 7..9[ 00:20:33.861 00000120 0d 00 a5 03 97 c2 8e 66 17 51 a4 f8 ec fa 47 2f .......f.Q....G/ 00:20:33.861 00000130 08 61 98 6c d9 7c 8d 9d 7e 48 f9 17 68 4a 95 7d .a.l.|..~H..hJ.} 00:20:33.861 00000140 cb 19 3d 6e 67 7f 38 7f 62 ec cb 83 95 91 8d 9d ..=ng.8.b....... 00:20:33.861 00000150 02 7e e7 29 8b 3d 35 05 1b 09 0c dc ac 8c 8b 81 .~.).=5......... 00:20:33.861 00000160 8d 07 b3 a5 4e 17 da a2 cf 25 c4 f7 ca 2e 52 f7 ....N....%....R. 00:20:33.861 00000170 bd e1 9e 59 04 34 eb 42 ee 7d 24 8f 40 48 83 77 ...Y.4.B.}$.@H.w 00:20:33.861 00000180 74 18 08 a9 7f f2 8a 59 c1 4a 1f 66 9b 0d 7e 86 t......Y.J.f..~. 00:20:33.861 00000190 f1 99 dc a5 77 28 0a 3d c1 1e ca c0 fe 87 6f 66 ....w(.=......of 00:20:33.861 000001a0 07 31 e9 55 83 7c bf ea 91 28 af 16 d5 07 65 84 .1.U.|...(....e. 00:20:33.861 000001b0 86 ef 10 2c 46 a0 b0 00 d6 7e b7 0e 33 55 85 3f ...,F....~..3U.? 00:20:33.861 000001c0 e9 9b 4d f5 bd 65 20 4a ba 44 83 b4 5d 0d 32 ea ..M..e J.D..].2. 00:20:33.861 000001d0 fc 2a 86 0e 79 37 e6 df ee 20 86 e7 43 c9 8c fc .*..y7... ..C... 00:20:33.861 000001e0 5a 41 c3 7c 92 9f 0e 9a 61 66 56 3f 66 53 c8 e7 ZA.|....afV?fS.. 00:20:33.861 000001f0 2a 9c a0 87 00 33 3e d3 50 e7 e3 92 fb fe 5a 82 *....3>.P.....Z. 
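The dump above completes one DH-HMAC-CHAP round for qpair 0: with digest 1 (sha-256) and dhgroup 3 (ffdhe4096), host and controller exchange public values and derive the shared "dh secret" that is folded into the challenge/response before the reply is sent. A minimal sketch of generating a matching 32-byte secret with nvme-cli follows; the NQN mirrors the log, but the exact option spellings are assumptions and should be checked against the installed nvme-cli.

    # Sketch only: derive a 32-byte DH-HMAC-CHAP secret hashed with SHA-256 (hmac=1),
    # matching the hash=1/len=32 fields reported by nvme_auth_send_reply in this log.
    # Verify option names with `nvme gen-dhchap-key --help` for the nvme-cli in use.
    key=$(nvme gen-dhchap-key --key-length=32 --hmac=1 --nqn=nqn.2024-02.io.spdk:host0)
    echo "$key"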
00:20:33.861 [2024-09-27 15:25:07.401879] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=3, seq=3428451717, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.861 [2024-09-27 15:25:07.419172] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.861 [2024-09-27 15:25:07.419221] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.861 [2024-09-27 15:25:07.419239] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.861 [2024-09-27 15:25:07.419263] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.861 [2024-09-27 15:25:07.419278] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.861 [2024-09-27 15:25:07.525765] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.861 [2024-09-27 15:25:07.525784] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.862 [2024-09-27 15:25:07.525792] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.862 [2024-09-27 15:25:07.525801] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.862 [2024-09-27 15:25:07.525855] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.862 ctrlr pubkey: 00:20:33.862 00000000 20 82 3f d7 17 af ad 00 cb cc 69 24 9a a2 66 3e .?.......i$..f> 00:20:33.862 00000010 32 58 b9 73 51 c8 58 2c 4d 02 a3 a8 38 de fc 8b 2X.sQ.X,M...8... 00:20:33.862 00000020 d0 7a e8 fe 66 03 fa 57 28 d5 9b 1d 54 7a 48 28 .z..f..W(...TzH( 00:20:33.862 00000030 5d d0 0c d8 2c 81 8f 8d 4c 6f 01 d4 ba b3 87 6f ]...,...Lo.....o 00:20:33.862 00000040 b0 20 5b 2b 0d 88 e7 38 58 f9 96 9d b6 02 84 97 . [+...8X....... 00:20:33.862 00000050 1d 79 ae 72 24 16 20 97 05 04 1b 97 f3 eb b2 e3 .y.r$. ......... 00:20:33.862 00000060 4e b2 1f 28 8f df e2 be 8a 0d 0a 35 e8 81 90 44 N..(.......5...D 00:20:33.862 00000070 ca e7 39 30 ae 1f 19 53 53 a3 b1 0c 57 89 01 d6 ..90...SS...W... 00:20:33.862 00000080 9a bd a6 84 9b 85 f9 3c bd 51 c7 64 ab 4e 30 fa .......<.Q.d.N0. 00:20:33.862 00000090 56 15 1f 44 6c cb 86 83 32 68 23 16 dc 7e f7 97 V..Dl...2h#..~.. 00:20:33.862 000000a0 59 57 88 5c 22 3e 1b 19 93 8f d9 e5 b6 84 ea e6 YW.\">.......... 00:20:33.862 000000b0 1c 1b aa 0a d4 a0 bc 68 17 49 b7 14 d2 8c 70 5d .......h.I....p] 00:20:33.862 000000c0 53 c6 60 3a e7 22 21 7e ab 02 68 ff 93 b3 7d f7 S.`:."!~..h...}. 00:20:33.862 000000d0 9d 44 03 46 05 24 5b 06 50 aa e9 43 61 3c 80 c4 .D.F.$[.P..Ca<.. 00:20:33.862 000000e0 b1 71 a9 53 77 44 70 6b d4 3d 29 62 87 34 44 8f .q.SwDpk.=)b.4D. 00:20:33.862 000000f0 80 a2 3a db 60 b1 ab 26 11 80 a2 40 b6 10 1a e4 ..:.`..&...@.... 00:20:33.862 00000100 3b 8e 57 9a 3b 9f ec fe 95 ba 1e cb 72 ca 2f 94 ;.W.;.......r./. 
00:20:33.862 00000110 33 33 7e fd c2 a9 b4 55 e8 71 28 f0 3e aa f4 bf 33~....U.q(.>... 00:20:33.862 00000120 35 da 98 22 20 60 69 3b c0 c7 66 54 b3 31 0b 38 5.." `i;..fT.1.8 00:20:33.862 00000130 b8 5d e4 68 75 a9 f8 51 58 1d cf bb 82 9d 7a bd .].hu..QX.....z. 00:20:33.862 00000140 9b f9 d8 d5 2a 3c 3c 82 46 34 8f 52 d2 f5 cf c9 ....*<<.F4.R.... 00:20:33.862 00000150 b9 fd 17 a8 ce 45 c1 84 ee 4b e3 18 ce f4 11 f5 .....E...K...... 00:20:33.862 00000160 57 ea 61 c1 aa 18 63 5f c8 7d f9 19 51 51 30 2a W.a...c_.}..QQ0* 00:20:33.862 00000170 bc 9a fa 92 05 df 3b 75 6b b6 18 d8 a3 ca 9b f9 ......;uk....... 00:20:33.862 00000180 a5 7c 99 61 ea dd 77 51 d7 55 31 cc b8 97 85 5f .|.a..wQ.U1...._ 00:20:33.862 00000190 cb d2 ef 24 ee 48 9b b6 bc f3 be 94 e5 be 45 7a ...$.H........Ez 00:20:33.862 000001a0 b0 3a 07 c7 06 fa 2c 2e c8 99 bd dc 1a 74 58 22 .:....,......tX" 00:20:33.862 000001b0 d9 98 94 c0 38 f6 d7 1f 2a c2 15 11 55 7b 2a 11 ....8...*...U{*. 00:20:33.862 000001c0 8f fc b1 97 3b 6b 3a 52 a3 fe 93 cc 7e 02 5a 22 ....;k:R....~.Z" 00:20:33.862 000001d0 30 ac 33 5d a7 d4 50 98 03 d7 eb 77 f0 59 41 03 0.3]..P....w.YA. 00:20:33.862 000001e0 f6 1c 8e 5f a4 ec f5 64 40 61 8f 0d 1c 07 f4 bb ..._...d@a...... 00:20:33.862 000001f0 ad 1a 00 bf b8 07 56 1a 3d 65 dc d1 39 31 c4 53 ......V.=e..91.S 00:20:33.862 host pubkey: 00:20:33.862 00000000 1d 81 69 2a 95 20 87 23 58 c4 f5 23 99 36 c2 80 ..i*. .#X..#.6.. 00:20:33.862 00000010 ba cd f5 12 85 ec bb d0 7e 1f 5b 74 b6 0a 79 40 ........~.[t..y@ 00:20:33.862 00000020 eb e9 34 2e e9 bb eb 84 35 8e 24 03 22 47 fa 20 ..4.....5.$."G. 00:20:33.862 00000030 46 f1 1e 9c 76 25 d1 59 b3 18 6d 78 f7 79 c6 33 F...v%.Y..mx.y.3 00:20:33.862 00000040 87 d7 f6 77 1f 8a b7 3f 05 df 99 91 b3 55 6a bd ...w...?.....Uj. 00:20:33.862 00000050 89 44 4f f8 b7 3e bb 2f 1c 4e 3d e4 d0 9f 28 10 .DO..>./.N=...(. 00:20:33.862 00000060 27 5f 0a 31 5c bd 75 df c4 96 14 2f 1d 29 6c 60 '_.1\.u..../.)l` 00:20:33.862 00000070 cc be 78 e3 ec 62 6b b0 85 7b e1 9f c1 81 3a 7b ..x..bk..{....:{ 00:20:33.862 00000080 be d8 b7 47 9a 13 00 93 ad 9c 9e 49 c1 15 23 79 ...G.......I..#y 00:20:33.862 00000090 c6 4f 10 f7 6e 50 23 93 b0 73 b4 45 c2 13 ad 6f .O..nP#..s.E...o 00:20:33.862 000000a0 93 6d 0f a5 16 39 d8 82 cd c1 4c e8 b0 4f 61 6a .m...9....L..Oaj 00:20:33.862 000000b0 50 7a 9b 01 a7 9c 30 f3 ba 8f b1 62 cc 74 d2 c4 Pz....0....b.t.. 00:20:33.862 000000c0 ab c1 0f ab 49 29 d0 f6 3b e5 df 2b 61 54 64 25 ....I)..;..+aTd% 00:20:33.862 000000d0 50 87 dd 44 3e cf e9 3e 51 66 c4 1f 27 0b e7 03 P..D>..>Qf..'... 00:20:33.862 000000e0 24 f2 88 3e 2b 98 56 52 15 22 51 07 30 e6 cb e8 $..>+.VR."Q.0... 00:20:33.862 000000f0 be 39 0b 4e 6b 46 cc 69 66 da 72 4d 09 4d 2a 7e .9.NkF.if.rM.M*~ 00:20:33.862 00000100 66 82 4e ef 73 97 fa 3e ef cb 16 e0 66 e8 c0 57 f.N.s..>....f..W 00:20:33.862 00000110 55 48 ba 15 90 b6 48 e6 e7 c8 b2 ab 43 9d 33 53 UH....H.....C.3S 00:20:33.862 00000120 3d e9 bf 9f 23 ae 4a 1b b5 f0 6c fb de 9a 22 9b =...#.J...l...". 00:20:33.862 00000130 d4 19 c1 53 f5 0e ad 0d 6b 67 14 af 7d b9 48 33 ...S....kg..}.H3 00:20:33.862 00000140 f2 de 6f b3 91 d8 43 5b 43 24 03 79 22 4d 74 2e ..o...C[C$.y"Mt. 00:20:33.862 00000150 38 e5 c5 84 d3 ee a7 68 6a ce 11 40 7a ce 51 b1 8......hj..@z.Q. 00:20:33.862 00000160 a1 18 f3 25 2d c6 f1 f5 19 34 1d 8e 8d 35 b8 92 ...%-....4...5.. 00:20:33.862 00000170 32 43 3c 7d 22 90 00 dc a6 da f9 1f f4 ce 4e 2c 2C<}".........N, 00:20:33.862 00000180 3e c7 3d 4d 70 ea 9b 01 84 b6 6c 6c c1 a3 d6 e4 >.=Mp.....ll.... 
00:20:33.862 00000190 1e d5 da 60 6a b1 a6 ef 05 7e 09 03 2a 1a f1 62 ...`j....~..*..b 00:20:33.862 000001a0 d0 fe 79 10 5d 50 81 2a 45 3a bf 73 3d 4f 35 2a ..y.]P.*E:.s=O5* 00:20:33.862 000001b0 d3 c2 79 16 c1 f6 1d d5 7c 23 05 5f d6 6d 7c d4 ..y.....|#._.m|. 00:20:33.862 000001c0 89 b8 8b fa 97 d7 41 83 d9 56 42 9a 5b 7f d2 cb ......A..VB.[... 00:20:33.862 000001d0 cf 4f 5e 38 5b 89 13 91 63 27 df 21 bd 6d 38 8a .O^8[...c'.!.m8. 00:20:33.862 000001e0 d9 23 92 dd 28 da 36 f8 b5 8b 9e 3e 72 aa c9 a7 .#..(.6....>r... 00:20:33.862 000001f0 4c be 47 8e c2 38 25 29 2a d9 9b ee d5 d9 34 26 L.G..8%)*.....4& 00:20:33.862 dh secret: 00:20:33.862 00000000 75 16 79 7b 83 1a 68 d4 e5 5c e7 1f 37 dc 03 e0 u.y{..h..\..7... 00:20:33.862 00000010 a1 12 44 dc e5 9d 94 04 af 72 b7 cd be b3 fb a9 ..D......r...... 00:20:33.862 00000020 ff 27 61 f4 bc 23 58 07 8c f1 77 29 32 b1 2d cb .'a..#X...w)2.-. 00:20:33.862 00000030 2b a3 3a 35 4d 84 18 cb 44 6b e1 03 c7 bd a5 a0 +.:5M...Dk...... 00:20:33.862 00000040 c7 b1 e4 3d e8 8a c2 63 22 d0 ba 87 65 ea 4a e3 ...=...c"...e.J. 00:20:33.862 00000050 2d 37 7c 9f 33 a5 e7 1c 49 1f 5b ae 2b 21 e4 77 -7|.3...I.[.+!.w 00:20:33.862 00000060 4a ee 7d 25 3f 41 4b bd f4 fa 7b 8e fc b6 6a c3 J.}%?AK...{...j. 00:20:33.862 00000070 04 15 62 8f b3 d3 e4 6f 6c b4 08 e4 00 fd f9 51 ..b....ol......Q 00:20:33.862 00000080 ef e4 7b 3a 3c 50 b4 c3 ac 7f b7 e2 3e 3b 07 ba ..{:;.. 00:20:33.862 00000090 10 c5 f7 6f f3 5c 64 6b 9a 8d 7b 44 4e d7 8a 5a ...o.\dk..{DN..Z 00:20:33.862 000000a0 33 d8 96 5a b6 ef 6b 6b 79 8f f2 b3 0b 1a d1 fa 3..Z..kky....... 00:20:33.862 000000b0 a6 77 2d c6 34 9a 8b ba 62 15 38 09 eb 77 9f bb .w-.4...b.8..w.. 00:20:33.862 000000c0 5c ed 1a 77 45 17 3e b5 e3 ae ef c5 32 df 9f 23 \..wE.>.....2..# 00:20:33.862 000000d0 35 38 7e ce d1 16 73 9b 2c c8 17 f6 0b 5d f0 b7 58~...s.,....].. 00:20:33.862 000000e0 c3 80 d7 2c e4 49 e3 ce d2 e9 cb 23 00 d7 99 af ...,.I.....#.... 00:20:33.862 000000f0 24 11 1e 95 8d 9c 1f 3d 71 63 d7 43 25 ce 62 b5 $......=qc.C%.b. 00:20:33.862 00000100 08 e0 36 7a 62 bc 2a ce 95 03 d7 d5 a5 7c 44 e9 ..6zb.*......|D. 00:20:33.862 00000110 e6 49 a3 64 a2 97 8a 3c 67 67 ac 91 66 46 9d 93 .I.d.....q.Q. 00:20:33.862 00000190 8f 66 f2 af 81 ca 19 37 d6 b6 6d 6a 45 e8 e4 dd .f.....7..mjE... 00:20:33.862 000001a0 a0 85 df a7 52 42 3c 85 e8 95 e1 04 e3 40 ea b6 ....RB<......@.. 00:20:33.862 000001b0 a6 4a 0d 4d dc 94 82 20 23 b6 bf aa b5 5b 66 d1 .J.M... #....[f. 00:20:33.862 000001c0 3f 8f 45 74 cd 23 69 e3 ac 01 bd 51 7a 94 e3 0f ?.Et.#i....Qz... 00:20:33.862 000001d0 6a 97 b5 17 da 50 be cd 11 c8 26 50 cd 3e f4 6c j....P....&P.>.l 00:20:33.862 000001e0 b3 b2 b4 bd a6 2b b3 13 c3 ab 76 91 5a bb 35 fd .....+....v.Z.5. 00:20:33.862 000001f0 54 82 22 52 aa cb 32 89 f8 eb e5 b0 49 50 1e bd T."R..2.....IP.. 
00:20:33.862 [2024-09-27 15:25:07.542080] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=1, dhgroup=3, seq=3428451718, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.862 [2024-09-27 15:25:07.542187] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.862 [2024-09-27 15:25:07.578227] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.862 [2024-09-27 15:25:07.578273] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.862 [2024-09-27 15:25:07.578283] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.862 [2024-09-27 15:25:07.578318] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.862 [2024-09-27 15:25:07.740632] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.862 [2024-09-27 15:25:07.740653] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.862 [2024-09-27 15:25:07.740661] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.862 [2024-09-27 15:25:07.740707] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.862 [2024-09-27 15:25:07.740731] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.862 ctrlr pubkey: 00:20:33.862 00000000 9a 54 c0 b8 2c 85 6c 60 2f 1f d5 c2 4f 79 c3 20 .T..,.l`/...Oy. 00:20:33.863 00000010 c2 a6 1f f3 20 c0 4e 06 3b da f4 21 40 74 7d 0b .... .N.;..!@t}. 00:20:33.863 00000020 cc 99 e8 38 fb 58 0b 4f 43 8e 2f 55 13 ab 72 ce ...8.X.OC./U..r. 00:20:33.863 00000030 ab 87 55 ee d2 4f d2 82 98 dd bd a9 c1 16 f3 a3 ..U..O.......... 00:20:33.863 00000040 45 53 a6 10 09 ee 4a 65 2f 2f f6 e7 6c 43 65 1f ES....Je//..lCe. 00:20:33.863 00000050 1f 5a e8 2c c9 d7 32 4a f1 39 86 11 1e ea b1 37 .Z.,..2J.9.....7 00:20:33.863 00000060 e8 51 1a 4d a9 a6 75 e6 08 a7 b5 ed 91 38 21 23 .Q.M..u......8!# 00:20:33.863 00000070 94 c5 e6 3c 45 33 c2 91 b6 15 34 37 25 de 99 30 ..... 00:20:33.863 00000110 55 b2 c4 ee ce 41 6b ce 04 92 aa 62 46 6c 5a d4 U....Ak....bFlZ. 00:20:33.863 00000120 2e d8 46 0b bc 38 0a 10 0f 34 46 17 33 56 c9 f3 ..F..8...4F.3V.. 00:20:33.863 00000130 e7 19 6e 58 e2 b2 76 be af b3 e6 29 99 91 94 26 ..nX..v....)...& 00:20:33.863 00000140 2a 39 7d 9f 1a 7e 5e 60 30 1b 8c 1b 52 a9 41 c4 *9}..~^`0...R.A. 00:20:33.863 00000150 db 12 0b fe 26 a2 79 8e 7c de a0 40 6c 88 39 92 ....&.y.|..@l.9. 00:20:33.863 00000160 98 25 03 ae 6c df 7b e0 ee 8a dd 55 fb 2d 8d c1 .%..l.{....U.-.. 00:20:33.863 00000170 02 b9 5f 93 8a 50 71 b1 61 e1 26 94 07 e9 eb e5 .._..Pq.a.&..... 00:20:33.863 00000180 5d 9d ae a4 e7 1f 2a 0f 40 89 1b 4c a9 72 75 ae ].....*.@..L.ru. 00:20:33.863 00000190 9c 61 b9 86 36 02 0b 1c 71 f0 93 0e 6b f2 8c e6 .a..6...q...k... 
00:20:33.863 000001a0 05 e1 98 43 4c b2 9f 49 f5 da bc 41 9e 98 5e 83 ...CL..I...A..^. 00:20:33.863 000001b0 ef 8f 1d 47 03 a8 04 9d 74 14 8a 6f 99 32 df a5 ...G....t..o.2.. 00:20:33.863 000001c0 22 82 5a 5b 1f 09 8d 00 93 f5 d1 46 ff 56 39 75 ".Z[.......F.V9u 00:20:33.863 000001d0 09 16 3c 46 dc de ff 28 cd e7 4d 88 ff 0a 44 d8 .....O:.P 00:20:33.863 00000070 13 05 3f cb 21 39 4c d8 c0 6d 30 48 76 13 b2 f4 ..?.!9L..m0Hv... 00:20:33.863 00000080 10 86 c2 ff c0 1e c9 87 35 6f d2 b3 1a ff 2d 76 ........5o....-v 00:20:33.863 00000090 58 6b 8f 0b 6d 54 de db 75 f8 7c 08 1f b0 27 f2 Xk..mT..u.|...'. 00:20:33.863 000000a0 1e 39 87 1c 9c ac ea db 25 76 8c d8 17 d0 b9 ab .9......%v...... 00:20:33.863 000000b0 7d 09 ca 58 af 46 29 a4 80 57 1c 89 e2 cd 31 86 }..X.F)..W....1. 00:20:33.863 000000c0 1d 22 5c 80 91 df e7 98 64 d0 87 72 e3 5e 01 e3 ."\.....d..r.^.. 00:20:33.863 000000d0 e5 b5 a5 93 e6 c1 cf ce dc ab 93 68 9a e7 a5 25 ...........h...% 00:20:33.863 000000e0 f8 ec 06 ee 3e 6a 9d b4 e5 87 06 5c 78 13 28 13 ....>j.....\x.(. 00:20:33.863 000000f0 51 24 41 a9 20 23 f5 70 a9 be dc 32 a3 91 3b 86 Q$A. #.p...2..;. 00:20:33.863 00000100 89 73 fc 2b 86 2d a4 86 6b 81 db 08 52 86 b1 16 .s.+.-..k...R... 00:20:33.863 00000110 67 88 31 6e d3 9d 16 51 6b 20 b9 aa ec 90 41 ed g.1n...Qk ....A. 00:20:33.863 00000120 c2 5e ef eb be 40 82 e4 b6 6c 27 27 9c 3d 0c f5 .^...@...l''.=.. 00:20:33.863 00000130 e8 17 ee 4d d9 80 77 66 1a 4f e7 1b fc e9 d1 43 ...M..wf.O.....C 00:20:33.863 00000140 91 18 98 9c 10 35 e2 92 38 f7 65 cd 77 32 54 0c .....5..8.e.w2T. 00:20:33.863 00000150 d5 d1 1d 7d c7 80 85 f2 5e 96 33 40 b6 aa 71 a1 ...}....^.3@..q. 00:20:33.863 00000160 4d f0 39 0c 3b e2 d6 b0 5e 6d b3 47 ef c1 8d cd M.9.;...^m.G.... 00:20:33.863 00000170 74 ed 28 d1 6b 35 99 77 81 27 86 e1 9e 52 3e 9b t.(.k5.w.'...R>. 00:20:33.863 00000180 91 91 54 41 31 ec 46 66 c1 db df 13 b3 d1 ff 41 ..TA1.Ff.......A 00:20:33.863 00000190 15 ef 28 24 a7 77 e2 5b 2e 0f 98 1a cf fe 9e 26 ..($.w.[.......& 00:20:33.863 000001a0 05 82 74 4a d0 ef 75 90 ae 35 d1 01 88 13 39 ed ..tJ..u..5....9. 00:20:33.863 000001b0 40 60 1a fe 56 30 34 55 5f 16 18 07 ee c4 a6 6e @`..V04U_......n 00:20:33.863 000001c0 c3 05 63 52 17 c6 6d a5 3f 4a b7 a9 c4 86 79 2d ..cR..m.?J....y- 00:20:33.863 000001d0 b0 ea 63 46 e5 42 6d 95 81 a3 ab ea 90 66 82 a8 ..cF.Bm......f.. 00:20:33.863 000001e0 11 e8 c4 1f 74 d3 94 8a 5c 0b 0f 7e b8 70 05 60 ....t...\..~.p.` 00:20:33.863 000001f0 9d 1d 2a 6f 08 c3 0e ed ca b9 2a 20 bc 99 b8 d8 ..*o......* .... 00:20:33.863 dh secret: 00:20:33.863 00000000 be 59 fe 99 a3 9a f0 1f ca a5 9f 07 fa 53 d3 fa .Y...........S.. 00:20:33.863 00000010 98 27 7b 10 02 87 9d 1b ba a7 d0 b4 57 f9 28 be .'{.........W.(. 00:20:33.863 00000020 44 1f 39 b7 72 a6 31 eb b0 90 ec 1e c0 67 47 c1 D.9.r.1......gG. 00:20:33.863 00000030 a7 bf 4a 72 fa ab c5 7c c9 fd 48 9f 9b d5 83 70 ..Jr...|..H....p 00:20:33.863 00000040 49 7f ab 38 01 5c f6 d4 d3 d7 ab d5 2a 4f b9 7b I..8.\......*O.{ 00:20:33.863 00000050 af ae 50 1c dd 06 22 65 b7 4b 38 a3 0c c0 cd 19 ..P..."e.K8..... 00:20:33.863 00000060 33 e7 f7 81 6a 19 0b c5 e8 0d bd f7 62 24 39 88 3...j.......b$9. 00:20:33.863 00000070 54 72 5a ae f5 5d 9b 8c d1 00 2a 49 fe fd af 49 TrZ..]....*I...I 00:20:33.863 00000080 15 1c 6d cc 87 82 cc 73 f9 07 c5 5d 92 68 24 df ..m....s...].h$. 00:20:33.863 00000090 b2 c0 74 b8 58 23 fb b1 94 f5 c2 ba 78 c4 c1 ad ..t.X#......x... 
00:20:33.863 000000a0 9c 31 73 ac 6b 64 f3 40 5d 45 4e 09 ce 0b 07 32 .1s.kd.@]EN....2 00:20:33.863 000000b0 d8 8f 86 9d 6a 8e 4b 6b 2d f8 e5 66 2a fe 28 3c ....j.Kk-..f*.(< 00:20:33.863 000000c0 03 40 a7 3b 22 5b 38 1d 8d 7d 4d 8e 0d 39 01 b5 .@.;"[8..}M..9.. 00:20:33.863 000000d0 d2 0a ec 38 5c a1 64 08 3b 70 58 8f 0d a6 50 9f ...8\.d.;pX...P. 00:20:33.863 000000e0 25 7b dd 63 ae 61 2e 6b 3f cb a3 eb 0f 91 3c 74 %{.c.a.k?....... 00:20:33.864 00000110 55 b2 c4 ee ce 41 6b ce 04 92 aa 62 46 6c 5a d4 U....Ak....bFlZ. 00:20:33.864 00000120 2e d8 46 0b bc 38 0a 10 0f 34 46 17 33 56 c9 f3 ..F..8...4F.3V.. 00:20:33.864 00000130 e7 19 6e 58 e2 b2 76 be af b3 e6 29 99 91 94 26 ..nX..v....)...& 00:20:33.864 00000140 2a 39 7d 9f 1a 7e 5e 60 30 1b 8c 1b 52 a9 41 c4 *9}..~^`0...R.A. 00:20:33.864 00000150 db 12 0b fe 26 a2 79 8e 7c de a0 40 6c 88 39 92 ....&.y.|..@l.9. 00:20:33.864 00000160 98 25 03 ae 6c df 7b e0 ee 8a dd 55 fb 2d 8d c1 .%..l.{....U.-.. 00:20:33.864 00000170 02 b9 5f 93 8a 50 71 b1 61 e1 26 94 07 e9 eb e5 .._..Pq.a.&..... 00:20:33.864 00000180 5d 9d ae a4 e7 1f 2a 0f 40 89 1b 4c a9 72 75 ae ].....*.@..L.ru. 00:20:33.864 00000190 9c 61 b9 86 36 02 0b 1c 71 f0 93 0e 6b f2 8c e6 .a..6...q...k... 00:20:33.864 000001a0 05 e1 98 43 4c b2 9f 49 f5 da bc 41 9e 98 5e 83 ...CL..I...A..^. 00:20:33.864 000001b0 ef 8f 1d 47 03 a8 04 9d 74 14 8a 6f 99 32 df a5 ...G....t..o.2.. 00:20:33.864 000001c0 22 82 5a 5b 1f 09 8d 00 93 f5 d1 46 ff 56 39 75 ".Z[.......F.V9u 00:20:33.864 000001d0 09 16 3c 46 dc de ff 28 cd e7 4d 88 ff 0a 44 d8 .......[.9.f 00:20:33.864 00000010 1d 71 53 12 f9 5f 6c 33 71 d1 d7 2c 66 61 ad 97 .qS.._l3q..,fa.. 00:20:33.864 00000020 a0 ad 80 b1 b4 35 fe e7 64 2d ec d5 61 d7 8e 75 .....5..d-..a..u 00:20:33.864 00000030 02 ae 42 c2 3c f3 c4 a7 80 a8 51 80 7f 81 96 1c ..B.<.....Q..... 00:20:33.864 00000040 b4 69 e9 86 82 40 a0 f3 73 01 78 16 1f c7 ac 09 .i...@..s.x..... 00:20:33.864 00000050 82 88 8b c2 ef 68 48 00 21 4c b4 82 ac 8f a1 30 .....hH.!L.....0 00:20:33.864 00000060 9e fe ce ac 29 66 d6 af 11 87 ee 80 05 6c 94 c0 ....)f.......l.. 00:20:33.864 00000070 e3 92 1b b3 15 d3 30 57 b9 19 17 06 f8 d1 4d f2 ......0W......M. 00:20:33.864 00000080 86 e1 f9 18 b4 22 09 d3 7a 19 94 cd da dc f7 3c ....."..z......< 00:20:33.865 00000090 b7 54 5e 32 bf 87 02 84 f8 32 33 d9 95 18 80 99 .T^2.....23..... 00:20:33.865 000000a0 f2 77 94 fb 1b 52 6e 32 d5 fe 1a 00 b7 aa 24 1a .w...Rn2......$. 00:20:33.865 000000b0 d4 9f 19 db 49 04 9a 0b f0 56 ee 71 be 56 43 4b ....I....V.q.VCK 00:20:33.865 000000c0 67 2f 5e a7 87 54 fe f2 3e 1b fe 34 83 cf 62 b8 g/^..T..>..4..b. 00:20:33.865 000000d0 b8 fa bb 22 31 42 24 04 35 44 8b bf 60 b8 d3 0f ..."1B$.5D..`... 00:20:33.865 000000e0 ec 6d 75 bf f8 5f bb 77 a8 eb a8 92 c1 89 c0 1c .mu.._.w........ 00:20:33.865 000000f0 3b 90 9e da c9 57 97 fa 48 df 7f 89 7b c2 90 60 ;....W..H...{..` 00:20:33.865 00000100 dc 0a ae 85 13 2b be e2 bc ff 81 3a 71 8b eb 69 .....+.....:q..i 00:20:33.865 00000110 ad 38 03 8a af a4 98 9d 28 df db 18 69 37 b1 a4 .8......(...i7.. 00:20:33.865 00000120 c6 a1 ba cc de db 80 f1 bf 47 b4 a0 66 98 20 db .........G..f. . 00:20:33.865 00000130 cf 9f 26 0d c8 9c ce 08 ef 7f 8c 7c cd c3 75 f1 ..&........|..u. 00:20:33.865 00000140 0c 7d 17 88 59 f8 02 4d 68 dd 03 e9 bb f4 af f5 .}..Y..Mh....... 00:20:33.865 00000150 0f 69 c3 d9 4e 10 c5 5e be 3d 21 5c 76 0f bf ef .i..N..^.=!\v... 
00:20:33.865 00000160 d8 da 14 94 0f cf 09 21 7b 12 f3 a0 d2 a4 76 3c .......!{.....v< 00:20:33.865 00000170 1a 3e d1 0c e7 f4 bf 95 95 9c b6 bd 7d e7 df 39 .>..........}..9 00:20:33.865 00000180 f4 a7 39 f4 56 fc 3a ea 5a 56 c7 91 81 7d 5e e7 ..9.V.:.ZV...}^. 00:20:33.865 00000190 ca 7c 6c 3d 52 7b 03 a3 80 ad 7a b1 57 2c 82 07 .|l=R{....z.W,.. 00:20:33.865 000001a0 80 72 47 7c 3e 19 f5 36 08 72 43 44 37 3e 91 e6 .rG|>..6.rCD7>.. 00:20:33.865 000001b0 41 2c ca 35 5b 00 52 4b 14 b4 06 7a a8 10 04 34 A,.5[.RK...z...4 00:20:33.865 000001c0 73 dd f8 43 1d 2c 62 0c 26 2d 3a 79 80 f5 dc 21 s..C.,b.&-:y...! 00:20:33.865 000001d0 3c c1 08 ef bf af d3 52 30 a6 10 7c 02 71 4b e8 <......R0..|.qK. 00:20:33.865 000001e0 db c4 34 c3 4a 84 8f ad 8b 1c 6d 9b 36 75 f3 d2 ..4.J.....m.6u.. 00:20:33.865 000001f0 88 47 2f 16 87 1c ce 1c fc af 2d 75 c9 09 b7 2a .G/.......-u...* 00:20:33.865 host pubkey: 00:20:33.865 00000000 62 cf 55 b9 57 60 6b 41 aa 1c d3 4e 99 05 05 8c b.U.W`kA...N.... 00:20:33.865 00000010 75 cb 86 b4 8a 3e f8 aa 44 84 96 94 39 37 90 d2 u....>..D...97.. 00:20:33.865 00000020 19 43 41 3b 70 db 32 9a 3f 85 f0 a0 aa 1d c0 34 .CA;p.2.?......4 00:20:33.865 00000030 96 9a 10 e3 c8 e9 66 67 5a 66 65 65 fd 3a 13 6a ......fgZfee.:.j 00:20:33.865 00000040 2b 85 20 bc 39 39 fb 65 af b9 67 03 4a 4b 40 99 +. .99.e..g.JK@. 00:20:33.865 00000050 2c 24 13 7e 62 5d e8 ed 96 c1 11 7c c0 10 0a 84 ,$.~b].....|.... 00:20:33.865 00000060 a3 14 28 84 6a 33 e3 b8 d8 1a 08 89 a4 bf ee fa ..(.j3.......... 00:20:33.865 00000070 52 bb 17 2a 62 00 70 10 a3 ca cf 0e 39 5c 1e 7e R..*b.p.....9\.~ 00:20:33.865 00000080 85 f7 da d9 b5 2a e5 55 d0 51 c2 f9 3c e6 4f 0c .....*.U.Q..<.O. 00:20:33.865 00000090 33 67 ba 1c a8 3e ab fc c6 51 71 27 5a b3 6e df 3g...>...Qq'Z.n. 00:20:33.865 000000a0 a8 00 83 78 2f 6f 85 27 24 e2 f8 1a 26 20 4c 6e ...x/o.'$...& Ln 00:20:33.865 000000b0 e3 02 6b fc 6a 4e 1d 1f a5 5e 15 c3 49 c2 d6 62 ..k.jN...^..I..b 00:20:33.865 000000c0 ec 9c 1a ee 16 bc e7 dd cd e7 1a 80 97 d3 eb c5 ................ 00:20:33.865 000000d0 c0 a5 ab d5 6b c7 dc 34 a6 19 e0 92 ef 03 b6 83 ....k..4........ 00:20:33.865 000000e0 20 76 45 1b fc f8 0d 0c 7a fc 24 db ac dd 2a 31 vE.....z.$...*1 00:20:33.865 000000f0 32 02 d9 75 51 1f 4c 3e 93 11 aa 84 24 18 3a dc 2..uQ.L>....$.:. 00:20:33.865 00000100 42 14 88 50 5b af 85 67 74 02 01 45 9b fd 07 3c B..P[..gt..E...< 00:20:33.865 00000110 4d 31 36 da 0b c7 39 83 b1 f8 c5 52 67 ce f2 e1 M16...9....Rg... 00:20:33.865 00000120 a5 b2 b4 9a b9 cb d6 73 45 09 e5 6a 0d 51 b8 5a .......sE..j.Q.Z 00:20:33.865 00000130 b1 f2 43 ed 0a ea 9d 5a b5 da 2f f9 28 75 d0 9f ..C....Z../.(u.. 00:20:33.865 00000140 3c a7 01 9e 73 83 f5 6a 89 13 10 c0 0f d5 ea 55 <...s..j.......U 00:20:33.865 00000150 be 0e ee 6b fe 25 fc 82 ff 66 1e d0 b7 09 0c fc ...k.%...f...... 00:20:33.865 00000160 60 83 d4 07 4f 2c cf bf 0c 1c 76 89 94 07 fc a1 `...O,....v..... 00:20:33.865 00000170 29 05 ec c6 3d 6a d9 e4 44 c2 6f 26 80 1d 3d 4c )...=j..D.o&..=L 00:20:33.865 00000180 78 2a df ff b0 91 ad 5a c2 cb dd 17 49 09 e0 e1 x*.....Z....I... 00:20:33.865 00000190 fa a2 28 e5 c6 d0 13 84 9d a8 ca 83 d3 ee 48 b1 ..(...........H. 00:20:33.865 000001a0 83 d2 07 f9 f2 83 f9 90 10 fa dd 2f 7f e9 d4 97 .........../.... 00:20:33.865 000001b0 15 a3 58 24 8f 1b 55 c0 07 66 49 4b 2e cd e5 72 ..X$..U..fIK...r 00:20:33.865 000001c0 4d 4b 03 e4 52 c4 30 b2 5b 2f 56 59 11 ad 31 cf MK..R.0.[/VY..1. 
00:20:33.865 000001d0 65 da 0d 71 ab 66 a8 a3 8f 29 45 47 17 6c 0d 49 e..q.f...)EG.l.I 00:20:33.865 000001e0 95 47 8e 2f 87 91 41 ad 6a ca 87 72 6d 98 97 2d .G./..A.j..rm..- 00:20:33.865 000001f0 c3 95 3e 83 73 2c 9d 76 ee 00 f9 52 1a 43 a1 00 ..>.s,.v...R.C.. 00:20:33.865 dh secret: 00:20:33.865 00000000 8f 6d 90 7d 9c a4 d3 cb 4e 7e e9 61 4c d5 03 dc .m.}....N~.aL... 00:20:33.865 00000010 bc af e1 08 42 47 75 f5 d5 88 5b 73 b9 1f 3d d4 ....BGu...[s..=. 00:20:33.865 00000020 58 0e 16 f7 04 f0 49 68 4a 57 f4 b7 49 61 06 64 X.....IhJW..Ia.d 00:20:33.865 00000030 a5 ef c8 64 14 11 6f 53 e3 04 d6 f0 c1 db 81 bc ...d..oS........ 00:20:33.865 00000040 aa 99 a2 f7 a2 1f 6a 6d 8f 47 44 ae cf d4 d1 9d ......jm.GD..... 00:20:33.865 00000050 7f 30 5c 21 a2 2f 59 f2 5a 8b c6 df dc 9c 46 a7 .0\!./Y.Z.....F. 00:20:33.865 00000060 bc 21 86 27 a0 6b fa eb d1 7a dc bf e4 55 07 cc .!.'.k...z...U.. 00:20:33.865 00000070 e1 c7 19 c6 be e2 ca 83 60 d1 b4 34 75 6e 22 f2 ........`..4un". 00:20:33.865 00000080 1b 0e a0 9c 65 53 2c c0 ce ba 2f a8 d3 df 1d e4 ....eS,.../..... 00:20:33.865 00000090 df 74 a0 48 62 bb 99 72 36 4d d1 02 b6 09 42 f9 .t.Hb..r6M....B. 00:20:33.865 000000a0 7e af 5e 10 b9 73 23 bd a1 6d 06 32 e8 3e d6 30 ~.^..s#..m.2.>.0 00:20:33.865 000000b0 f5 0a 0e 0a ff 90 6c 36 62 d7 c2 87 7c a2 ab ad ......l6b...|... 00:20:33.865 000000c0 05 35 09 57 20 bf 82 12 dd 27 63 14 9a e3 86 24 .5.W ....'c....$ 00:20:33.865 000000d0 3a 2b 09 5b 50 cb d1 a5 2c 94 8c 17 d6 cd f0 ce :+.[P...,....... 00:20:33.865 000000e0 a5 28 64 da e7 7b 22 18 a7 74 fa 29 f3 12 77 25 .(d..{"..t.)..w% 00:20:33.865 000000f0 21 8d 1e 1e 69 78 12 a1 6e 47 48 a8 43 40 93 e9 !...ix..nGH.C@.. 00:20:33.865 00000100 07 22 d9 29 43 65 39 10 24 88 0a e2 bb 30 b8 4a .".)Ce9.$....0.J 00:20:33.865 00000110 cd c1 3b e7 0c ff 5e 43 fd 80 76 70 87 49 12 f4 ..;...^C..vp.I.. 00:20:33.865 00000120 46 4a 88 06 67 db c6 27 c7 95 32 29 91 43 e9 ea FJ..g..'..2).C.. 00:20:33.865 00000130 8c 5f c7 c0 86 b5 ee 10 98 72 54 f9 33 fd 51 57 ._.......rT.3.QW 00:20:33.865 00000140 01 07 ba 50 86 ab da b2 63 6c 54 dd ce e9 05 c1 ...P....clT..... 00:20:33.865 00000150 84 00 ee e2 cb d9 d7 ce 4e 4f 9f 5a de 44 7d 7b ........NO.Z.D}{ 00:20:33.865 00000160 14 9b be 5f 1e a0 be f1 c9 53 7b 71 58 a8 b0 94 ..._.....S{qX... 00:20:33.865 00000170 69 7c ec 71 18 6d ee dc 1d f6 c7 c2 38 bb 22 5c i|.q.m......8."\ 00:20:33.865 00000180 36 81 85 b2 f5 83 78 63 88 a5 99 ec 9e 3a 9d b7 6.....xc.....:.. 00:20:33.865 00000190 a1 65 b5 b4 c2 2c f3 86 7b 65 15 16 fd b3 d4 54 .e...,..{e.....T 00:20:33.865 000001a0 ed f9 d8 cb 5a f2 20 58 3a 39 4a 44 79 b7 c5 76 ....Z. X:9JDy..v 00:20:33.865 000001b0 55 db 6c ff 09 e9 7a 50 3f b2 a4 81 74 7b 25 a8 U.l...zP?...t{%. 00:20:33.865 000001c0 83 0c 57 f3 4f b5 e7 a9 62 60 a0 2b e8 de 48 80 ..W.O...b`.+..H. 00:20:33.865 000001d0 ea 5a a9 7a 41 5b fc 5e 2a a6 1b 60 0a 3b 0f f2 .Z.zA[.^*..`.;.. 00:20:33.865 000001e0 29 8a 58 91 cd 6b db b3 b8 86 76 e3 db 4c 6e c6 ).X..k....v..Ln. 
00:20:33.865 000001f0 1d 5c 9c a2 34 13 ea 39 17 65 a5 0e 0e 8c 82 3e .\..4..9.e.....> 00:20:33.865 [2024-09-27 15:25:08.110416] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=1, dhgroup=3, seq=3428451721, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.865 [2024-09-27 15:25:08.126852] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.865 [2024-09-27 15:25:08.126888] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.865 [2024-09-27 15:25:08.126907] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.865 [2024-09-27 15:25:08.126927] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.865 [2024-09-27 15:25:08.126941] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.865 [2024-09-27 15:25:08.232781] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.865 [2024-09-27 15:25:08.232799] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.865 [2024-09-27 15:25:08.232806] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.865 [2024-09-27 15:25:08.232816] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.865 [2024-09-27 15:25:08.232871] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.865 ctrlr pubkey: 00:20:33.865 00000000 8f 19 9d 7b 1f 3e 08 ff fa df d3 5b 07 39 d9 66 ...{.>.....[.9.f 00:20:33.865 00000010 1d 71 53 12 f9 5f 6c 33 71 d1 d7 2c 66 61 ad 97 .qS.._l3q..,fa.. 00:20:33.865 00000020 a0 ad 80 b1 b4 35 fe e7 64 2d ec d5 61 d7 8e 75 .....5..d-..a..u 00:20:33.865 00000030 02 ae 42 c2 3c f3 c4 a7 80 a8 51 80 7f 81 96 1c ..B.<.....Q..... 00:20:33.865 00000040 b4 69 e9 86 82 40 a0 f3 73 01 78 16 1f c7 ac 09 .i...@..s.x..... 00:20:33.865 00000050 82 88 8b c2 ef 68 48 00 21 4c b4 82 ac 8f a1 30 .....hH.!L.....0 00:20:33.865 00000060 9e fe ce ac 29 66 d6 af 11 87 ee 80 05 6c 94 c0 ....)f.......l.. 00:20:33.865 00000070 e3 92 1b b3 15 d3 30 57 b9 19 17 06 f8 d1 4d f2 ......0W......M. 00:20:33.865 00000080 86 e1 f9 18 b4 22 09 d3 7a 19 94 cd da dc f7 3c ....."..z......< 00:20:33.865 00000090 b7 54 5e 32 bf 87 02 84 f8 32 33 d9 95 18 80 99 .T^2.....23..... 00:20:33.865 000000a0 f2 77 94 fb 1b 52 6e 32 d5 fe 1a 00 b7 aa 24 1a .w...Rn2......$. 00:20:33.865 000000b0 d4 9f 19 db 49 04 9a 0b f0 56 ee 71 be 56 43 4b ....I....V.q.VCK 00:20:33.865 000000c0 67 2f 5e a7 87 54 fe f2 3e 1b fe 34 83 cf 62 b8 g/^..T..>..4..b. 00:20:33.865 000000d0 b8 fa bb 22 31 42 24 04 35 44 8b bf 60 b8 d3 0f ..."1B$.5D..`... 00:20:33.865 000000e0 ec 6d 75 bf f8 5f bb 77 a8 eb a8 92 c1 89 c0 1c .mu.._.w........ 
00:20:33.865 000000f0 3b 90 9e da c9 57 97 fa 48 df 7f 89 7b c2 90 60 ;....W..H...{..` 00:20:33.865 00000100 dc 0a ae 85 13 2b be e2 bc ff 81 3a 71 8b eb 69 .....+.....:q..i 00:20:33.865 00000110 ad 38 03 8a af a4 98 9d 28 df db 18 69 37 b1 a4 .8......(...i7.. 00:20:33.865 00000120 c6 a1 ba cc de db 80 f1 bf 47 b4 a0 66 98 20 db .........G..f. . 00:20:33.865 00000130 cf 9f 26 0d c8 9c ce 08 ef 7f 8c 7c cd c3 75 f1 ..&........|..u. 00:20:33.866 00000140 0c 7d 17 88 59 f8 02 4d 68 dd 03 e9 bb f4 af f5 .}..Y..Mh....... 00:20:33.866 00000150 0f 69 c3 d9 4e 10 c5 5e be 3d 21 5c 76 0f bf ef .i..N..^.=!\v... 00:20:33.866 00000160 d8 da 14 94 0f cf 09 21 7b 12 f3 a0 d2 a4 76 3c .......!{.....v< 00:20:33.866 00000170 1a 3e d1 0c e7 f4 bf 95 95 9c b6 bd 7d e7 df 39 .>..........}..9 00:20:33.866 00000180 f4 a7 39 f4 56 fc 3a ea 5a 56 c7 91 81 7d 5e e7 ..9.V.:.ZV...}^. 00:20:33.866 00000190 ca 7c 6c 3d 52 7b 03 a3 80 ad 7a b1 57 2c 82 07 .|l=R{....z.W,.. 00:20:33.866 000001a0 80 72 47 7c 3e 19 f5 36 08 72 43 44 37 3e 91 e6 .rG|>..6.rCD7>.. 00:20:33.866 000001b0 41 2c ca 35 5b 00 52 4b 14 b4 06 7a a8 10 04 34 A,.5[.RK...z...4 00:20:33.866 000001c0 73 dd f8 43 1d 2c 62 0c 26 2d 3a 79 80 f5 dc 21 s..C.,b.&-:y...! 00:20:33.866 000001d0 3c c1 08 ef bf af d3 52 30 a6 10 7c 02 71 4b e8 <......R0..|.qK. 00:20:33.866 000001e0 db c4 34 c3 4a 84 8f ad 8b 1c 6d 9b 36 75 f3 d2 ..4.J.....m.6u.. 00:20:33.866 000001f0 88 47 2f 16 87 1c ce 1c fc af 2d 75 c9 09 b7 2a .G/.......-u...* 00:20:33.866 host pubkey: 00:20:33.866 00000000 74 87 5d ac 9d ae 68 5a 11 f8 ec 6c 2a 7f b0 bc t.]...hZ...l*... 00:20:33.866 00000010 50 e9 6f 80 8c b9 5a 49 0d 9c 1d d4 35 10 c0 06 P.o...ZI....5... 00:20:33.866 00000020 fa 77 86 30 c1 6b 60 9c 64 53 e2 f4 34 37 fe e4 .w.0.k`.dS..47.. 00:20:33.866 00000030 cb ae 21 79 cc 60 b7 79 bb 3e 9f f4 26 57 01 e0 ..!y.`.y.>..&W.. 00:20:33.866 00000040 a1 b6 59 64 e9 00 b4 e6 86 c8 5f 11 b9 dd 9d 6e ..Yd......_....n 00:20:33.866 00000050 64 a3 dc e3 a5 d0 f5 1c 80 89 cc 06 09 78 d7 fa d............x.. 00:20:33.866 00000060 ed fd 78 b0 b2 5c ac 5f b1 28 c2 78 3a b8 6f 60 ..x..\._.(.x:.o` 00:20:33.866 00000070 bf d1 e4 d0 cb 40 55 fe 3b df e4 ee 0d 38 60 c1 .....@U.;....8`. 00:20:33.866 00000080 56 db 7f 70 4e 29 1d c9 34 3d 3e 30 ad 61 ec 75 V..pN)..4=>0.a.u 00:20:33.866 00000090 72 52 3b 63 59 5d ed 6d 2b ab a3 40 7c ce 71 e2 rR;cY].m+..@|.q. 00:20:33.866 000000a0 94 1c a2 9c 87 1b 93 72 f3 1f b4 76 7f 17 c3 4c .......r...v...L 00:20:33.866 000000b0 8b 32 3a 16 73 b6 f1 72 09 81 1d e8 e0 83 a1 62 .2:.s..r.......b 00:20:33.866 000000c0 a7 81 b5 9d cb 74 00 5f f0 e3 11 f4 6e ad 61 13 .....t._....n.a. 00:20:33.866 000000d0 fb 35 43 a1 fd 41 a9 a7 1b 04 96 b0 68 c1 18 84 .5C..A......h... 00:20:33.866 000000e0 5c a8 39 57 07 62 a2 fb 6d 6f 4c eb 05 ab 71 ca \.9W.b..moL...q. 00:20:33.866 000000f0 3d 83 e1 b0 f6 b3 60 80 c0 96 c0 21 72 10 eb 15 =.....`....!r... 00:20:33.866 00000100 9c 14 a4 bc 94 33 6f f9 28 43 f8 bb a2 c1 0a b1 .....3o.(C...... 00:20:33.866 00000110 7b 28 27 32 30 af e1 33 4f cb 12 36 3c 7a 83 c3 {('20..3O..6... 00:20:33.866 00000130 01 25 63 0a 91 93 f1 09 02 8c 43 e3 a3 6b ec c9 .%c.......C..k.. 00:20:33.866 00000140 60 be 59 37 9e da 39 5b ba 9f 20 47 c2 b1 04 af `.Y7..9[.. G.... 00:20:33.866 00000150 5d 75 51 08 bb 90 9c a0 b4 90 03 6a f1 e5 66 8c ]uQ........j..f. 00:20:33.866 00000160 90 bd 40 cb 03 aa 7b 2d c0 ff 84 09 22 f3 5a 36 ..@...{-....".Z6 00:20:33.866 00000170 b3 78 25 b2 f3 51 5c 4b f3 58 11 05 5f 2f b7 ea .x%..Q\K.X.._/.. 
00:20:33.866 00000180 85 cd 1d 66 28 db 4b cb b7 b9 10 39 25 36 a7 99 ...f(.K....9%6.. 00:20:33.866 00000190 9f 45 30 36 6e 7e 14 14 1c fd 06 53 7c 03 83 df .E06n~.....S|... 00:20:33.866 000001a0 df 8b e7 5f 51 ca 66 84 0e 40 d3 43 42 0d 79 01 ..._Q.f..@.CB.y. 00:20:33.866 000001b0 5a 56 d5 77 26 2d 71 4f b7 0f c2 4f 9d 69 48 17 ZV.w&-qO...O.iH. 00:20:33.866 000001c0 f2 88 22 e2 bd 74 2c 36 4b 87 53 63 e7 a2 0f 4f .."..t,6K.Sc...O 00:20:33.866 000001d0 e5 de 68 07 84 8a 6b 3d d4 11 f4 c0 e4 4f b0 7c ..h...k=.....O.| 00:20:33.866 000001e0 89 c3 f8 c6 0a 74 a8 bd 9e 9a 0d 2a f5 fb e4 ea .....t.....*.... 00:20:33.866 000001f0 03 13 5d d9 2d 35 f1 7e 87 3e 0c f7 82 40 ac 42 ..].-5.~.>...@.B 00:20:33.866 dh secret: 00:20:33.866 00000000 e6 e1 12 39 8e 73 41 01 63 1a a8 da 98 83 ef 86 ...9.sA.c....... 00:20:33.866 00000010 28 6e b8 5c f2 b7 0d d1 bb f1 9d f4 54 62 ce f0 (n.\........Tb.. 00:20:33.866 00000020 01 59 d8 07 9b 34 1d 84 a4 87 f2 c1 91 b4 c8 37 .Y...4.........7 00:20:33.866 00000030 fd 5a c9 e2 c1 a5 e2 93 fd 07 3f e0 29 a2 a4 3c .Z........?.)..< 00:20:33.866 00000040 46 ee 98 b8 31 cd c1 b3 13 aa 81 46 60 dc b6 b5 F...1......F`... 00:20:33.866 00000050 3b bc 11 8b 7a a6 d4 b0 5e df c1 1a 1e 0b 30 ba ;...z...^.....0. 00:20:33.866 00000060 a3 a8 6d 24 a4 6a b9 ef 49 bf 4c 91 7f 33 0a e5 ..m$.j..I.L..3.. 00:20:33.866 00000070 74 8c 94 d8 bf b5 2d 3e dc 14 f7 53 43 7f 86 1c t.....->...SC... 00:20:33.866 00000080 87 5a be 4a a2 f9 b1 a9 97 f4 33 c6 0e 54 d9 16 .Z.J......3..T.. 00:20:33.866 00000090 0c f4 84 05 a0 bd 44 17 e4 a9 52 a9 a7 88 f2 1a ......D...R..... 00:20:33.866 000000a0 b6 ae 09 38 04 98 fe 10 ca b3 7e 9d cf 7f b1 12 ...8......~..... 00:20:33.866 000000b0 74 04 1d be e0 1f 64 8d 9b 25 08 15 b6 28 6a ba t.....d..%...(j. 00:20:33.866 000000c0 e8 c8 79 2f ea 82 29 e5 d6 80 40 db 4f 08 5f c1 ..y/..)...@.O._. 00:20:33.866 000000d0 be b3 73 9a 55 c6 0a b4 e4 4e f0 1e 67 15 3a 91 ..s.U....N..g.:. 00:20:33.866 000000e0 c0 11 77 48 d9 72 53 73 2b a6 ca df 55 7c 4e dd ..wH.rSs+...U|N. 00:20:33.866 000000f0 8d 83 f8 e9 bb 4b c9 39 66 38 99 36 ef 96 6e b9 .....K.9f8.6..n. 00:20:33.866 00000100 96 39 6b ac 95 17 bd e7 6c 15 aa ac 22 c4 9e 8d .9k.....l..."... 00:20:33.866 00000110 4c 7a 45 0c a0 a6 1a fd e9 e1 5c b9 ca aa 1f 48 LzE.......\....H 00:20:33.866 00000120 84 89 72 3e 56 eb 90 cd c4 e3 55 43 88 0b 23 64 ..r>V.....UC..#d 00:20:33.866 00000130 86 31 6c 00 f0 90 e8 45 18 a1 72 dc 36 9d e1 c1 .1l....E..r.6... 00:20:33.866 00000140 14 5e 66 fe dd 81 a0 35 14 c7 a2 cc f7 42 81 e6 .^f....5.....B.. 00:20:33.866 00000150 b9 da 91 65 71 30 25 75 b1 58 6e 56 48 2e 8f 00 ...eq0%u.XnVH... 00:20:33.866 00000160 c9 8f a3 d8 cf b1 60 70 1a c2 96 31 fd e8 74 13 ......`p...1..t. 00:20:33.866 00000170 e5 6c 6b 44 c6 89 a3 db 92 54 56 38 51 b9 8f 81 .lkD.....TV8Q... 00:20:33.866 00000180 52 8f 95 93 a6 52 cc b4 3a 06 45 ab 63 08 94 17 R....R..:.E.c... 00:20:33.866 00000190 62 30 51 58 a7 dc 2d c8 93 36 61 65 cd 95 04 bb b0QX..-..6ae.... 00:20:33.866 000001a0 9f 57 f8 3e 38 a4 2d 38 20 2e 11 1e 93 4e 66 13 .W.>8.-8 ....Nf. 00:20:33.866 000001b0 cf 26 0a d9 9a e0 ce 18 41 5c 20 d1 12 d8 82 6e .&......A\ ....n 00:20:33.866 000001c0 b1 24 5d 41 90 4d d6 e1 d5 d9 8e 5f 8d f2 83 b2 .$]A.M....._.... 00:20:33.866 000001d0 d6 1d 54 15 ec 4b 4b 14 98 fe 6c ee aa 9c dd a6 ..T..KK...l..... 00:20:33.866 000001e0 58 62 66 51 70 19 17 1c dd 70 3b a7 8c 94 da 80 XbfQp....p;..... 00:20:33.866 000001f0 bc 94 77 89 76 7b bf 13 fd f7 47 d6 18 14 4c e3 ..w.v{....G...L. 
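Each repetition of negotiate, await-challenge, await-reply, await-success1/2, done in this log is the same state machine driven with a different pre-shared key (key0, key1, key3, key4 in the nvme_auth_send_reply lines). A hedged sketch of how such a named key could be wired up on both target and host with SPDK's rpc.py follows; the flag names, the rdma transport, and the $NVMF_FIRST_TARGET_IP placeholder are assumptions to be checked against the SPDK build under test.

    # Sketch only: register the same named key on target and host before connecting.
    # Option names are assumptions; confirm with `rpc.py nvmf_subsystem_add_host --help`
    # and `rpc.py bdev_nvme_attach_controller --help`. Transport/address are placeholders.
    rpc.py nvmf_subsystem_add_host nqn.2024-02.io.spdk:cnode0 nqn.2024-02.io.spdk:host0 \
        --dhchap-key key0
    rpc.py bdev_nvme_attach_controller -b nvme0 -t rdma -a "$NVMF_FIRST_TARGET_IP" -s 4420 \
        -f ipv4 -n nqn.2024-02.io.spdk:cnode0 -q nqn.2024-02.io.spdk:host0 \
        --dhchap-key key0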
00:20:33.866 [2024-09-27 15:25:08.248965] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=1, dhgroup=3, seq=3428451722, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.866 [2024-09-27 15:25:08.249060] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.866 [2024-09-27 15:25:08.284418] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.866 [2024-09-27 15:25:08.284458] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.866 [2024-09-27 15:25:08.284468] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.866 [2024-09-27 15:25:08.284495] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.866 [2024-09-27 15:25:08.455902] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.866 [2024-09-27 15:25:08.455920] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.866 [2024-09-27 15:25:08.455927] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.866 [2024-09-27 15:25:08.455970] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.866 [2024-09-27 15:25:08.455993] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.866 ctrlr pubkey: 00:20:33.866 00000000 66 34 17 2c bb 89 1d ff 73 07 0a 81 01 bc f1 35 f4.,....s......5 00:20:33.866 00000010 05 1d 8b ef 78 aa 48 88 8b 54 b6 97 c4 bc ef 72 ....x.H..T.....r 00:20:33.866 00000020 45 d2 ee 00 6f 30 50 e7 37 f8 62 e6 35 8b 69 57 E...o0P.7.b.5.iW 00:20:33.866 00000030 e4 88 18 ac 1d dd a8 be bf a4 db 7b 46 73 cf 54 ...........{Fs.T 00:20:33.866 00000040 f7 e9 f8 9d 39 9c 19 e9 d3 dd 58 1e 0f 9e 57 88 ....9.....X...W. 00:20:33.866 00000050 94 97 71 02 19 5e ea ed 7a 4f da 2a 42 4c b5 ab ..q..^..zO.*BL.. 00:20:33.866 00000060 8a 22 cd c9 c6 e1 d1 c7 47 71 01 cb ad e6 f0 75 ."......Gq.....u 00:20:33.866 00000070 86 5e 07 46 38 dd 21 67 a2 87 70 9c 7a 1b c1 00 .^.F8.!g..p.z... 00:20:33.866 00000080 58 8f 30 ea f3 68 e3 fc b5 d0 12 87 72 6b 74 fa X.0..h......rkt. 00:20:33.866 00000090 1a 8b a9 bc a0 53 47 28 6e 1d 7b c1 ea e2 b2 b7 .....SG(n.{..... 00:20:33.866 000000a0 65 85 3c d2 4b 5b 62 df 14 3b ed 78 e3 a8 e5 47 e.<.K[b..;.x...G 00:20:33.866 000000b0 0f b8 1d 5f 16 be 03 b2 f5 7a d8 df f6 52 db de ..._.....z...R.. 00:20:33.866 000000c0 0e 6a 1b 99 30 84 a4 38 59 4e b4 87 79 cc 60 43 .j..0..8YN..y.`C 00:20:33.866 000000d0 ef ab 2c 46 e7 64 e5 b1 45 c5 fb 43 f0 fa e1 15 ..,F.d..E..C.... 00:20:33.866 000000e0 88 27 04 81 de c5 5b 36 6b b4 dd b5 d0 7a dd 99 .'....[6k....z.. 
00:20:33.866 000000f0 ec f2 c5 bb f9 2b 47 aa fc cd 49 74 e9 19 cb 76 .....+G...It...v 00:20:33.866 00000100 71 5f 2d 7c bb 81 66 22 a7 b8 c9 a7 e9 9f bf 55 q_-|..f".......U 00:20:33.866 00000110 e4 51 e3 22 5a c1 53 2f 28 a6 ca e4 d9 b8 c7 40 .Q."Z.S/(......@ 00:20:33.866 00000120 0e e6 26 d1 f4 ed 9a 2b 44 65 43 3a 60 b8 c9 7c ..&....+DeC:`..| 00:20:33.866 00000130 97 cd 5d d8 ca b6 54 39 58 8a 80 df 04 7f 13 3a ..]...T9X......: 00:20:33.866 00000140 c5 33 56 01 74 b3 c1 8b 4b ec 7c 4a 8a d4 40 65 .3V.t...K.|J..@e 00:20:33.866 00000150 a5 f7 93 8a 27 12 5a b6 3d 43 d8 f4 68 5d 9f 1f ....'.Z.=C..h].. 00:20:33.866 00000160 a4 80 c1 69 fd 3b 14 66 46 2e 4f 53 52 a8 ab c2 ...i.;.fF.OSR... 00:20:33.866 00000170 6d f0 b6 18 37 95 1b 3e 9a a7 cd f3 21 05 49 74 m...7..>....!.It 00:20:33.866 00000180 28 53 52 b3 43 8c 83 6e d2 a5 94 58 ff 23 91 29 (SR.C..n...X.#.) 00:20:33.866 00000190 cc 4d ad ea 97 0c 9f 20 95 d1 d8 4c 4d 4b 27 fc .M..... ...LMK'. 00:20:33.866 000001a0 61 90 05 23 9f e6 73 c0 3d 52 60 5c 67 c3 de 56 a..#..s.=R`\g..V 00:20:33.866 000001b0 96 e8 a4 6d c3 eb 8e 86 9c fd d3 3c a0 6f ab f5 ...m.......<.o.. 00:20:33.866 000001c0 4b 12 90 30 9f 9c 86 b6 a2 63 2b 1d 82 fe 60 c1 K..0.....c+...`. 00:20:33.866 000001d0 c9 ce 7b 1f be 8b 09 1e 09 08 ca e0 01 98 ae 9a ..{............. 00:20:33.867 000001e0 a3 e4 24 cb be be 41 83 1c 51 91 4d 2a 24 aa 66 ..$...A..Q.M*$.f 00:20:33.867 000001f0 90 c5 03 02 63 b7 76 53 87 ff 3a ce 3a 23 13 46 ....c.vS..:.:#.F 00:20:33.867 host pubkey: 00:20:33.867 00000000 70 59 02 59 92 f6 29 de 86 9d 1b b5 8c 25 42 48 pY.Y..)......%BH 00:20:33.867 00000010 98 e5 d7 44 df 1a 46 55 c7 79 b6 fa d7 01 5b 9b ...D..FU.y....[. 00:20:33.867 00000020 0b 2b e8 30 e7 0f 77 f4 3d f3 d3 bf bd e5 66 9a .+.0..w.=.....f. 00:20:33.867 00000030 e8 66 06 3d e7 d6 10 d9 07 7a c5 e3 c9 93 8a 11 .f.=.....z...... 00:20:33.867 00000040 d0 b2 80 cc 52 66 8a bf ab f8 0e 2b 6a c2 f7 dc ....Rf.....+j... 00:20:33.867 00000050 57 b6 84 36 6b 55 32 46 f9 aa 0d 24 02 cc 24 80 W..6kU2F...$..$. 00:20:33.867 00000060 32 3e 79 12 94 03 f3 19 ff 5d 35 10 a9 7a 9b 6a 2>y......]5..z.j 00:20:33.867 00000070 06 06 5a 2f f9 df 84 22 38 2e 21 19 d4 a4 6c 75 ..Z/..."8.!...lu 00:20:33.867 00000080 e0 7f c9 5a 1c ad ad 2c b8 d1 ee d5 d1 3d a4 e0 ...Z...,.....=.. 00:20:33.867 00000090 d3 61 a0 17 26 90 2f 9d 00 7d f0 fe 3f 57 2b 1e .a..&./..}..?W+. 00:20:33.867 000000a0 60 a3 52 11 b7 55 1d f2 71 67 5a 2a 9f b5 38 3c `.R..U..qgZ*..8< 00:20:33.867 000000b0 24 fa 2f 83 4d 33 31 1d 49 33 f9 b8 46 21 9b 0d $./.M31.I3..F!.. 00:20:33.867 000000c0 e8 40 03 4b a6 cd a9 ec b2 3c cf 99 8f 0b 3f ed .@.K.....<....?. 00:20:33.867 000000d0 5d 14 c8 da 17 57 43 e5 73 af e9 32 9e f6 9e d6 ]....WC.s..2.... 00:20:33.867 000000e0 6a db e2 d3 29 bb 14 29 d1 c6 6a 4c 38 cf 94 3a j...)..)..jL8..: 00:20:33.867 000000f0 07 df 0f e2 5d 25 d0 4e a6 44 ea eb 30 22 6a 8d ....]%.N.D..0"j. 00:20:33.867 00000100 f4 63 96 38 2a 79 5d 62 f3 a0 6b cd 8a be 05 86 .c.8*y]b..k..... 00:20:33.867 00000110 a2 e5 cb db dd e6 c0 52 99 23 b3 18 21 7e b7 42 .......R.#..!~.B 00:20:33.867 00000120 83 fc 67 ed 79 45 22 80 ae 25 c2 0a e0 6e 10 7a ..g.yE"..%...n.z 00:20:33.867 00000130 52 52 bb 9f c2 ec be 11 39 d1 ac af 63 86 3e 11 RR......9...c.>. 00:20:33.867 00000140 ea 61 7e 58 2e 99 c4 32 d2 5b 3e 17 d8 2a 05 e8 .a~X...2.[>..*.. 00:20:33.867 00000150 46 b1 3f 88 7c e6 a1 7b 45 5c 05 67 6e e1 f7 2f F.?.|..{E\.gn../ 00:20:33.867 00000160 8b 28 8e 21 95 52 94 b7 46 0f 9d 0c 31 62 a2 e6 .(.!.R..F...1b.. 
00:20:33.867 00000170 47 db bb 7d bb 44 18 36 a5 f6 49 b7 47 46 ab 5c G..}.D.6..I.GF.\ 00:20:33.867 00000180 08 f4 19 16 4b fc 87 b9 8f 58 d0 7a 10 1f 43 83 ....K....X.z..C. 00:20:33.867 00000190 b6 97 39 20 ee 87 97 27 c2 52 86 f4 c3 47 5b ba ..9 ...'.R...G[. 00:20:33.867 000001a0 ba b4 99 5b 39 a7 93 fc 8b 60 a9 ac ae 90 c0 54 ...[9....`.....T 00:20:33.867 000001b0 d3 3c 30 20 69 ac 1d bf 0a ce 65 c2 20 fe 31 80 .<0 i.....e. .1. 00:20:33.867 000001c0 2f 67 ac bb 54 8e 04 37 ed a1 d6 ce a8 d9 70 08 /g..T..7......p. 00:20:33.867 000001d0 9f d2 5e f2 e8 03 e4 aa 8b 26 51 06 bb 01 fc 42 ..^......&Q....B 00:20:33.867 000001e0 37 44 3f d6 75 d6 bd cc 99 5b fe 97 74 54 3e 0b 7D?.u....[..tT>. 00:20:33.867 000001f0 8f 77 f2 36 e3 24 ce d4 e3 10 77 7f 9f 23 77 f2 .w.6.$....w..#w. 00:20:33.867 dh secret: 00:20:33.867 00000000 26 6c 22 cb 50 2e 9f f6 46 03 20 e6 b8 a3 46 f7 &l".P...F. ...F. 00:20:33.867 00000010 22 1f 0f 39 4c 25 52 2d dc e4 db a0 af 76 f6 9e "..9L%R-.....v.. 00:20:33.867 00000020 b6 e0 59 75 44 5b 46 30 9d 04 1f ce 5f 99 31 22 ..YuD[F0...._.1" 00:20:33.867 00000030 8a 86 0b bc 4e d5 b0 a5 75 61 f9 7f 44 81 4f 56 ....N...ua..D.OV 00:20:33.867 00000040 da 11 cb 69 01 f1 a7 b8 bc 9c f6 7a 04 30 e6 ac ...i.......z.0.. 00:20:33.867 00000050 83 b2 e7 4c f9 db d6 a1 59 26 48 dd 78 f2 86 62 ...L....Y&H.x..b 00:20:33.867 00000060 32 27 3f 43 52 6b 14 cc 81 f4 22 59 cb 6d 60 32 2'?CRk...."Y.m`2 00:20:33.867 00000070 e9 de 32 9d 39 84 67 33 f1 eb 0b f7 9f db aa 8e ..2.9.g3........ 00:20:33.867 00000080 3b 01 a9 65 ce 34 41 bb c3 9b 87 4a ea 5a ea 3d ;..e.4A....J.Z.= 00:20:33.867 00000090 ca 13 c0 5b 53 49 fa 4e b7 6e 14 f3 b8 7f 59 1c ...[SI.N.n....Y. 00:20:33.867 000000a0 cd 4f 52 e1 f1 13 d9 b3 30 95 17 31 e4 55 00 b7 .OR.....0..1.U.. 00:20:33.867 000000b0 d1 f9 d1 bf cf 05 a5 f9 dd 70 24 8b 61 63 b8 68 .........p$.ac.h 00:20:33.867 000000c0 7b e1 6a 45 61 a4 2b 77 0e 52 14 3d 2a ac f4 b7 {.jEa.+w.R.=*... 00:20:33.867 000000d0 ec db e8 31 57 b2 91 28 39 09 96 52 31 eb 4f 29 ...1W..(9..R1.O) 00:20:33.867 000000e0 39 08 5e 6c 75 e5 a9 30 c7 a6 c8 33 99 2d b1 d8 9.^lu..0...3.-.. 00:20:33.867 000000f0 b9 f8 ce 8d 8f 70 76 3e da 2d a0 c3 25 c8 2f 24 .....pv>.-..%./$ 00:20:33.867 00000100 84 22 47 1b 12 c8 5d 38 40 19 ec fd 96 cc 79 98 ."G...]8@.....y. 00:20:33.867 00000110 1c 24 8c 22 97 fa 27 3b 6a 99 de a3 84 42 f7 d8 .$."..';j....B.. 00:20:33.867 00000120 fa fc 70 90 ae de b3 2c fb b6 ae 35 84 60 a7 a5 ..p....,...5.`.. 00:20:33.867 00000130 f7 07 93 59 43 f5 19 a1 48 82 b1 0f 3a 9f a6 d5 ...YC...H...:... 00:20:33.867 00000140 52 dd 66 b4 ed ce 8b 8d 3a de 77 ec 5f a9 98 8a R.f.....:.w._... 00:20:33.867 00000150 35 1b 50 56 bb f9 0f 67 f8 ce 89 c6 bb de 6a e7 5.PV...g......j. 00:20:33.867 00000160 68 69 e4 c9 2c 7a 58 32 7a 42 4f 7d 88 19 f1 52 hi..,zX2zBO}...R 00:20:33.867 00000170 da 68 41 21 e8 bc 36 b7 7f 6a 97 69 0b cd df 31 .hA!..6..j.i...1 00:20:33.867 00000180 76 b5 84 fa 71 58 6e 05 1d c8 bb 66 da af 85 d9 v...qXn....f.... 00:20:33.867 00000190 ed 75 14 59 d4 db 03 da 50 f5 4d 76 24 17 53 13 .u.Y....P.Mv$.S. 00:20:33.867 000001a0 1f 36 0b 55 5b 68 7b 87 83 7a 2c eb 3e db 87 6b .6.U[h{..z,.>..k 00:20:33.867 000001b0 31 11 10 a1 04 3d 4f 9b 15 7f 8c 60 86 dc d6 51 1....=O....`...Q 00:20:33.867 000001c0 6f ce 30 3e 43 47 bf 96 15 ae 8e df eb 3a 67 d6 o.0>CG.......:g. 00:20:33.867 000001d0 0d 4e 67 09 25 52 43 f7 8f d0 05 df c3 ac ba 19 .Ng.%RC......... 00:20:33.867 000001e0 ef ae b3 51 e1 2f a1 1b f4 04 88 30 bf 87 4d 86 ...Q./.....0..M. 
00:20:33.867 000001f0 2c 55 be 43 82 bc a2 3e 0f ff b9 94 f5 62 1f b3 ,U.C...>.....b.. 00:20:33.867 [2024-09-27 15:25:08.471838] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=1, dhgroup=3, seq=3428451723, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.867 [2024-09-27 15:25:08.488802] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.867 [2024-09-27 15:25:08.488828] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.867 [2024-09-27 15:25:08.488844] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.867 [2024-09-27 15:25:08.488849] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.867 [2024-09-27 15:25:08.594584] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.867 [2024-09-27 15:25:08.594601] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.867 [2024-09-27 15:25:08.594609] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.867 [2024-09-27 15:25:08.594619] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.867 [2024-09-27 15:25:08.594673] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.867 ctrlr pubkey: 00:20:33.867 00000000 66 34 17 2c bb 89 1d ff 73 07 0a 81 01 bc f1 35 f4.,....s......5 00:20:33.867 00000010 05 1d 8b ef 78 aa 48 88 8b 54 b6 97 c4 bc ef 72 ....x.H..T.....r 00:20:33.867 00000020 45 d2 ee 00 6f 30 50 e7 37 f8 62 e6 35 8b 69 57 E...o0P.7.b.5.iW 00:20:33.867 00000030 e4 88 18 ac 1d dd a8 be bf a4 db 7b 46 73 cf 54 ...........{Fs.T 00:20:33.867 00000040 f7 e9 f8 9d 39 9c 19 e9 d3 dd 58 1e 0f 9e 57 88 ....9.....X...W. 00:20:33.867 00000050 94 97 71 02 19 5e ea ed 7a 4f da 2a 42 4c b5 ab ..q..^..zO.*BL.. 00:20:33.867 00000060 8a 22 cd c9 c6 e1 d1 c7 47 71 01 cb ad e6 f0 75 ."......Gq.....u 00:20:33.867 00000070 86 5e 07 46 38 dd 21 67 a2 87 70 9c 7a 1b c1 00 .^.F8.!g..p.z... 00:20:33.867 00000080 58 8f 30 ea f3 68 e3 fc b5 d0 12 87 72 6b 74 fa X.0..h......rkt. 00:20:33.867 00000090 1a 8b a9 bc a0 53 47 28 6e 1d 7b c1 ea e2 b2 b7 .....SG(n.{..... 00:20:33.867 000000a0 65 85 3c d2 4b 5b 62 df 14 3b ed 78 e3 a8 e5 47 e.<.K[b..;.x...G 00:20:33.867 000000b0 0f b8 1d 5f 16 be 03 b2 f5 7a d8 df f6 52 db de ..._.....z...R.. 00:20:33.867 000000c0 0e 6a 1b 99 30 84 a4 38 59 4e b4 87 79 cc 60 43 .j..0..8YN..y.`C 00:20:33.867 000000d0 ef ab 2c 46 e7 64 e5 b1 45 c5 fb 43 f0 fa e1 15 ..,F.d..E..C.... 00:20:33.867 000000e0 88 27 04 81 de c5 5b 36 6b b4 dd b5 d0 7a dd 99 .'....[6k....z.. 
00:20:33.867 000000f0 ec f2 c5 bb f9 2b 47 aa fc cd 49 74 e9 19 cb 76 .....+G...It...v 00:20:33.867 00000100 71 5f 2d 7c bb 81 66 22 a7 b8 c9 a7 e9 9f bf 55 q_-|..f".......U 00:20:33.867 00000110 e4 51 e3 22 5a c1 53 2f 28 a6 ca e4 d9 b8 c7 40 .Q."Z.S/(......@ 00:20:33.867 00000120 0e e6 26 d1 f4 ed 9a 2b 44 65 43 3a 60 b8 c9 7c ..&....+DeC:`..| 00:20:33.867 00000130 97 cd 5d d8 ca b6 54 39 58 8a 80 df 04 7f 13 3a ..]...T9X......: 00:20:33.867 00000140 c5 33 56 01 74 b3 c1 8b 4b ec 7c 4a 8a d4 40 65 .3V.t...K.|J..@e 00:20:33.867 00000150 a5 f7 93 8a 27 12 5a b6 3d 43 d8 f4 68 5d 9f 1f ....'.Z.=C..h].. 00:20:33.867 00000160 a4 80 c1 69 fd 3b 14 66 46 2e 4f 53 52 a8 ab c2 ...i.;.fF.OSR... 00:20:33.867 00000170 6d f0 b6 18 37 95 1b 3e 9a a7 cd f3 21 05 49 74 m...7..>....!.It 00:20:33.867 00000180 28 53 52 b3 43 8c 83 6e d2 a5 94 58 ff 23 91 29 (SR.C..n...X.#.) 00:20:33.867 00000190 cc 4d ad ea 97 0c 9f 20 95 d1 d8 4c 4d 4b 27 fc .M..... ...LMK'. 00:20:33.867 000001a0 61 90 05 23 9f e6 73 c0 3d 52 60 5c 67 c3 de 56 a..#..s.=R`\g..V 00:20:33.867 000001b0 96 e8 a4 6d c3 eb 8e 86 9c fd d3 3c a0 6f ab f5 ...m.......<.o.. 00:20:33.867 000001c0 4b 12 90 30 9f 9c 86 b6 a2 63 2b 1d 82 fe 60 c1 K..0.....c+...`. 00:20:33.867 000001d0 c9 ce 7b 1f be 8b 09 1e 09 08 ca e0 01 98 ae 9a ..{............. 00:20:33.867 000001e0 a3 e4 24 cb be be 41 83 1c 51 91 4d 2a 24 aa 66 ..$...A..Q.M*$.f 00:20:33.867 000001f0 90 c5 03 02 63 b7 76 53 87 ff 3a ce 3a 23 13 46 ....c.vS..:.:#.F 00:20:33.867 host pubkey: 00:20:33.867 00000000 f7 2f dd 4c bd 38 bf 74 57 cd e3 1e a3 60 34 ce ./.L.8.tW....`4. 00:20:33.867 00000010 c0 e2 b5 cc 15 86 ba a6 b5 84 20 89 05 b1 dc b0 .......... ..... 00:20:33.867 00000020 80 34 b5 9e 9a 35 0c c6 09 56 c5 9f cd 34 a3 e1 .4...5...V...4.. 00:20:33.867 00000030 aa be 90 e3 f6 27 c5 93 0f fb 45 1c 5c f4 54 47 .....'....E.\.TG 00:20:33.867 00000040 57 ab ac b2 a9 af 59 a5 8e 88 27 ec f9 65 fc af W.....Y...'..e.. 00:20:33.867 00000050 45 08 72 2a 13 14 de 91 92 e8 6a 1f 0e 52 58 48 E.r*......j..RXH 00:20:33.867 00000060 2a 61 05 e7 02 48 f5 65 f8 63 9e 84 f8 5b 1d 92 *a...H.e.c...[.. 00:20:33.867 00000070 d7 00 8c e6 e4 e7 fb cb 7f 8e ef d3 db 14 ac 29 ...............) 00:20:33.867 00000080 08 e7 7d 9a 13 53 4a 28 d7 59 f3 c9 94 db a9 53 ..}..SJ(.Y.....S 00:20:33.867 00000090 36 db 46 b5 9e e9 f4 99 cf 8e ac eb 65 8b 61 37 6.F.........e.a7 00:20:33.867 000000a0 1a 4d 4c 90 73 08 36 fa 0a 96 30 df d3 cb e7 7e .ML.s.6...0....~ 00:20:33.867 000000b0 58 e5 19 b2 6d ae 0b 70 83 bd c1 fe 9c 23 1c 98 X...m..p.....#.. 00:20:33.867 000000c0 b5 1b 08 f2 3c 4c 25 e2 62 11 8a c3 f3 44 1c 1d .....m. 00:20:33.868 000001d0 c8 0d 49 d8 3f 8b 0e 1e c8 b5 42 03 76 d3 d5 17 ..I.?.....B.v... 00:20:33.868 000001e0 44 5c 1c 89 a2 b6 48 5a 16 6b bf 87 64 be d1 a0 D\....HZ.k..d... 00:20:33.868 000001f0 44 88 ff 93 a5 e8 d9 6f b2 05 97 3b ac e0 ec 42 D......o...;...B 00:20:33.868 dh secret: 00:20:33.868 00000000 82 87 cd 20 a9 13 82 3e 00 7c bd 93 8d 06 0f 33 ... ...>.|.....3 00:20:33.868 00000010 82 bb 9c fd c4 a1 d5 9f 01 f4 5d fe 8a 6c 53 d1 ..........]..lS. 00:20:33.868 00000020 a1 4d 32 35 bf ce b1 9c 28 08 f1 fd 3b 0b d4 8b .M25....(...;... 00:20:33.868 00000030 1b 87 72 1a 44 11 b3 db 15 a0 03 f6 57 ea c9 60 ..r.D.......W..` 00:20:33.868 00000040 31 0e ca ce 56 35 1f 3f ad 71 08 75 58 7c d2 c0 1...V5.?.q.uX|.. 00:20:33.868 00000050 4d 23 ef 9e 20 73 c9 dd 87 e3 1a cc a2 11 79 74 M#.. s........yt 00:20:33.868 00000060 f6 b3 2f 5f 47 7b 94 46 e1 2e fe 15 8b b8 54 0d ../_G{.F......T. 
00:20:33.868 00000070 ed 78 af d9 c4 d6 de 3d ab 86 d4 74 8a 7e e0 b9 .x.....=...t.~.. 00:20:33.868 00000080 6e 74 d6 e4 56 89 d0 46 01 2c 44 f8 f4 4e 50 59 nt..V..F.,D..NPY 00:20:33.868 00000090 7f 26 d4 4b c6 00 12 31 bf 1f 37 0a f9 18 b2 69 .&.K...1..7....i 00:20:33.868 000000a0 79 21 42 ad 6a 5f 24 da 95 b6 c2 6b c5 ed d4 a3 y!B.j_$....k.... 00:20:33.868 000000b0 40 d2 c8 74 a6 93 e3 ab e8 3a 12 03 d3 fd 30 7a @..t.....:....0z 00:20:33.868 000000c0 8a 89 cd 1f 9d fa 1a 7a c0 ea d2 bb f0 77 47 88 .......z.....wG. 00:20:33.868 000000d0 e3 5c a5 66 c8 94 a5 04 35 a9 ca 02 58 a7 ab d9 .\.f....5...X... 00:20:33.868 000000e0 9b 6e 50 b3 47 95 ac 91 75 a9 d8 1f fd 77 e5 bc .nP.G...u....w.. 00:20:33.868 000000f0 9a fb 5c 07 b1 fe dc 68 67 c8 a0 73 44 4c 61 82 ..\....hg..sDLa. 00:20:33.868 00000100 95 d9 14 31 1d d7 b6 29 cf 31 d6 79 44 01 bc 4c ...1...).1.yD..L 00:20:33.868 00000110 fa dd fc f1 08 1f 53 c2 5d 68 a3 28 a5 ad f5 95 ......S.]h.(.... 00:20:33.868 00000120 88 ae 7a 4d 6a a9 42 2a fb 1f 34 f7 49 61 96 6c ..zMj.B*..4.Ia.l 00:20:33.868 00000130 cb 24 7b ae 0f 62 96 34 95 78 8e b4 26 c2 7a c3 .${..b.4.x..&.z. 00:20:33.868 00000140 3d af 70 3f aa f3 b8 7b 6b d0 ca c5 af b3 aa f5 =.p?...{k....... 00:20:33.868 00000150 a1 47 54 03 77 38 50 fd f2 a8 9f 1c 86 28 d7 c6 .GT.w8P......(.. 00:20:33.868 00000160 f2 2f 36 60 e7 12 2c 02 14 6a e6 b7 0f dd e5 e8 ./6`..,..j...... 00:20:33.868 00000170 8a de 63 21 2e 0a 0c e8 a2 0e 41 5b 19 02 07 dd ..c!......A[.... 00:20:33.868 00000180 17 2f 7b 71 84 f6 4c 33 5c 43 2c 0b 87 f0 48 ca ./{q..L3\C,...H. 00:20:33.868 00000190 5f 4f a2 6c 62 da 2f a6 e6 02 e2 e9 b2 2d 39 61 _O.lb./......-9a 00:20:33.868 000001a0 cd 25 1b a7 01 69 f9 4d ea 96 94 95 c1 27 b4 f3 .%...i.M.....'.. 00:20:33.868 000001b0 9a d6 f5 6b 31 9d 83 37 8e e7 21 4c 72 60 ae 18 ...k1..7..!Lr`.. 00:20:33.868 000001c0 23 90 c7 5f 5b d5 aa 92 7b 2d 66 9b 04 dd 66 b4 #.._[...{-f...f. 00:20:33.868 000001d0 74 d7 84 c7 87 c9 ff a0 ce b6 4f 4b 44 7a fa 0c t.........OKDz.. 00:20:33.868 000001e0 4c b4 cb 72 d8 86 f2 23 17 3e 7c 7a 64 29 9d ba L..r...#.>|zd).. 00:20:33.868 000001f0 a6 c4 7b 33 c9 6c 98 28 ff eb 83 3e 66 2a 3b eb ..{3.l.(...>f*;. 
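[editor note, not part of the captured log] The debug records above trace complete DH-HMAC-CHAP exchanges: after negotiating sha256 and an ffdhe group, the controller and host each contribute a public value ("ctrlr pubkey" / "host pubkey") and both derive the same shared "dh secret" that is then folded into the challenge reply. The snippet below is only an illustrative sketch of that key-agreement step, not SPDK code: it uses a small freshly generated DH group from Python's cryptography package instead of the RFC 7919 ffdhe4096/ffdhe6144 parameters shown in the log, and the names ctrlr_key / host_key are placeholders.

    # Illustrative finite-field Diffie-Hellman agreement, analogous to the
    # "ctrlr pubkey" / "host pubkey" / "dh secret" records in the dump above.
    # Assumption: a generated 512-bit group stands in for ffdhe4096/ffdhe6144
    # purely to keep the sketch fast; the real groups are 4096/6144 bits.
    from cryptography.hazmat.primitives.asymmetric import dh

    params = dh.generate_parameters(generator=2, key_size=512)

    ctrlr_key = params.generate_private_key()  # controller's ephemeral key pair
    host_key = params.generate_private_key()   # host's ephemeral key pair

    # Each side combines its own private key with the peer's public value.
    secret_on_ctrlr = ctrlr_key.exchange(host_key.public_key())
    secret_on_host = host_key.exchange(ctrlr_key.public_key())

    # Both derivations yield the same shared secret (the "dh secret" hex dump).
    assert secret_on_ctrlr == secret_on_host
    print(secret_on_ctrlr.hex())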
00:20:33.868 [2024-09-27 15:25:08.610402] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=1, dhgroup=3, seq=3428451724, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.868 [2024-09-27 15:25:08.610467] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.868 [2024-09-27 15:25:08.646864] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.868 [2024-09-27 15:25:08.646890] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.868 [2024-09-27 15:25:08.646897] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.868 [2024-09-27 15:25:08.820911] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.868 [2024-09-27 15:25:08.820931] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.868 [2024-09-27 15:25:08.820938] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.868 [2024-09-27 15:25:08.820987] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.868 [2024-09-27 15:25:08.821011] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.868 ctrlr pubkey: 00:20:33.868 00000000 e0 e6 8e 5a d9 db 3e 66 b3 f9 42 06 ab da b2 ca ...Z..>f..B..... 00:20:33.868 00000010 26 80 66 4f 40 d2 b7 fc f1 ae fc fd 79 0e 78 0a &.fO@.......y.x. 00:20:33.868 00000020 f7 f6 a0 9d c3 89 5e 93 ee 92 2d 1f 53 80 e9 e2 ......^...-.S... 00:20:33.868 00000030 f8 07 55 dd ee 1f 75 ee f0 99 58 d8 1a 6c 61 de ..U...u...X..la. 00:20:33.868 00000040 48 03 91 b5 bd 89 83 e8 a2 a9 76 9a 18 c2 2d 48 H.........v...-H 00:20:33.868 00000050 e0 3f f7 5a 44 4d a5 98 ce 34 e0 a5 a1 12 ce 61 .?.ZDM...4.....a 00:20:33.868 00000060 7c 16 58 cc d2 71 0f 06 10 29 2b df 47 eb be ca |.X..q...)+.G... 00:20:33.868 00000070 fb a7 64 96 62 76 36 0b 98 dc 72 72 d1 68 a9 ab ..d.bv6...rr.h.. 00:20:33.868 00000080 99 92 77 57 43 e1 f8 44 b7 87 94 d4 45 d8 7b 40 ..wWC..D....E.{@ 00:20:33.868 00000090 a7 e2 0c be 3e f4 76 04 96 86 3f db b8 17 10 85 ....>.v...?..... 00:20:33.868 000000a0 bc 1d a1 23 54 e3 b7 11 67 d1 89 eb a3 a4 0b 6f ...#T...g......o 00:20:33.868 000000b0 b1 24 f7 86 0e 5e a3 d4 94 7a af 45 de c8 ce 21 .$...^...z.E...! 00:20:33.868 000000c0 7b 84 55 12 8e 77 b4 fd ee 4d 93 50 3a a0 c2 51 {.U..w...M.P:..Q 00:20:33.868 000000d0 d6 1b 58 7e 98 b0 df d4 60 d7 68 f9 07 b2 a1 b2 ..X~....`.h..... 00:20:33.868 000000e0 f0 fd 88 25 48 48 16 f6 0d 4b 2c d6 28 02 71 31 ...%HH...K,.(.q1 00:20:33.868 000000f0 15 ad c9 6f e1 da 2b 87 fb b3 71 75 f6 84 53 4b ...o..+...qu..SK 00:20:33.868 00000100 b4 bf 00 8d e0 90 c7 5c cc 5e d0 e2 94 fc b6 10 .......\.^...... 
00:20:33.868 00000110 2d 77 c6 a8 e1 a7 08 9c 4f 34 9e 6a e2 01 b8 4a -w......O4.j...J 00:20:33.868 00000120 71 15 a1 77 87 07 6f af f0 2d f3 8a f7 31 5a 22 q..w..o..-...1Z" 00:20:33.868 00000130 6e 1b d7 25 d2 0f f6 38 8a 68 5c 27 17 c5 9b a3 n..%...8.h\'.... 00:20:33.868 00000140 05 29 c8 b2 ce e4 39 d6 1f ed b8 7a 08 41 cd 90 .)....9....z.A.. 00:20:33.868 00000150 83 24 b5 71 93 b5 39 3d a7 39 91 35 aa 2c 03 73 .$.q..9=.9.5.,.s 00:20:33.868 00000160 9d 42 92 af 79 73 19 19 f7 dc 94 e6 2e 2f e7 c9 .B..ys......./.. 00:20:33.868 00000170 42 bb 82 82 d3 46 c8 a8 74 d3 b5 e6 b9 28 a1 8a B....F..t....(.. 00:20:33.868 00000180 5d 56 9d 8a 7f e3 37 48 dc 8b 2a 77 ed bd 25 25 ]V....7H..*w..%% 00:20:33.868 00000190 67 5a 45 df 77 ea c1 be 0c 77 5c cc c0 f4 5a 64 gZE.w....w\...Zd 00:20:33.868 000001a0 e3 d0 e5 a9 ca e8 b5 44 b7 37 60 ef 5c ab 2e 90 .......D.7`.\... 00:20:33.868 000001b0 4e 62 ef d0 2a 03 a7 2f ce b6 c0 e1 be 40 52 c7 Nb..*../.....@R. 00:20:33.868 000001c0 4e 73 9f 77 62 d1 82 cd 4b 3f 70 2f e7 87 e2 b5 Ns.wb...K?p/.... 00:20:33.868 000001d0 56 75 eb 5b f2 9b 49 67 59 ab 14 16 44 6d 03 46 Vu.[..IgY...Dm.F 00:20:33.868 000001e0 9f 67 65 90 ac 76 3e 2e af c5 99 cf b0 d1 c5 84 .ge..v>......... 00:20:33.868 000001f0 8f 3d 76 d6 4d 50 23 cd 66 5a b6 f6 9d 8a 4a eb .=v.MP#.fZ....J. 00:20:33.868 00000200 f0 0e e4 98 47 b8 09 2b f8 61 76 65 e2 58 33 c2 ....G..+.ave.X3. 00:20:33.869 00000210 09 f4 f0 0f 39 67 32 88 c9 34 bc 03 f9 3c 12 1a ....9g2..4...<.. 00:20:33.869 00000220 f3 b6 9e a8 55 ab fa 1e ea 24 81 04 74 8d 9e 69 ....U....$..t..i 00:20:33.869 00000230 cb c4 ad 14 b1 8e a1 1c b2 19 78 b4 85 ca 41 dd ..........x...A. 00:20:33.869 00000240 64 4c 26 87 e4 5a fb f3 ac 08 b0 75 dd 3f bf 02 dL&..Z.....u.?.. 00:20:33.869 00000250 36 09 87 e6 dd 98 08 ce 18 34 bc 7a a2 a1 0b 6a 6........4.z...j 00:20:33.869 00000260 73 48 b3 d5 12 78 47 92 26 03 d8 04 f2 69 24 75 sH...xG.&....i$u 00:20:33.869 00000270 b2 db ab 6d 66 3a 9a b0 76 56 82 7e 71 7f d5 64 ...mf:..vV.~q..d 00:20:33.869 00000280 22 55 08 20 07 13 7e d1 01 2a 6b 72 0e fe 4a 6c "U. ..~..*kr..Jl 00:20:33.869 00000290 76 1b b6 ad 25 b3 5a d2 8b a2 f2 68 5c 34 ad 5f v...%.Z....h\4._ 00:20:33.869 000002a0 28 34 2d 33 dc e8 ad 69 82 d4 77 1c 63 8c 87 33 (4-3...i..w.c..3 00:20:33.869 000002b0 08 d2 99 80 70 60 e8 88 1e 1a c2 b6 be 24 bb 29 ....p`.......$.) 00:20:33.869 000002c0 72 93 fa 9b 17 21 9c 18 c2 3b 15 48 fc 1b 93 19 r....!...;.H.... 00:20:33.869 000002d0 c2 da 37 52 a1 0a 6b 91 eb 7d 6a a2 36 33 23 8d ..7R..k..}j.63#. 00:20:33.869 000002e0 27 93 27 08 b0 ec a9 43 73 dc e5 ce e0 b2 5e 0e '.'....Cs.....^. 00:20:33.869 000002f0 9d 21 37 33 6f 61 51 65 03 f4 5a fa 20 02 3b 72 .!73oaQe..Z. .;r 00:20:33.869 host pubkey: 00:20:33.869 00000000 b1 81 76 07 84 82 2f ef f2 fa 1e 35 97 8a 10 ea ..v.../....5.... 00:20:33.869 00000010 c8 d3 af e4 89 5d f6 e3 b6 ce 43 79 bc 86 c6 fc .....]....Cy.... 00:20:33.869 00000020 da b3 47 69 99 96 8b b5 b7 e1 ac 85 c7 67 c3 6d ..Gi.........g.m 00:20:33.869 00000030 69 05 9b 14 ce 49 ed 53 bb 47 d0 89 f7 6e 0d 3d i....I.S.G...n.= 00:20:33.869 00000040 5f 1c 4e 48 0f a0 03 c0 c0 6c a6 44 21 79 c4 ee _.NH.....l.D!y.. 00:20:33.869 00000050 69 aa a0 66 1a 49 ec 1c d3 bb 26 e1 50 74 46 c4 i..f.I....&.PtF. 00:20:33.869 00000060 19 31 11 22 71 1a 39 6f ab e5 3a 6c 95 80 94 78 .1."q.9o..:l...x 00:20:33.869 00000070 5f b5 18 8e e8 54 61 da 7a ac 53 a8 39 c0 a6 0a _....Ta.z.S.9... 00:20:33.869 00000080 67 be 71 f5 65 b1 59 5c e7 06 66 25 f7 b4 e6 ed g.q.e.Y\..f%.... 
00:20:33.869 00000090 66 58 10 59 54 48 08 d3 19 be 62 9b 07 81 f9 0e fX.YTH....b..... 00:20:33.869 000000a0 e2 85 8f 43 8b ae 9d 6e 72 c2 48 50 f6 4e 7c 74 ...C...nr.HP.N|t 00:20:33.869 000000b0 6d f4 17 ec 58 ea bf 33 9f f7 c7 c1 3d 47 b8 50 m...X..3....=G.P 00:20:33.869 000000c0 15 ad ed ff 05 a3 d8 21 fe 11 26 ff c9 13 e6 33 .......!..&....3 00:20:33.869 000000d0 49 a4 40 0d cd 59 a4 4c 49 13 d1 38 6d 48 fc dc I.@..Y.LI..8mH.. 00:20:33.869 000000e0 2d f1 3f a4 73 a6 5e 26 c0 22 1b 65 03 02 1c 32 -.?.s.^&.".e...2 00:20:33.869 000000f0 ef 6e 98 9d de 0b c1 96 ee a3 bd 6d 7c ab 9e 83 .n.........m|... 00:20:33.869 00000100 23 83 75 18 51 23 59 f1 d7 65 f2 c3 d5 35 bc 17 #.u.Q#Y..e...5.. 00:20:33.869 00000110 c2 78 2c f9 46 99 84 27 47 f0 25 02 2b 60 1c 38 .x,.F..'G.%.+`.8 00:20:33.869 00000120 d0 57 d4 f2 04 cc b4 eb 3f ff 4b aa 61 e6 0b 70 .W......?.K.a..p 00:20:33.869 00000130 d0 ac 96 05 49 bd 9a bc 93 c6 c6 a1 2d 5e b9 43 ....I.......-^.C 00:20:33.869 00000140 61 93 29 65 77 01 8f db b0 e2 98 a0 9c cf 87 d3 a.)ew........... 00:20:33.869 00000150 7f a2 81 ea f3 d4 1f 66 3d 2c 8a c6 d9 e7 a6 16 .......f=,...... 00:20:33.869 00000160 6f 25 4e 25 59 44 20 b0 1b 7d a1 ad 86 2e bf 1b o%N%YD ..}...... 00:20:33.869 00000170 24 45 84 66 3e c6 eb 6f c3 2c 1f 27 1a c2 4a 3a $E.f>..o.,.'..J: 00:20:33.869 00000180 05 56 db 56 70 23 16 01 ef 55 49 39 af 10 db a4 .V.Vp#...UI9.... 00:20:33.869 00000190 ee 2b 2b 56 69 63 03 b8 f2 db 87 b9 d1 2e 1f 61 .++Vic.........a 00:20:33.869 000001a0 21 01 d5 62 21 18 bc 9d 57 fd 75 d7 69 c5 6c 9d !..b!...W.u.i.l. 00:20:33.869 000001b0 ad a8 92 3b 31 e1 58 b1 ea d6 5d fb c2 52 08 21 ...;1.X...]..R.! 00:20:33.869 000001c0 13 08 e3 a7 03 a3 59 f4 1c 96 ab 79 79 2d a8 c6 ......Y....yy-.. 00:20:33.869 000001d0 b0 fd 1f a2 b6 13 0d 56 51 bb 1d df 5d ac 7f 03 .......VQ...]... 00:20:33.869 000001e0 e7 96 5e 3a d1 4d be 5c 48 f8 41 f3 7c 60 86 b4 ..^:.M.\H.A.|`.. 00:20:33.869 000001f0 3f f3 84 d6 98 cf e5 4a dc ad 32 8c 55 68 6e 4b ?......J..2.UhnK 00:20:33.869 00000200 7c 9e b7 21 d5 16 05 a1 de f2 6d af 88 66 e3 b6 |..!......m..f.. 00:20:33.869 00000210 95 10 03 9d dc ac 49 29 67 6e c7 bc f1 bb 2f 1b ......I)gn..../. 00:20:33.869 00000220 cd f1 a0 63 6d 09 12 24 3c b1 f4 ec 34 ac 32 fb ...cm..$<...4.2. 00:20:33.869 00000230 7b 09 b1 ec 4a 05 61 ac 37 b2 53 4e 1c f1 8d 3e {...J.a.7.SN...> 00:20:33.869 00000240 4e 67 3f 07 2c f4 3e 49 2e 07 5b 4b 72 32 13 f5 Ng?.,.>I..[Kr2.. 00:20:33.869 00000250 59 0d ca 92 1f 31 cb 39 df ad 3f 87 c9 81 66 4e Y....1.9..?...fN 00:20:33.869 00000260 2a 83 e2 2d 06 a3 c9 52 92 72 76 56 b8 6d 74 4a *..-...R.rvV.mtJ 00:20:33.869 00000270 12 e3 86 74 d9 ab 87 c0 ed 57 b6 2f 55 fe cc a4 ...t.....W./U... 00:20:33.869 00000280 10 e2 39 13 85 48 05 72 80 11 b3 79 00 d8 51 a8 ..9..H.r...y..Q. 00:20:33.869 00000290 25 85 5c 04 65 6f 37 cd d4 a2 fe 09 77 ac e4 bc %.\.eo7.....w... 00:20:33.869 000002a0 cf 2c 62 ec 8a 0d 4e fe ca db d7 d1 bf 12 b7 c2 .,b...N......... 00:20:33.869 000002b0 a8 10 13 e8 17 42 ba bb 23 79 d7 4d 78 08 45 a7 .....B..#y.Mx.E. 00:20:33.869 000002c0 51 b5 9c b7 fe 41 fa 0e 8d 8a 7e 88 07 0c 1a f5 Q....A....~..... 00:20:33.869 000002d0 a7 47 a0 07 63 4c 18 d5 5c 39 11 c9 28 50 3f 67 .G..cL..\9..(P?g 00:20:33.869 000002e0 91 18 00 d6 de 6d ea 0d 80 e4 d8 9e 97 74 c6 1b .....m.......t.. 00:20:33.869 000002f0 70 0c f7 dd f4 47 54 ac cd 15 a1 40 66 14 8c 00 p....GT....@f... 00:20:33.869 dh secret: 00:20:33.869 00000000 e3 7d 86 de c2 f7 af 5b 77 b3 14 00 bb ea e0 b0 .}.....[w....... 
00:20:33.869 00000010 cf bb ba a3 45 fd 2a df fe f0 3d 25 88 be 3b 4e ....E.*...=%..;N 00:20:33.869 00000020 59 ca 3a e7 7e db ec 4c 6b b4 54 e2 63 19 8f 4d Y.:.~..Lk.T.c..M 00:20:33.869 00000030 48 05 fa 31 52 ba 2c 5d 4a c8 9e a8 65 43 55 48 H..1R.,]J...eCUH 00:20:33.869 00000040 b3 3a 9e a4 7b f3 88 f7 97 1c b2 53 d0 d5 b2 0f .:..{......S.... 00:20:33.869 00000050 cb 61 1f 61 ae 97 bc 24 96 12 5a b7 27 e0 79 38 .a.a...$..Z.'.y8 00:20:33.869 00000060 10 6b e2 63 2a 50 38 43 28 18 97 16 e7 21 d4 64 .k.c*P8C(....!.d 00:20:33.869 00000070 86 96 d1 15 2e b5 e4 1a 03 60 bd 9b 21 e3 96 ee .........`..!... 00:20:33.869 00000080 91 d5 ea c1 ce 60 7f 78 24 ae 5a 19 03 f5 ab be .....`.x$.Z..... 00:20:33.869 00000090 55 ac 65 91 20 48 eb c4 9f 59 30 6d bd 2d 69 b7 U.e. H...Y0m.-i. 00:20:33.869 000000a0 9e 65 50 16 ad 82 b5 a0 b2 1f 05 b4 76 98 f2 15 .eP.........v... 00:20:33.869 000000b0 47 d7 5f bf 9f 01 16 90 a3 fc f3 6c 94 00 15 94 G._........l.... 00:20:33.869 000000c0 48 28 c0 4e a7 25 a5 28 fb 42 02 7d a2 39 3b 9d H(.N.%.(.B.}.9;. 00:20:33.869 000000d0 97 0a 9d 50 25 48 dc 24 51 b2 70 23 d7 94 2b 9f ...P%H.$Q.p#..+. 00:20:33.869 000000e0 0e 19 a8 86 a5 dd ea db 80 ba ca ee 5e f2 bc 54 ............^..T 00:20:33.869 000000f0 47 bf 36 64 1c 91 91 fb bc c8 7a 4c 4d ee 31 13 G.6d......zLM.1. 00:20:33.869 00000100 88 b8 d3 b4 63 ba e6 a5 d7 16 c1 6b 0a 8e 41 09 ....c......k..A. 00:20:33.869 00000110 07 94 29 9d 35 89 4f 70 a9 70 91 d8 d6 81 f8 c4 ..).5.Op.p...... 00:20:33.869 00000120 5e 75 b4 38 b7 5d b2 3b 3c 2c db 0c 7a 13 f4 bb ^u.8.].;<,..z... 00:20:33.869 00000130 3b 3e 0e 5a 90 6d b2 03 7e b2 3c 98 9f df 48 51 ;>.Z.m..~.<...HQ 00:20:33.869 00000140 e3 1a 31 8c 68 4b 24 02 1e 97 81 2f 69 90 06 99 ..1.hK$..../i... 00:20:33.869 00000150 48 57 31 e9 c4 aa c5 fe 99 1f 81 d3 e1 00 b0 cb HW1............. 00:20:33.869 00000160 fc 51 84 35 0f b9 2a c5 2e 7d ce 89 21 d2 3f d0 .Q.5..*..}..!.?. 00:20:33.869 00000170 37 91 17 2d d3 25 e3 a8 a7 8b 23 b5 db 70 7d 02 7..-.%....#..p}. 00:20:33.869 00000180 d5 a7 e6 60 f0 a0 18 50 d2 b3 61 c9 61 16 9e 43 ...`...P..a.a..C 00:20:33.869 00000190 e6 21 99 2e a8 4b 00 b2 78 79 fa af 45 2e d0 19 .!...K..xy..E... 00:20:33.869 000001a0 fd 37 f0 40 db 8d 74 ba 7e 86 02 49 31 f5 3a 25 .7.@..t.~..I1.:% 00:20:33.869 000001b0 e8 22 8e d0 62 e7 7f 89 6d b2 fe 50 51 91 d6 2e ."..b...m..PQ... 00:20:33.869 000001c0 67 12 ad 0c 40 c7 8d f8 e5 2c 33 2d 48 04 d6 5c g...@....,3-H..\ 00:20:33.869 000001d0 73 97 5d 10 79 ab a8 69 49 72 9d 2b 38 9d 9d 1a s.].y..iIr.+8... 00:20:33.869 000001e0 c7 87 ed 58 2d 75 8b 42 47 c2 72 1e 86 ea 19 8c ...X-u.BG.r..... 00:20:33.869 000001f0 1d 64 34 3e 2a 46 c2 01 29 d2 26 bf 8f 7b 6e 5e .d4>*F..).&..{n^ 00:20:33.869 00000200 86 9e f3 03 74 ad ea cf e8 88 e1 49 d2 07 3f fc ....t......I..?. 00:20:33.869 00000210 ff 5a 09 08 18 28 dd 20 9f d5 3b ad 13 ec ea ad .Z...(. ..;..... 00:20:33.869 00000220 1c 90 c2 07 48 2a 17 d3 38 5f a7 c3 27 e3 bd 8a ....H*..8_..'... 00:20:33.869 00000230 a2 4d 4c 29 30 05 ff a1 b6 bc a1 23 6f 50 3b 7b .ML)0......#oP;{ 00:20:33.869 00000240 54 de 43 50 5f 66 a3 ff 93 ef 0a 40 3e 8d 1d b5 T.CP_f.....@>... 00:20:33.869 00000250 9c 02 b2 32 7c 81 1e 7f f7 bd 3d 9a 5b 4f d7 b7 ...2|.....=.[O.. 00:20:33.869 00000260 45 d9 70 41 0c 60 fd f8 1a d8 0f 2a ea cc f6 eb E.pA.`.....*.... 00:20:33.869 00000270 90 13 79 4d 97 4b cc 05 62 d8 36 24 0a 31 7e b4 ..yM.K..b.6$.1~. 00:20:33.869 00000280 27 f8 6a 81 01 ae 6d d3 d7 32 b4 38 77 61 d8 d0 '.j...m..2.8wa.. 
00:20:33.869 00000290 ee c6 72 f3 3c 30 8a 70 d0 9f bd f2 6d 1f 3c 57 ..r.<0.p....m.f..B..... 00:20:33.870 00000010 26 80 66 4f 40 d2 b7 fc f1 ae fc fd 79 0e 78 0a &.fO@.......y.x. 00:20:33.870 00000020 f7 f6 a0 9d c3 89 5e 93 ee 92 2d 1f 53 80 e9 e2 ......^...-.S... 00:20:33.870 00000030 f8 07 55 dd ee 1f 75 ee f0 99 58 d8 1a 6c 61 de ..U...u...X..la. 00:20:33.870 00000040 48 03 91 b5 bd 89 83 e8 a2 a9 76 9a 18 c2 2d 48 H.........v...-H 00:20:33.870 00000050 e0 3f f7 5a 44 4d a5 98 ce 34 e0 a5 a1 12 ce 61 .?.ZDM...4.....a 00:20:33.870 00000060 7c 16 58 cc d2 71 0f 06 10 29 2b df 47 eb be ca |.X..q...)+.G... 00:20:33.870 00000070 fb a7 64 96 62 76 36 0b 98 dc 72 72 d1 68 a9 ab ..d.bv6...rr.h.. 00:20:33.870 00000080 99 92 77 57 43 e1 f8 44 b7 87 94 d4 45 d8 7b 40 ..wWC..D....E.{@ 00:20:33.870 00000090 a7 e2 0c be 3e f4 76 04 96 86 3f db b8 17 10 85 ....>.v...?..... 00:20:33.870 000000a0 bc 1d a1 23 54 e3 b7 11 67 d1 89 eb a3 a4 0b 6f ...#T...g......o 00:20:33.870 000000b0 b1 24 f7 86 0e 5e a3 d4 94 7a af 45 de c8 ce 21 .$...^...z.E...! 00:20:33.870 000000c0 7b 84 55 12 8e 77 b4 fd ee 4d 93 50 3a a0 c2 51 {.U..w...M.P:..Q 00:20:33.870 000000d0 d6 1b 58 7e 98 b0 df d4 60 d7 68 f9 07 b2 a1 b2 ..X~....`.h..... 00:20:33.870 000000e0 f0 fd 88 25 48 48 16 f6 0d 4b 2c d6 28 02 71 31 ...%HH...K,.(.q1 00:20:33.870 000000f0 15 ad c9 6f e1 da 2b 87 fb b3 71 75 f6 84 53 4b ...o..+...qu..SK 00:20:33.870 00000100 b4 bf 00 8d e0 90 c7 5c cc 5e d0 e2 94 fc b6 10 .......\.^...... 00:20:33.870 00000110 2d 77 c6 a8 e1 a7 08 9c 4f 34 9e 6a e2 01 b8 4a -w......O4.j...J 00:20:33.870 00000120 71 15 a1 77 87 07 6f af f0 2d f3 8a f7 31 5a 22 q..w..o..-...1Z" 00:20:33.870 00000130 6e 1b d7 25 d2 0f f6 38 8a 68 5c 27 17 c5 9b a3 n..%...8.h\'.... 00:20:33.870 00000140 05 29 c8 b2 ce e4 39 d6 1f ed b8 7a 08 41 cd 90 .)....9....z.A.. 00:20:33.870 00000150 83 24 b5 71 93 b5 39 3d a7 39 91 35 aa 2c 03 73 .$.q..9=.9.5.,.s 00:20:33.870 00000160 9d 42 92 af 79 73 19 19 f7 dc 94 e6 2e 2f e7 c9 .B..ys......./.. 00:20:33.870 00000170 42 bb 82 82 d3 46 c8 a8 74 d3 b5 e6 b9 28 a1 8a B....F..t....(.. 00:20:33.870 00000180 5d 56 9d 8a 7f e3 37 48 dc 8b 2a 77 ed bd 25 25 ]V....7H..*w..%% 00:20:33.870 00000190 67 5a 45 df 77 ea c1 be 0c 77 5c cc c0 f4 5a 64 gZE.w....w\...Zd 00:20:33.870 000001a0 e3 d0 e5 a9 ca e8 b5 44 b7 37 60 ef 5c ab 2e 90 .......D.7`.\... 00:20:33.870 000001b0 4e 62 ef d0 2a 03 a7 2f ce b6 c0 e1 be 40 52 c7 Nb..*../.....@R. 00:20:33.870 000001c0 4e 73 9f 77 62 d1 82 cd 4b 3f 70 2f e7 87 e2 b5 Ns.wb...K?p/.... 00:20:33.870 000001d0 56 75 eb 5b f2 9b 49 67 59 ab 14 16 44 6d 03 46 Vu.[..IgY...Dm.F 00:20:33.870 000001e0 9f 67 65 90 ac 76 3e 2e af c5 99 cf b0 d1 c5 84 .ge..v>......... 00:20:33.870 000001f0 8f 3d 76 d6 4d 50 23 cd 66 5a b6 f6 9d 8a 4a eb .=v.MP#.fZ....J. 00:20:33.870 00000200 f0 0e e4 98 47 b8 09 2b f8 61 76 65 e2 58 33 c2 ....G..+.ave.X3. 00:20:33.870 00000210 09 f4 f0 0f 39 67 32 88 c9 34 bc 03 f9 3c 12 1a ....9g2..4...<.. 00:20:33.870 00000220 f3 b6 9e a8 55 ab fa 1e ea 24 81 04 74 8d 9e 69 ....U....$..t..i 00:20:33.870 00000230 cb c4 ad 14 b1 8e a1 1c b2 19 78 b4 85 ca 41 dd ..........x...A. 00:20:33.870 00000240 64 4c 26 87 e4 5a fb f3 ac 08 b0 75 dd 3f bf 02 dL&..Z.....u.?.. 
00:20:33.870 00000250 36 09 87 e6 dd 98 08 ce 18 34 bc 7a a2 a1 0b 6a 6........4.z...j 00:20:33.870 00000260 73 48 b3 d5 12 78 47 92 26 03 d8 04 f2 69 24 75 sH...xG.&....i$u 00:20:33.870 00000270 b2 db ab 6d 66 3a 9a b0 76 56 82 7e 71 7f d5 64 ...mf:..vV.~q..d 00:20:33.870 00000280 22 55 08 20 07 13 7e d1 01 2a 6b 72 0e fe 4a 6c "U. ..~..*kr..Jl 00:20:33.870 00000290 76 1b b6 ad 25 b3 5a d2 8b a2 f2 68 5c 34 ad 5f v...%.Z....h\4._ 00:20:33.870 000002a0 28 34 2d 33 dc e8 ad 69 82 d4 77 1c 63 8c 87 33 (4-3...i..w.c..3 00:20:33.870 000002b0 08 d2 99 80 70 60 e8 88 1e 1a c2 b6 be 24 bb 29 ....p`.......$.) 00:20:33.870 000002c0 72 93 fa 9b 17 21 9c 18 c2 3b 15 48 fc 1b 93 19 r....!...;.H.... 00:20:33.870 000002d0 c2 da 37 52 a1 0a 6b 91 eb 7d 6a a2 36 33 23 8d ..7R..k..}j.63#. 00:20:33.870 000002e0 27 93 27 08 b0 ec a9 43 73 dc e5 ce e0 b2 5e 0e '.'....Cs.....^. 00:20:33.870 000002f0 9d 21 37 33 6f 61 51 65 03 f4 5a fa 20 02 3b 72 .!73oaQe..Z. .;r 00:20:33.870 host pubkey: 00:20:33.870 00000000 66 df 76 1e 15 8c 02 46 38 87 a9 4a d9 e3 66 da f.v....F8..J..f. 00:20:33.870 00000010 27 90 5d 30 e7 12 ec 81 af 9c af 6f 30 36 4d 9c '.]0.......o06M. 00:20:33.870 00000020 5c 4e c9 99 a3 e8 02 74 80 0a 00 f6 af 02 1e d1 \N.....t........ 00:20:33.870 00000030 10 5b 86 0f 93 c0 d8 dc 11 56 42 67 eb 90 50 65 .[.......VBg..Pe 00:20:33.870 00000040 62 25 19 45 be 41 66 2c 5a 39 a9 c3 f2 1a 21 30 b%.E.Af,Z9....!0 00:20:33.870 00000050 73 da f3 95 3e 10 1c 4e d4 8a 44 a8 c5 bb 30 e0 s...>..N..D...0. 00:20:33.870 00000060 56 31 79 4e 48 d4 0a c8 67 a1 99 80 9d 51 f3 5e V1yNH...g....Q.^ 00:20:33.870 00000070 48 b3 41 4f 54 c4 e8 25 ad e3 b1 1a 02 9d 01 c7 H.AOT..%........ 00:20:33.870 00000080 6d bb e5 17 61 8a 35 ae 46 ff 39 d5 7b de 56 91 m...a.5.F.9.{.V. 00:20:33.870 00000090 20 eb fb 4e da 8d 79 c5 e0 7b 3a f2 23 b3 c2 73 ..N..y..{:.#..s 00:20:33.870 000000a0 ac c6 1f 09 70 4c 8b 7b 90 70 8d 9e dd 45 03 d1 ....pL.{.p...E.. 00:20:33.870 000000b0 8b 90 28 c0 f7 e6 fb b4 ff 7c 79 de 7c cd 74 4c ..(......|y.|.tL 00:20:33.870 000000c0 78 e6 da ed 06 9a 08 b1 2e 03 6a ea a2 0b 35 4e x.........j...5N 00:20:33.870 000000d0 54 a3 c6 3f 04 5e 7f a6 28 a3 69 5f b1 a5 61 b5 T..?.^..(.i_..a. 00:20:33.870 000000e0 1a 16 d6 a7 d9 44 9a d2 22 8e e2 bf 54 e8 72 e7 .....D.."...T.r. 00:20:33.870 000000f0 0c 9e 47 8c 00 6f 55 4c eb 19 9b b2 62 04 a4 71 ..G..oUL....b..q 00:20:33.870 00000100 02 a2 00 62 b6 1c ec 06 0f 65 b9 d3 ac b0 d1 24 ...b.....e.....$ 00:20:33.870 00000110 a0 0a d0 80 69 5e 3f 83 0f cd 2a fe 2b 9e e4 e2 ....i^?...*.+... 00:20:33.870 00000120 11 bb 66 78 a2 b8 9c fe 86 27 fe c1 d5 0e 09 2c ..fx.....'....., 00:20:33.870 00000130 97 f2 2f ce 71 37 a8 dd 21 ab 84 16 13 74 f1 61 ../.q7..!....t.a 00:20:33.870 00000140 61 3c dc d9 cc 76 2e 45 be ed 06 dd c6 2e 75 e4 a<...v.E......u. 00:20:33.870 00000150 20 f3 91 66 c6 81 77 28 4c 28 cc ab fe 6c 76 23 ..f..w(L(...lv# 00:20:33.870 00000160 b3 64 c4 b4 36 a7 7d 69 30 bc ff 9f fd 38 d1 99 .d..6.}i0....8.. 00:20:33.870 00000170 59 72 64 24 0a 8d 88 d7 85 6b 6a 80 a3 e0 47 3f Yrd$.....kj...G? 00:20:33.870 00000180 76 85 d9 ed bd 8e 30 b7 8f be b5 3c 68 5e 85 7f v.....0.....AW~b. 00:20:33.870 000001f0 9f ab f5 bd b2 85 22 e7 8e dd b8 5d 0a 7b 61 51 ......"....].{aQ 00:20:33.870 00000200 84 55 ae 03 a4 e7 45 69 be 15 07 df bf 02 69 94 .U....Ei......i. 00:20:33.870 00000210 0d 69 19 fe ef f0 f0 c2 fa 68 9d 5f 48 39 75 30 .i.......h._H9u0 00:20:33.870 00000220 98 53 58 04 e3 06 d6 e6 a9 13 13 80 17 76 06 ad .SX..........v.. 
00:20:33.870 00000230 de 50 61 2b a9 74 49 39 6d dd 6d b2 6a 99 f8 08 .Pa+.tI9m.m.j... 00:20:33.870 00000240 67 e4 d7 14 22 90 0e 6e 76 e0 21 d8 d4 0a 01 4b g..."..nv.!....K 00:20:33.870 00000250 41 76 b2 ce bf f1 3f 79 49 6f d3 fe 74 fc f1 ae Av....?yIo..t... 00:20:33.870 00000260 3f d8 f4 fa 53 5f 41 c1 09 1d 26 b3 29 e8 2c e0 ?...S_A...&.).,. 00:20:33.870 00000270 4e bd fa c3 fb 5e 34 e6 53 53 b7 96 6f 18 97 77 N....^4.SS..o..w 00:20:33.870 00000280 81 61 5b bc 0e 70 97 4a cc 59 22 2c b2 25 a7 20 .a[..p.J.Y",.%. 00:20:33.870 00000290 2b b0 7a aa 37 af 75 48 0b fc fb ed e9 00 a7 93 +.z.7.uH........ 00:20:33.870 000002a0 ad 63 95 8a 80 d1 61 ff 6b ae 69 0a 9a 9d f4 ae .c....a.k.i..... 00:20:33.870 000002b0 b3 b2 67 41 8f d9 f9 71 98 68 dd 92 33 20 59 01 ..gA...q.h..3 Y. 00:20:33.870 000002c0 5a f6 3b 25 8b 3a 39 c0 7c 5c 1c 3b 88 85 41 10 Z.;%.:9.|\.;..A. 00:20:33.870 000002d0 19 91 f1 b6 da 12 43 ec d4 36 54 b4 91 09 d6 98 ......C..6T..... 00:20:33.870 000002e0 e7 5a 7d 1b 75 52 2e 7a 55 c0 7c 06 3e 09 69 9c .Z}.uR.zU.|.>.i. 00:20:33.870 000002f0 1f 9b 32 a9 21 b0 d7 13 ea b7 63 63 27 2c c4 26 ..2.!.....cc',.& 00:20:33.870 dh secret: 00:20:33.870 00000000 5e dc 14 8f c5 81 0d 87 00 96 70 5d 43 b5 0a 0b ^.........p]C... 00:20:33.870 00000010 be 39 ca 09 0c 6d 77 89 0d 97 a5 e0 8d a3 f7 32 .9...mw........2 00:20:33.870 00000020 12 00 2d 31 b4 89 27 2c 05 0e 4e 43 bd 27 5f dd ..-1..',..NC.'_. 00:20:33.870 00000030 8c 9f ba f6 1f 39 4f 46 be f0 ce 5c 25 7c b3 57 .....9OF...\%|.W 00:20:33.870 00000040 45 d0 b7 53 85 36 a1 11 cd 32 14 01 0d 1c 22 e7 E..S.6...2....". 00:20:33.870 00000050 89 bd 21 28 13 1b 20 23 19 d8 8f 1b 94 a7 a7 70 ..!(.. #.......p 00:20:33.870 00000060 29 61 26 47 d5 c9 70 cc 87 37 df 21 c4 4c 5b 63 )a&G..p..7.!.L[c 00:20:33.870 00000070 39 6c 56 e4 d2 a2 2b a9 d1 a4 33 71 56 30 70 60 9lV...+...3qV0p` 00:20:33.870 00000080 b9 40 7a 81 08 e8 cc 5c ab 97 79 e1 a2 19 43 35 .@z....\..y...C5 00:20:33.870 00000090 23 b5 dd d5 70 a9 88 71 8d c8 2e 4a fd ca f2 85 #...p..q...J.... 00:20:33.870 000000a0 61 e5 83 f9 4a 7a b6 dc cd 1a 0d 14 a4 68 74 a3 a...Jz.......ht. 00:20:33.870 000000b0 86 eb bd 6b fc 2c 89 4b 6a 22 60 e2 b0 6b 87 31 ...k.,.Kj"`..k.1 00:20:33.870 000000c0 69 08 a5 57 98 69 05 58 97 e7 ea 20 87 91 5f 1a i..W.i.X... .._. 00:20:33.870 000000d0 9b 10 97 ac 5c f2 c9 89 d9 b6 66 74 ad 86 8e 4f ....\.....ft...O 00:20:33.870 000000e0 1d 4c de 7a 39 dd 2a 6f 83 5e a0 3b 5d 3f a8 a9 .L.z9.*o.^.;]?.. 00:20:33.870 000000f0 ea 21 84 12 91 ae 8e 02 b7 93 0b 50 68 d1 e3 cc .!.........Ph... 00:20:33.870 00000100 57 10 ae 76 28 46 4c ff aa 09 99 41 84 b6 9e aa W..v(FL....A.... 00:20:33.870 00000110 23 a6 f8 44 4c ce 08 a1 56 f1 d6 ed 54 9b 18 a4 #..DL...V...T... 00:20:33.870 00000120 ac 2f 2a df bb 34 45 3a ec a2 d4 7d 75 05 de 75 ./*..4E:...}u..u 00:20:33.870 00000130 99 db b4 cc 76 f2 d1 b5 0b f5 9e e8 68 cc fb c2 ....v.......h... 00:20:33.870 00000140 34 e1 cc b7 1e 0f 8d 8f a7 0d 0a 25 e1 80 b9 6a 4..........%...j 00:20:33.870 00000150 d2 30 62 a6 30 94 0a c5 f0 27 b5 5b 0b 69 31 d8 .0b.0....'.[.i1. 00:20:33.870 00000160 a9 c1 19 4a 10 1a d8 85 bb fd d4 c4 99 10 4a ba ...J..........J. 00:20:33.870 00000170 9b bd 13 26 a1 e5 60 53 c6 c2 8d f3 7d 92 6c 0e ...&..`S....}.l. 00:20:33.870 00000180 de 48 7f ed 8c bc 54 e5 b5 83 c3 bd e2 bf 75 b7 .H....T.......u. 00:20:33.870 00000190 ff bd 6c cf 61 a8 02 44 78 cc e4 9f 02 c2 92 9f ..l.a..Dx....... 00:20:33.870 000001a0 e1 cd e2 fc 9c d8 7e ae d9 9f d6 e4 90 04 51 09 ......~.......Q. 
00:20:33.871 000001b0 91 58 01 ff dd 27 3b 53 03 03 de bb aa 69 ef 5c .X...';S.....i.\ 00:20:33.871 000001c0 55 12 2f 95 fc c5 2b ac 67 fb 23 4c cd c7 b9 ea U./...+.g.#L.... 00:20:33.871 000001d0 87 0a 48 09 31 c8 f9 04 86 48 a8 60 b9 f8 5d c2 ..H.1....H.`..]. 00:20:33.871 000001e0 39 3e be 91 62 5e c4 52 a5 85 71 9a 55 99 c4 52 9>..b^.R..q.U..R 00:20:33.871 000001f0 b4 a9 c6 32 87 bc c4 2d f7 7b 68 b2 d8 99 2d 9a ...2...-.{h...-. 00:20:33.871 00000200 81 ea 68 6e 31 fb db ef 21 a5 3d d2 83 74 ba 85 ..hn1...!.=..t.. 00:20:33.871 00000210 6c 9f 56 25 47 79 2b f0 27 d5 e7 90 b3 a4 cb 50 l.V%Gy+.'......P 00:20:33.871 00000220 48 85 06 ad f1 b1 91 ed 8d 02 07 de 48 0a 9f 1c H...........H... 00:20:33.871 00000230 06 d6 6a a7 ea 53 5c 9a 34 9c 80 48 38 eb d1 ac ..j..S\.4..H8... 00:20:33.871 00000240 0c b2 15 6a 7a 32 fe 3c b2 46 41 dc 96 67 db 12 ...jz2.<.FA..g.. 00:20:33.871 00000250 61 31 2a b7 30 9a 99 63 97 ef d5 d3 5e 49 e0 cb a1*.0..c....^I.. 00:20:33.871 00000260 ca 01 c1 79 12 14 4f 66 09 dd ca fa 7c 92 6a 83 ...y..Of....|.j. 00:20:33.871 00000270 82 d6 ff 05 5a 42 07 4a fd 59 77 4d 79 d3 1a 77 ....ZB.J.YwMy..w 00:20:33.871 00000280 80 c5 75 a1 0b 4b bf e6 4e 61 53 96 76 8b 41 ac ..u..K..NaS.v.A. 00:20:33.871 00000290 3e 9a 69 69 31 d7 03 b0 7c 5e f6 04 8b df 40 0b >.ii1...|^....@. 00:20:33.871 000002a0 4f fd cd 67 52 b9 ed 99 14 b2 15 e3 27 1b f8 75 O..gR.......'..u 00:20:33.871 000002b0 93 5d 9e 88 04 94 33 be 23 fa 9b 2f d4 ba 2a f4 .]....3.#../..*. 00:20:33.871 000002c0 e3 72 e2 af 60 b0 14 5c d6 29 97 7e 1c 18 a1 a1 .r..`..\.).~.... 00:20:33.871 000002d0 02 10 57 9b 86 0b b3 e1 7d 3d 47 3d fe 6d 8e 9b ..W.....}=G=.m.. 00:20:33.871 000002e0 cc 2d 40 46 6d 15 3a a4 31 49 c5 67 7b 0f bf 52 .-@Fm.:.1I.g{..R 00:20:33.871 000002f0 b2 5a 5d 29 95 d3 44 6e 44 b6 8d fa 08 67 2b 5b .Z])..DnD....g+[ 00:20:33.871 [2024-09-27 15:25:09.062079] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=1, dhgroup=4, seq=3428451726, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.871 [2024-09-27 15:25:09.062186] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.871 [2024-09-27 15:25:09.120570] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.871 [2024-09-27 15:25:09.120615] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.871 [2024-09-27 15:25:09.120625] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.871 [2024-09-27 15:25:09.120652] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.871 [2024-09-27 15:25:09.293392] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.871 [2024-09-27 15:25:09.293411] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.871 [2024-09-27 15:25:09.293418] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.871 [2024-09-27 15:25:09.293463] nvme_auth.c: 
163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.871 [2024-09-27 15:25:09.293485] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.871 ctrlr pubkey: 00:20:33.871 00000000 66 69 2a 5c 24 65 ff 27 15 65 c0 9c 2a d8 dc a0 fi*\$e.'.e..*... 00:20:33.871 00000010 77 ce 75 89 b6 8b af 2b 92 26 bd 02 92 b3 ac f7 w.u....+.&...... 00:20:33.871 00000020 06 81 9d 5b f0 15 c0 09 c8 79 78 3c af d0 d5 fd ...[.....yx<.... 00:20:33.871 00000030 c9 aa e0 69 34 49 5b 2c a4 1e 6b 75 70 f1 44 9b ...i4I[,..kup.D. 00:20:33.871 00000040 08 f7 0b 01 5f cb 49 d6 5a 97 71 11 91 85 22 bb ...._.I.Z.q...". 00:20:33.871 00000050 fd d7 2d bb 11 bc 4c 39 e7 e4 61 dd 13 73 8d 71 ..-...L9..a..s.q 00:20:33.871 00000060 74 dc 13 7e 9b 7e 7e cb 09 33 61 c0 13 0d fb 45 t..~.~~..3a....E 00:20:33.871 00000070 9f 9d cd 84 e9 e4 da 91 af b4 e1 aa 91 58 50 68 .............XPh 00:20:33.871 00000080 cf e4 41 fe 8e 02 33 79 ff 95 cd d0 b7 03 1f 4d ..A...3y.......M 00:20:33.871 00000090 7d 89 5a e1 1a 59 11 96 05 4d 86 ec d3 b7 2a 77 }.Z..Y...M....*w 00:20:33.871 000000a0 22 ef 05 42 95 8e d4 ff a0 2d cb e0 66 ec 02 e9 "..B.....-..f... 00:20:33.871 000000b0 8d 70 1d 9c fa e8 de 6d 0e 89 71 04 03 e7 d2 0e .p.....m..q..... 00:20:33.871 000000c0 d7 1d a7 14 3d 12 ff 44 82 f1 dd 83 d9 c2 d6 59 ....=..D.......Y 00:20:33.871 000000d0 49 47 91 39 03 8c b8 ae b9 1a 57 a6 7e 46 4e 2d IG.9......W.~FN- 00:20:33.871 000000e0 ff de 0d d8 02 37 2a 19 d6 a3 5b 5a b2 e2 3d 40 .....7*...[Z..=@ 00:20:33.871 000000f0 ba 1d 75 58 cd 25 82 17 5c f1 a1 d0 b6 f3 d0 82 ..uX.%..\....... 00:20:33.871 00000100 8f 4c 1c 84 4b 7f 65 d4 0b c5 44 96 e6 a8 5c bd .L..K.e...D...\. 00:20:33.871 00000110 03 ff d0 32 b1 4d df 82 03 f2 c6 b6 38 7b 4b c7 ...2.M......8{K. 00:20:33.871 00000120 66 31 58 1f dd 99 ea 68 12 93 bd 0c 30 ab f8 6c f1X....h....0..l 00:20:33.871 00000130 a3 ee 85 e1 5b e5 50 ae af a0 c6 af 01 de f7 bc ....[.P......... 00:20:33.871 00000140 69 d8 08 8c 8a 20 2e 3f ba 82 57 9f 43 20 b8 7c i.... .?..W.C .| 00:20:33.871 00000150 38 12 7e f7 71 cf bc 55 f0 67 97 13 3d e1 91 7c 8.~.q..U.g..=..| 00:20:33.871 00000160 52 fa ad 25 20 af 1b 34 07 65 31 72 e1 37 9e ae R..% ..4.e1r.7.. 00:20:33.871 00000170 31 29 fa 3c a2 9f 24 fa c5 85 98 d5 62 32 ad 50 1).<..$.....b2.P 00:20:33.871 00000180 16 b0 3f 11 46 34 48 65 26 a0 1a 05 98 c6 6b 1f ..?.F4He&.....k. 00:20:33.871 00000190 15 50 81 0d c3 98 bd 5c 67 c3 27 d5 ef 00 6b 29 .P.....\g.'...k) 00:20:33.871 000001a0 fa a1 1a e9 59 28 31 92 29 c0 24 16 54 69 be b9 ....Y(1.).$.Ti.. 00:20:33.871 000001b0 08 fb d5 c0 81 38 80 b0 af d8 67 a3 76 43 72 da .....8....g.vCr. 00:20:33.871 000001c0 d0 3e b9 92 48 94 57 8a 09 71 9b 4e c2 51 ed 05 .>..H.W..q.N.Q.. 00:20:33.871 000001d0 22 91 d1 06 bd b0 2b 54 53 31 d2 d0 b6 2c b2 24 ".....+TS1...,.$ 00:20:33.871 000001e0 01 4d 8e 35 86 c3 aa be 2b 4f 12 50 54 08 51 10 .M.5....+O.PT.Q. 00:20:33.871 000001f0 44 ce 45 c2 e0 93 09 88 c7 63 6d 14 02 b5 be 95 D.E......cm..... 00:20:33.871 00000200 98 aa 4e 74 61 eb 1f 83 21 b1 01 71 b2 bf c6 f7 ..Nta...!..q.... 00:20:33.871 00000210 69 5b 95 09 5e 5a 7b 54 23 22 8d 58 02 4a 26 92 i[..^Z{T#".X.J&. 
00:20:33.871 00000220 0a e1 87 ea 12 40 77 a2 38 ec 40 ac ce 0f 03 45 .....@w.8.@....E 00:20:33.871 00000230 80 0d 26 22 8e 1e 2f a8 f3 64 e3 5f eb ee 43 30 ..&"../..d._..C0 00:20:33.871 00000240 80 e4 07 24 d1 01 0c f2 74 09 a9 8f e8 f3 37 4d ...$....t.....7M 00:20:33.871 00000250 e2 69 0d 7b 1c d8 9e 01 4f bb 43 6a e4 82 6c 72 .i.{....O.Cj..lr 00:20:33.871 00000260 8c 87 d6 08 0f 03 ca a3 1e fe 09 7a a4 5e e1 e1 ...........z.^.. 00:20:33.871 00000270 ee 54 59 2b 24 51 13 09 83 ff e5 16 be ab 1b 2a .TY+$Q.........* 00:20:33.871 00000280 9a fb 21 48 2e b5 6d 25 da ae 2b 79 81 49 6b e9 ..!H..m%..+y.Ik. 00:20:33.871 00000290 6d 2f 8a 57 ee 79 be 5e 31 83 e7 ec af 36 15 07 m/.W.y.^1....6.. 00:20:33.871 000002a0 5b bb 09 19 73 73 ab fc dd c7 47 64 a6 d6 e7 41 [...ss....Gd...A 00:20:33.871 000002b0 a2 0d 84 d5 9e a4 a0 3b 93 fb 00 2f 48 fc 8c 07 .......;.../H... 00:20:33.871 000002c0 8c 16 37 bc ce 2b 03 0c e3 df 7b 3e 08 b1 7f ff ..7..+....{>.... 00:20:33.871 000002d0 40 e6 12 1b ea e1 22 fe f1 92 e3 2e ec c4 d6 91 @....."......... 00:20:33.871 000002e0 4f 41 31 ee 32 cb 0c 29 fb cf 0a 57 c4 2f 8a 4b OA1.2..)...W./.K 00:20:33.871 000002f0 48 ef f8 51 d6 b2 d7 04 6e c1 3a 41 36 57 d9 b1 H..Q....n.:A6W.. 00:20:33.871 host pubkey: 00:20:33.871 00000000 45 0c 21 a8 e5 a5 4d cd 07 83 51 dd 7e 0e 6e 2c E.!...M...Q.~.n, 00:20:33.871 00000010 cb 0d b0 80 16 4f 48 48 25 f1 35 67 47 c9 8f 06 .....OHH%.5gG... 00:20:33.871 00000020 07 40 e3 b9 9b 41 b6 e8 72 93 98 54 00 98 ab 8f .@...A..r..T.... 00:20:33.871 00000030 e8 a4 51 dc 49 0a 53 0c 82 15 23 e0 03 2c 47 ea ..Q.I.S...#..,G. 00:20:33.871 00000040 44 93 3f 01 4c 86 7b 70 d3 db 5c 0e 27 c4 6f 45 D.?.L.{p..\.'.oE 00:20:33.871 00000050 91 eb 00 91 50 dc c2 0f 59 69 e8 b5 80 ee 95 f5 ....P...Yi...... 00:20:33.871 00000060 41 85 2c 2e bd 38 6f 07 bb f0 2e 69 ac 29 4f 6f A.,..8o....i.)Oo 00:20:33.871 00000070 17 ce 10 fa aa 9c 44 8d b4 cb 20 9b ba 7c 2d 95 ......D... ..|-. 00:20:33.871 00000080 84 6c 85 d8 30 1c f5 3d 93 d2 fa 58 7a 84 17 11 .l..0..=...Xz... 00:20:33.871 00000090 7f 70 47 5e d8 83 c8 8b 04 6b d3 bf 18 69 68 9d .pG^.....k...ih. 00:20:33.871 000000a0 41 c8 34 89 ce 09 f5 65 f5 25 90 02 da 40 b5 97 A.4....e.%...@.. 00:20:33.871 000000b0 4d f4 db fe 92 62 ad 06 77 37 c5 7e d9 ac 09 b5 M....b..w7.~.... 00:20:33.871 000000c0 f9 36 07 99 12 fa ac 9e 0a 3b 8e 27 b6 5d ee 7c .6.......;.'.].| 00:20:33.871 000000d0 4f 12 ac 13 a8 31 51 dd 17 5d fd fe ce 7c 94 3d O....1Q..]...|.= 00:20:33.871 000000e0 7e 36 1e b5 f9 37 a3 a4 3a 40 95 16 66 e7 65 99 ~6...7..:@..f.e. 00:20:33.871 000000f0 38 9d c6 78 9c 07 da d3 15 31 48 e6 bd ff 6f 78 8..x.....1H...ox 00:20:33.871 00000100 32 ce 77 f3 c5 89 4c 47 81 d8 fb d5 0a 11 bd 6b 2.w...LG.......k 00:20:33.871 00000110 9a be 96 f7 09 cb d7 af 05 22 47 6a 8f 68 f4 af ........."Gj.h.. 00:20:33.871 00000120 d7 96 c4 49 ac 63 20 2e eb cc d9 4d a4 f4 4f 4b ...I.c ....M..OK 00:20:33.871 00000130 ed 1d 92 ed f6 32 32 00 dd 43 48 31 72 50 91 db .....22..CH1rP.. 00:20:33.871 00000140 7f b4 44 6c ab fb 4e 56 56 48 3e 25 15 a4 35 25 ..Dl..NVVH>%..5% 00:20:33.871 00000150 4d 0b bf 50 ea 89 63 0c 52 e1 15 c6 0c 78 0f 1e M..P..c.R....x.. 00:20:33.871 00000160 50 f3 c8 5f 03 1e db 91 cc fa ca fe 94 1f 99 c6 P.._............ 00:20:33.871 00000170 2e 54 00 cf 0f de 03 6a d7 cc c8 ca 61 ab 95 c8 .T.....j....a... 00:20:33.871 00000180 c7 55 04 d9 f4 a2 bc 3e 23 e3 8e 00 95 05 24 b2 .U.....>#.....$. 00:20:33.871 00000190 3f 55 92 e0 42 89 20 2e f3 5d 22 6c 24 b6 2f 59 ?U..B. 
..]"l$./Y 00:20:33.871 000001a0 a3 15 7f a6 94 53 d8 d5 3e aa b9 5e 9b 87 d0 ae .....S..>..^.... 00:20:33.871 000001b0 66 12 fa c0 c6 ec a4 a7 64 ee d1 0b 58 0a d3 50 f.......d...X..P 00:20:33.871 000001c0 3d 29 07 3d 5e 16 13 33 e9 e3 0a 95 39 6e 6c 8b =).=^..3....9nl. 00:20:33.871 000001d0 e3 56 ad e6 25 97 c3 4d ae 3d ee 72 5b 65 46 6b .V..%..M.=.r[eFk 00:20:33.871 000001e0 95 49 3b cb 5d 14 0a e3 e4 1a aa dc ea e3 2b 42 .I;.].........+B 00:20:33.871 000001f0 3a 46 b8 39 64 9b 14 6d ad 5c be 3b 08 d3 51 01 :F.9d..m.\.;..Q. 00:20:33.871 00000200 bb 31 c0 30 8d b3 19 72 53 fe 15 70 c9 f4 21 8f .1.0...rS..p..!. 00:20:33.871 00000210 9c 47 c9 ae 56 dc d7 20 16 68 01 11 53 30 37 34 .G..V.. .h..S074 00:20:33.871 00000220 ae 26 0f a9 87 55 86 f5 cb 30 04 0b 9a a1 79 2d .&...U...0....y- 00:20:33.871 00000230 57 44 d3 2a 13 d5 b0 a2 e6 d6 e1 7d 94 f2 1a 59 WD.*.......}...Y 00:20:33.871 00000240 59 a6 52 a2 b2 73 59 74 bd 16 49 e0 3a be 05 74 Y.R..sYt..I.:..t 00:20:33.871 00000250 81 1e 05 79 ae 9a 93 8f 6f be 97 cd 09 83 76 72 ...y....o.....vr 00:20:33.871 00000260 f8 bc c9 34 5d dc c5 1d 3e fd e0 df a9 d7 a5 53 ...4]...>......S 00:20:33.871 00000270 15 04 27 ba 43 5f e3 26 b9 38 5c d4 82 52 59 41 ..'.C_.&.8\..RYA 00:20:33.872 00000280 2f c2 ae b9 51 48 7d b4 aa 78 ca 9d d2 06 aa 0a /...QH}..x...... 00:20:33.872 00000290 24 33 6f 75 01 95 ab f4 93 1f 53 6d 2e c5 3e 7d $3ou......Sm..>} 00:20:33.872 000002a0 82 ff cf 0c da 24 57 9d 53 c0 9a 35 ae 04 f1 c6 .....$W.S..5.... 00:20:33.872 000002b0 44 e7 2b 18 51 85 0d c6 80 01 64 c0 ad 44 70 1f D.+.Q.....d..Dp. 00:20:33.872 000002c0 ab e6 41 fd 18 78 3e 41 1d a8 4b 00 c6 3f 5c 98 ..A..x>A..K..?\. 00:20:33.872 000002d0 ae 87 60 bf 36 b4 4e 30 29 c0 a9 a8 f8 53 07 0c ..`.6.N0)....S.. 00:20:33.872 000002e0 d2 f5 18 c6 30 84 2c dc 12 c5 a0 57 02 66 0f 9e ....0.,....W.f.. 00:20:33.872 000002f0 57 19 f1 2b 3d bf 4e 91 80 94 b5 9e 5d 02 7e 26 W..+=.N.....].~& 00:20:33.872 dh secret: 00:20:33.872 00000000 0c 30 8f 74 66 c6 91 00 38 1c 0a c7 c9 b1 8a af .0.tf...8....... 00:20:33.872 00000010 20 73 f8 0f 40 c9 f5 3b 68 89 81 43 97 dd da 31 s..@..;h..C...1 00:20:33.872 00000020 10 cf ce 47 58 e1 04 97 90 5d f2 dd 40 6f e7 c2 ...GX....]..@o.. 00:20:33.872 00000030 5b 4c c1 c7 98 9b 13 94 16 95 94 69 a0 a7 2b ba [L.........i..+. 00:20:33.872 00000040 1e e2 6a b1 63 3a 3f d8 6c 4a 7b 90 0f 80 76 45 ..j.c:?.lJ{...vE 00:20:33.872 00000050 0f 75 d0 ee 56 e7 22 a9 58 29 4c 37 48 cc f1 cf .u..V.".X)L7H... 00:20:33.872 00000060 92 df 49 51 02 c3 d4 7d 37 87 0a 26 61 7e 5c 90 ..IQ...}7..&a~\. 00:20:33.872 00000070 3c 2a 6c e2 85 b8 83 47 b1 d7 35 a3 3f 9d 9b a3 <*l....G..5.?... 00:20:33.872 00000080 49 0b 82 65 68 d7 3c e6 97 22 ff e0 0a 76 8a db I..eh.<.."...v.. 00:20:33.872 00000090 e1 d0 4f 2c 7b 5b 22 22 b7 09 e0 3c 2c d7 4e b6 ..O,{[""...<,.N. 00:20:33.872 000000a0 aa b1 b3 54 40 11 24 c0 b6 90 6c 82 59 fe f6 3c ...T@.$...l.Y..< 00:20:33.872 000000b0 11 f4 88 20 4f ca cc 65 5a a8 6c 13 b0 a3 15 9e ... O..eZ.l..... 00:20:33.872 000000c0 b0 78 1f 89 9d 83 80 d5 24 00 02 42 38 d2 26 09 .x......$..B8.&. 00:20:33.872 000000d0 0a 51 09 57 87 40 7f 78 da aa 45 83 7c ea b2 78 .Q.W.@.x..E.|..x 00:20:33.872 000000e0 88 98 4c 4d 54 6a 2f 5b 0b db 78 d6 ca 5a e0 f0 ..LMTj/[..x..Z.. 00:20:33.872 000000f0 a3 46 98 1d a4 d7 a0 8e 83 45 21 16 8c 32 f5 fa .F.......E!..2.. 00:20:33.872 00000100 6f d4 f9 6e 1c c7 ec 10 96 09 49 21 f1 d9 0c 5a o..n......I!...Z 00:20:33.872 00000110 db 83 61 d3 43 39 3e 5e 0f 13 b2 07 e1 cc 8e c4 ..a.C9>^........ 
00:20:33.872 00000120 71 06 10 a6 07 9f af 0f a6 aa 41 d9 0b cf f7 b2 q.........A..... 00:20:33.872 00000130 b2 e1 f8 2f 44 d8 1d 51 47 ec 6f 65 ea bc 4b 73 .../D..QG.oe..Ks 00:20:33.872 00000140 34 69 82 58 46 24 c5 d2 96 3c 5b de ee a0 94 7f 4i.XF$...<[..... 00:20:33.872 00000150 c5 eb ea dc f2 2b eb 51 38 64 3a 35 28 08 b9 25 .....+.Q8d:5(..% 00:20:33.872 00000160 f3 b0 e8 64 65 33 77 ea b9 9f 7c 79 4c 9a ca 14 ...de3w...|yL... 00:20:33.872 00000170 18 40 37 7f dc 2c 69 10 17 f4 58 b3 44 5b 45 47 .@7..,i...X.D[EG 00:20:33.872 00000180 d0 ca 90 78 9c 18 33 a6 79 00 ea af a7 0b 8a 96 ...x..3.y....... 00:20:33.872 00000190 cb 2a 69 b2 57 66 ad 0b a5 a0 47 c6 34 1a 5f 16 .*i.Wf....G.4._. 00:20:33.872 000001a0 bd 30 65 a2 ed 44 70 70 ac fc ec 78 6c 26 f9 e2 .0e..Dpp...xl&.. 00:20:33.872 000001b0 23 47 96 c9 18 b7 ad 58 4b df 52 92 ae 05 21 f2 #G.....XK.R...!. 00:20:33.872 000001c0 97 b8 f5 8b 3e cf 5c 9d 30 76 bf ab 98 04 53 36 ....>.\.0v....S6 00:20:33.872 000001d0 9e 6a b3 76 2c 0f ca 2c b3 0b 29 2f 3e 15 bb 84 .j.v,..,..)/>... 00:20:33.872 000001e0 38 ce 4d b2 46 aa 7b 9c e6 6e e1 59 a4 1a 48 c6 8.M.F.{..n.Y..H. 00:20:33.872 000001f0 1f 53 08 af 27 be b4 b4 e2 56 17 70 f8 75 da 33 .S..'....V.p.u.3 00:20:33.872 00000200 b2 82 b2 17 38 c7 be 6f 60 0b 01 86 ad fa e4 81 ....8..o`....... 00:20:33.872 00000210 fd 28 ed f1 17 64 4c 09 4f 3d e8 1a bf 0b 99 4f .(...dL.O=.....O 00:20:33.872 00000220 35 cc e6 0b 02 b1 c4 63 5e 4a 6b 60 1a 2a 5f 5e 5......c^Jk`.*_^ 00:20:33.872 00000230 e9 f2 39 0f 56 54 42 fc 3c 79 4d 03 56 3e 93 bc ..9.VTB... 00:20:33.872 00000240 78 b0 f7 07 7b a0 34 45 43 2d c7 8d e0 fb d9 74 x...{.4EC-.....t 00:20:33.872 00000250 20 84 dd b9 96 72 0d 33 11 ef b8 49 9d 29 5b 26 ....r.3...I.)[& 00:20:33.872 00000260 95 3d 1f b0 9c 79 d1 4d a5 b0 02 63 57 76 2f 79 .=...y.M...cWv/y 00:20:33.872 00000270 e7 a0 69 73 e9 fa 73 df 6b ae f2 ae 09 b0 41 ea ..is..s.k.....A. 00:20:33.872 00000280 6b e0 90 c4 cb 79 f9 19 b1 ce 1d ef 55 0d 6b 79 k....y......U.ky 00:20:33.872 00000290 1f 71 74 2e 19 0a ca b3 28 f0 3b b5 f3 4f f2 fe .qt.....(.;..O.. 00:20:33.872 000002a0 40 de 10 40 a0 35 90 ae af ae 44 bd 80 33 4f dc @..@.5....D..3O. 00:20:33.872 000002b0 6e 13 1b 39 5a 18 14 a8 97 9e 50 6a ad 31 87 eb n..9Z.....Pj.1.. 00:20:33.872 000002c0 39 a0 32 56 c3 cd 9d 91 ab 5c 61 7b 67 a9 c0 d3 9.2V.....\a{g... 00:20:33.872 000002d0 d4 75 e8 37 68 fb 00 04 69 d4 1a 5d 9d 15 55 27 .u.7h...i..]..U' 00:20:33.872 000002e0 6a 9d 4a 5e 4f 07 4d f5 57 10 29 27 67 8f 8e 05 j.J^O.M.W.)'g... 
00:20:33.872 000002f0 9b ac 5d df ee 04 75 6c bf 67 ad e6 34 7f b3 40 ..]...ul.g..4..@ 00:20:33.872 [2024-09-27 15:25:09.341217] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=4, seq=3428451727, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.872 [2024-09-27 15:25:09.377352] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.872 [2024-09-27 15:25:09.377399] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.872 [2024-09-27 15:25:09.377416] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.872 [2024-09-27 15:25:09.377437] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.872 [2024-09-27 15:25:09.377452] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.872 [2024-09-27 15:25:09.483075] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.872 [2024-09-27 15:25:09.483093] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.872 [2024-09-27 15:25:09.483100] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.872 [2024-09-27 15:25:09.483110] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.872 [2024-09-27 15:25:09.483164] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.872 ctrlr pubkey: 00:20:33.872 00000000 66 69 2a 5c 24 65 ff 27 15 65 c0 9c 2a d8 dc a0 fi*\$e.'.e..*... 00:20:33.872 00000010 77 ce 75 89 b6 8b af 2b 92 26 bd 02 92 b3 ac f7 w.u....+.&...... 00:20:33.872 00000020 06 81 9d 5b f0 15 c0 09 c8 79 78 3c af d0 d5 fd ...[.....yx<.... 00:20:33.872 00000030 c9 aa e0 69 34 49 5b 2c a4 1e 6b 75 70 f1 44 9b ...i4I[,..kup.D. 00:20:33.872 00000040 08 f7 0b 01 5f cb 49 d6 5a 97 71 11 91 85 22 bb ...._.I.Z.q...". 00:20:33.872 00000050 fd d7 2d bb 11 bc 4c 39 e7 e4 61 dd 13 73 8d 71 ..-...L9..a..s.q 00:20:33.872 00000060 74 dc 13 7e 9b 7e 7e cb 09 33 61 c0 13 0d fb 45 t..~.~~..3a....E 00:20:33.872 00000070 9f 9d cd 84 e9 e4 da 91 af b4 e1 aa 91 58 50 68 .............XPh 00:20:33.872 00000080 cf e4 41 fe 8e 02 33 79 ff 95 cd d0 b7 03 1f 4d ..A...3y.......M 00:20:33.872 00000090 7d 89 5a e1 1a 59 11 96 05 4d 86 ec d3 b7 2a 77 }.Z..Y...M....*w 00:20:33.872 000000a0 22 ef 05 42 95 8e d4 ff a0 2d cb e0 66 ec 02 e9 "..B.....-..f... 00:20:33.872 000000b0 8d 70 1d 9c fa e8 de 6d 0e 89 71 04 03 e7 d2 0e .p.....m..q..... 00:20:33.872 000000c0 d7 1d a7 14 3d 12 ff 44 82 f1 dd 83 d9 c2 d6 59 ....=..D.......Y 00:20:33.872 000000d0 49 47 91 39 03 8c b8 ae b9 1a 57 a6 7e 46 4e 2d IG.9......W.~FN- 00:20:33.872 000000e0 ff de 0d d8 02 37 2a 19 d6 a3 5b 5a b2 e2 3d 40 .....7*...[Z..=@ 00:20:33.872 000000f0 ba 1d 75 58 cd 25 82 17 5c f1 a1 d0 b6 f3 d0 82 ..uX.%..\....... 
00:20:33.872 00000100 8f 4c 1c 84 4b 7f 65 d4 0b c5 44 96 e6 a8 5c bd .L..K.e...D...\. 00:20:33.872 00000110 03 ff d0 32 b1 4d df 82 03 f2 c6 b6 38 7b 4b c7 ...2.M......8{K. 00:20:33.872 00000120 66 31 58 1f dd 99 ea 68 12 93 bd 0c 30 ab f8 6c f1X....h....0..l 00:20:33.872 00000130 a3 ee 85 e1 5b e5 50 ae af a0 c6 af 01 de f7 bc ....[.P......... 00:20:33.872 00000140 69 d8 08 8c 8a 20 2e 3f ba 82 57 9f 43 20 b8 7c i.... .?..W.C .| 00:20:33.872 00000150 38 12 7e f7 71 cf bc 55 f0 67 97 13 3d e1 91 7c 8.~.q..U.g..=..| 00:20:33.872 00000160 52 fa ad 25 20 af 1b 34 07 65 31 72 e1 37 9e ae R..% ..4.e1r.7.. 00:20:33.872 00000170 31 29 fa 3c a2 9f 24 fa c5 85 98 d5 62 32 ad 50 1).<..$.....b2.P 00:20:33.872 00000180 16 b0 3f 11 46 34 48 65 26 a0 1a 05 98 c6 6b 1f ..?.F4He&.....k. 00:20:33.872 00000190 15 50 81 0d c3 98 bd 5c 67 c3 27 d5 ef 00 6b 29 .P.....\g.'...k) 00:20:33.872 000001a0 fa a1 1a e9 59 28 31 92 29 c0 24 16 54 69 be b9 ....Y(1.).$.Ti.. 00:20:33.872 000001b0 08 fb d5 c0 81 38 80 b0 af d8 67 a3 76 43 72 da .....8....g.vCr. 00:20:33.872 000001c0 d0 3e b9 92 48 94 57 8a 09 71 9b 4e c2 51 ed 05 .>..H.W..q.N.Q.. 00:20:33.872 000001d0 22 91 d1 06 bd b0 2b 54 53 31 d2 d0 b6 2c b2 24 ".....+TS1...,.$ 00:20:33.872 000001e0 01 4d 8e 35 86 c3 aa be 2b 4f 12 50 54 08 51 10 .M.5....+O.PT.Q. 00:20:33.872 000001f0 44 ce 45 c2 e0 93 09 88 c7 63 6d 14 02 b5 be 95 D.E......cm..... 00:20:33.872 00000200 98 aa 4e 74 61 eb 1f 83 21 b1 01 71 b2 bf c6 f7 ..Nta...!..q.... 00:20:33.872 00000210 69 5b 95 09 5e 5a 7b 54 23 22 8d 58 02 4a 26 92 i[..^Z{T#".X.J&. 00:20:33.872 00000220 0a e1 87 ea 12 40 77 a2 38 ec 40 ac ce 0f 03 45 .....@w.8.@....E 00:20:33.872 00000230 80 0d 26 22 8e 1e 2f a8 f3 64 e3 5f eb ee 43 30 ..&"../..d._..C0 00:20:33.872 00000240 80 e4 07 24 d1 01 0c f2 74 09 a9 8f e8 f3 37 4d ...$....t.....7M 00:20:33.872 00000250 e2 69 0d 7b 1c d8 9e 01 4f bb 43 6a e4 82 6c 72 .i.{....O.Cj..lr 00:20:33.872 00000260 8c 87 d6 08 0f 03 ca a3 1e fe 09 7a a4 5e e1 e1 ...........z.^.. 00:20:33.872 00000270 ee 54 59 2b 24 51 13 09 83 ff e5 16 be ab 1b 2a .TY+$Q.........* 00:20:33.872 00000280 9a fb 21 48 2e b5 6d 25 da ae 2b 79 81 49 6b e9 ..!H..m%..+y.Ik. 00:20:33.872 00000290 6d 2f 8a 57 ee 79 be 5e 31 83 e7 ec af 36 15 07 m/.W.y.^1....6.. 00:20:33.872 000002a0 5b bb 09 19 73 73 ab fc dd c7 47 64 a6 d6 e7 41 [...ss....Gd...A 00:20:33.872 000002b0 a2 0d 84 d5 9e a4 a0 3b 93 fb 00 2f 48 fc 8c 07 .......;.../H... 00:20:33.872 000002c0 8c 16 37 bc ce 2b 03 0c e3 df 7b 3e 08 b1 7f ff ..7..+....{>.... 00:20:33.872 000002d0 40 e6 12 1b ea e1 22 fe f1 92 e3 2e ec c4 d6 91 @....."......... 00:20:33.872 000002e0 4f 41 31 ee 32 cb 0c 29 fb cf 0a 57 c4 2f 8a 4b OA1.2..)...W./.K 00:20:33.872 000002f0 48 ef f8 51 d6 b2 d7 04 6e c1 3a 41 36 57 d9 b1 H..Q....n.:A6W.. 00:20:33.872 host pubkey: 00:20:33.872 00000000 43 07 bd ae e0 f0 1a a9 3a 2a 6f e9 44 fb 33 54 C.......:*o.D.3T 00:20:33.872 00000010 8c 2c 37 ea d9 19 d3 ec a8 b8 1c 56 ca d2 78 8e .,7........V..x. 00:20:33.872 00000020 85 a5 16 de a5 bd 37 b0 c8 17 9c 30 ec a9 12 b4 ......7....0.... 00:20:33.872 00000030 06 b7 2d 8d c7 21 98 ef b4 f8 97 f1 ae 4a 5d c1 ..-..!.......J]. 00:20:33.873 00000040 02 68 ca 55 ac e0 b1 05 0e 75 8a 8c 4b cc 6d 2c .h.U.....u..K.m, 00:20:33.873 00000050 03 13 5d d7 2a 18 34 4d c9 ee 0a 59 7c 11 73 61 ..].*.4M...Y|.sa 00:20:33.873 00000060 67 4c b8 76 27 ac f7 fd b2 e4 e6 22 03 f6 61 a3 gL.v'......"..a. 00:20:33.873 00000070 37 9c c4 19 ba 53 c2 0a e0 5f a4 93 ce 8f a1 f0 7....S..._...... 
00:20:33.873 00000080 9a fd bb 3a e4 eb 35 f8 38 7f b9 70 73 f2 f6 15 ...:..5.8..ps... 00:20:33.873 00000090 c6 b8 e4 89 ef d8 05 10 ec 3c a4 b3 5b 9d 96 69 .........<..[..i 00:20:33.873 000000a0 1a 05 5e d6 f9 7c 13 82 0a 6e 44 e8 96 5f bd 62 ..^..|...nD.._.b 00:20:33.873 000000b0 ca e8 7c 21 d7 38 e6 fa 58 4e fe 01 8b 1f 5c 13 ..|!.8..XN....\. 00:20:33.873 000000c0 cf 43 3d 43 5e 85 c6 a8 71 e3 73 94 e4 f0 c3 2c .C=C^...q.s...., 00:20:33.873 000000d0 71 f0 27 78 eb 23 12 f8 2a c0 13 0b c4 03 1b 91 q.'x.#..*....... 00:20:33.873 000000e0 e5 0a b5 9a ff 59 1a 82 32 42 c6 01 f7 cc 43 e1 .....Y..2B....C. 00:20:33.873 000000f0 48 f5 b6 db e5 22 c1 5f 0c 17 99 5e e4 c6 c4 ca H...."._...^.... 00:20:33.873 00000100 35 e0 d3 88 7f b7 99 a9 b4 14 e1 aa 69 7f 8c 7e 5...........i..~ 00:20:33.873 00000110 4f 62 09 e8 1e ac 1e c2 b2 1c 7c de ed 1f b5 6d Ob........|....m 00:20:33.873 00000120 74 fc 4c 5d c4 e3 d9 90 81 9a 8a f7 3c d0 51 61 t.L]........<.Qa 00:20:33.873 00000130 13 2b d4 67 5b c4 18 ba cc e4 79 f3 7f 20 54 38 .+.g[.....y.. T8 00:20:33.873 00000140 66 50 46 f8 e8 b8 8b d6 c0 1d 1f cf 38 86 cd 3f fPF.........8..? 00:20:33.873 00000150 42 b9 f6 bc 06 27 b6 f9 79 c4 e2 aa f5 c1 17 45 B....'..y......E 00:20:33.873 00000160 8d ed 80 ab b8 5e 43 8d ce c7 a3 5d fe da 06 00 .....^C....].... 00:20:33.873 00000170 26 c5 91 c0 02 b5 6a 52 03 6d b9 23 20 af ae d4 &.....jR.m.# ... 00:20:33.873 00000180 d6 22 00 c1 24 1e c5 4c 15 01 82 42 19 de 49 37 ."..$..L...B..I7 00:20:33.873 00000190 7f 21 fa d6 01 21 6a 55 33 3c 40 50 88 09 9f be .!...!jU3<@P.... 00:20:33.873 000001a0 7f e0 ad 3a 9a 16 17 43 d4 bc 18 4a 79 1d 5d 68 ...:...C...Jy.]h 00:20:33.873 000001b0 22 21 4f 02 59 4d 2d 39 c9 bd ec 99 97 47 43 a4 "!O.YM-9.....GC. 00:20:33.873 000001c0 45 60 46 13 7a 58 5e 67 02 42 17 a3 ed 77 31 b7 E`F.zX^g.B...w1. 00:20:33.873 000001d0 b4 e1 8d c4 f6 54 26 ba d1 b7 03 c6 e6 0f 66 c1 .....T&.......f. 00:20:33.873 000001e0 80 98 86 ce ac dc 84 e5 7b 5b e9 39 69 a3 ec 6d ........{[.9i..m 00:20:33.873 000001f0 b1 bd 76 0f ce 45 22 52 b5 29 5e cf d2 4c 3f 7a ..v..E"R.)^..L?z 00:20:33.873 00000200 6a b5 f0 c3 e3 1d ed 84 f7 d6 46 22 19 b2 db 0a j.........F".... 00:20:33.873 00000210 93 96 0f a3 38 46 ce fa 07 77 1c f8 9d fc 36 51 ....8F...w....6Q 00:20:33.873 00000220 e6 71 6d 7a 39 58 80 54 23 01 c3 54 6a 31 33 5d .qmz9X.T#..Tj13] 00:20:33.873 00000230 25 dc a4 5d 8c 2b ab 41 6c 95 38 6f 78 9b 8e a5 %..].+.Al.8ox... 00:20:33.873 00000240 9c 16 bb 4f 5b c0 78 06 ee cd 99 8d a1 da 9a 52 ...O[.x........R 00:20:33.873 00000250 29 f7 4c f2 3e 2c dc 7d a3 88 38 86 cd c1 f4 60 ).L.>,.}..8....` 00:20:33.873 00000260 32 10 f9 96 32 b8 72 70 f2 d7 7b eb 47 ce 60 b6 2...2.rp..{.G.`. 00:20:33.873 00000270 14 bf bf de 30 ca 95 78 a5 ee 0a fe d7 92 fd 8a ....0..x........ 00:20:33.873 00000280 ba 8c 25 85 6f b8 d1 ac 87 26 7d b6 b2 9d de 6d ..%.o....&}....m 00:20:33.873 00000290 ab 42 b9 40 05 0e e2 e0 f6 bf 1e 1e c2 80 98 73 .B.@...........s 00:20:33.873 000002a0 a1 58 61 a2 54 02 9a f0 15 58 c3 f2 c3 d1 63 13 .Xa.T....X....c. 00:20:33.873 000002b0 b6 64 e3 df 5e 54 df a0 d9 91 c4 84 a9 2b af 36 .d..^T.......+.6 00:20:33.873 000002c0 61 bc 9b e7 02 da 51 c5 05 14 18 57 e2 60 57 e6 a.....Q....W.`W. 00:20:33.873 000002d0 e7 64 ae 51 46 20 d1 45 ea 69 23 e8 8f 24 f4 48 .d.QF .E.i#..$.H 00:20:33.873 000002e0 26 03 9e 8e 89 e4 20 2e ea e3 b9 92 73 bb c8 78 &..... 
.....s..x 00:20:33.873 000002f0 08 b6 e9 39 c7 0b d7 55 c2 90 10 22 7d 9e cc 58 ...9...U..."}..X 00:20:33.873 dh secret: 00:20:33.873 00000000 26 68 d5 de 33 13 43 d1 2e 63 bb 51 44 44 e8 d9 &h..3.C..c.QDD.. 00:20:33.873 00000010 6a 48 6e 83 8e 03 b0 25 79 26 8d f1 54 d5 bb 62 jHn....%y&..T..b 00:20:33.873 00000020 c3 d2 fc 86 61 ca c9 ed a3 23 e9 70 8f 7c c9 1f ....a....#.p.|.. 00:20:33.873 00000030 06 e2 1a eb 4c 06 37 16 c1 3a 85 f0 0a eb 7d f9 ....L.7..:....}. 00:20:33.873 00000040 b2 9d e6 c9 c2 12 0a 37 86 5e 9c 6f d4 7e c6 89 .......7.^.o.~.. 00:20:33.873 00000050 45 d9 7f 10 93 87 43 3d 06 ae db f4 41 01 d6 ad E.....C=....A... 00:20:33.873 00000060 fd d7 61 7f 7a 83 ae 2d b9 14 05 97 9f 97 93 93 ..a.z..-........ 00:20:33.873 00000070 62 79 dd 67 02 23 39 a3 a8 e9 7e 53 a0 b8 24 09 by.g.#9...~S..$. 00:20:33.873 00000080 26 e9 55 95 17 e4 1a c5 78 a1 32 d5 af aa 51 e0 &.U.....x.2...Q. 00:20:33.873 00000090 80 e8 7b f1 6f 99 cf 93 40 53 3a bc bb 34 6d 66 ..{.o...@S:..4mf 00:20:33.873 000000a0 1f 7c df 44 0c ce 09 4c 13 36 51 a7 e8 5f 7f 84 .|.D...L.6Q.._.. 00:20:33.873 000000b0 dc b6 bf 29 c8 75 8c 10 a8 d8 d2 3f 52 32 a0 12 ...).u.....?R2.. 00:20:33.873 000000c0 ca 8f 79 e9 8e a8 c5 ed 70 21 9d 03 22 3e 9f 0a ..y.....p!..">.. 00:20:33.873 000000d0 73 06 3a 38 8f 39 3e 04 e8 ad b5 e7 c5 45 6c fd s.:8.9>......El. 00:20:33.873 000000e0 f7 71 e5 1a 4f 30 5f 14 27 5c e4 6b 4b 93 d0 2f .q..O0_.'\.kK../ 00:20:33.873 000000f0 c1 1a 44 1d ca 8d e1 98 af 15 4d 02 d3 e2 71 b4 ..D.......M...q. 00:20:33.873 00000100 41 2b e1 b3 37 2c eb 0c f4 fd 55 dc 09 4a 70 17 A+..7,....U..Jp. 00:20:33.873 00000110 9e 6f 9e 3b a0 d4 e5 f5 59 f3 94 93 d5 3e b8 bd .o.;....Y....>.. 00:20:33.873 00000120 37 20 0d 19 4a 84 9d a4 b6 21 c7 e9 71 8e c3 ae 7 ..J....!..q... 00:20:33.873 00000130 d8 2f e2 db 88 92 73 4d 53 d2 dd 18 96 87 14 94 ./....sMS....... 00:20:33.873 00000140 00 3f 9b b4 d7 ad 50 2e e7 63 40 8d 14 4a ca 78 .?....P..c@..J.x 00:20:33.873 00000150 bb 7d 8e 99 3b 21 1a 47 82 f8 08 1e 55 61 84 1e .}..;!.G....Ua.. 00:20:33.873 00000160 b6 9a 22 5d 4e 66 25 d7 89 13 92 9c eb 63 89 2d .."]Nf%......c.- 00:20:33.873 00000170 80 79 20 1d 27 75 3f 14 5d 60 bf f3 19 47 b3 6f .y .'u?.]`...G.o 00:20:33.873 00000180 41 bf a0 ba f3 e1 98 93 95 8c 7c d8 9d f3 fa e6 A.........|..... 00:20:33.873 00000190 1b e5 5e 34 f1 2e b1 d6 0d 86 4b e5 87 a9 2b 9f ..^4......K...+. 00:20:33.873 000001a0 e6 54 9a 0d c5 04 9c 3e 67 67 1c 93 6f aa 7d 8a .T.....>gg..o.}. 00:20:33.873 000001b0 80 02 66 0b fe 6a a2 4e c1 57 de ef 7a 6e eb cb ..f..j.N.W..zn.. 00:20:33.873 000001c0 35 18 79 bd 68 74 23 4c 3d a5 5d f8 75 9c 0b 21 5.y.ht#L=.].u..! 00:20:33.873 000001d0 af f2 c0 8c 4e ab f3 b9 b4 1e 21 23 36 2c 7e b1 ....N.....!#6,~. 00:20:33.873 000001e0 10 e9 2d 95 14 9a 33 17 fc 00 4b 7e de f4 98 b0 ..-...3...K~.... 00:20:33.873 000001f0 01 6e 9e b9 b1 ea 7f fd 2b a8 c7 f0 17 19 02 d1 .n......+....... 00:20:33.873 00000200 23 9c dc ab d5 ab 85 8f 81 bb 2f 42 6e 10 01 5c #........./Bn..\ 00:20:33.873 00000210 cd 61 90 c1 bd f8 1b d8 91 a5 fb af 48 86 05 2c .a..........H.., 00:20:33.873 00000220 b5 c3 6b 3f fb f3 e2 c2 8a 83 ff 9b a0 5b 2c 17 ..k?.........[,. 00:20:33.873 00000230 54 52 c0 3f 12 d7 d5 5e f1 da 34 32 86 f3 73 3d TR.?...^..42..s= 00:20:33.873 00000240 04 5e 3a b7 a9 a4 3d 85 25 f7 28 04 24 f6 9c 13 .^:...=.%.(.$... 00:20:33.873 00000250 1d fa 8c ce 93 2b 06 b0 6a ea 82 aa 4b 85 f7 2e .....+..j...K... 
00:20:33.873 00000260 63 f3 98 16 fd 66 c5 34 8f 71 0b 7f a9 51 02 64 c....f.4.q...Q.d 00:20:33.873 00000270 89 c3 44 cb 09 5a 49 d6 e4 39 14 26 3c b6 df 43 ..D..ZI..9.&<..C 00:20:33.873 00000280 a0 ee 7d e7 c7 42 40 c5 03 05 38 ea 0d 73 71 43 ..}..B@...8..sqC 00:20:33.873 00000290 41 2a 79 20 5b 32 3b 21 b5 6d 3c dc 99 d5 87 72 A*y [2;!.m<....r 00:20:33.873 000002a0 76 0f 05 d8 16 f7 aa 06 26 56 93 54 ef 06 24 bd v.......&V.T..$. 00:20:33.873 000002b0 e8 f5 2c b4 76 a0 d9 62 a7 d7 27 3e 8c 6d b1 d7 ..,.v..b..'>.m.. 00:20:33.873 000002c0 65 03 c3 31 d4 bf 33 8e 3e 51 c5 a8 d8 90 99 69 e..1..3.>Q.....i 00:20:33.873 000002d0 8c 41 a7 2f c4 07 d0 f1 d9 8c e3 24 f8 ad 39 49 .A./.......$..9I 00:20:33.873 000002e0 35 f8 23 1f 58 69 49 42 1c d6 10 b0 54 a3 a7 ee 5.#.XiIB....T... 00:20:33.873 000002f0 7f 49 99 7c 61 ee c4 ef e4 e1 9c 9a ac 6b f0 ec .I.|a........k.. 00:20:33.873 [2024-09-27 15:25:09.531176] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=1, dhgroup=4, seq=3428451728, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.873 [2024-09-27 15:25:09.531282] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.873 [2024-09-27 15:25:09.586324] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.873 [2024-09-27 15:25:09.586371] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.873 [2024-09-27 15:25:09.586381] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.873 [2024-09-27 15:25:09.586407] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.873 [2024-09-27 15:25:09.753786] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.873 [2024-09-27 15:25:09.753806] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.873 [2024-09-27 15:25:09.753813] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.873 [2024-09-27 15:25:09.753857] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.873 [2024-09-27 15:25:09.753881] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.873 ctrlr pubkey: 00:20:33.874 00000000 62 ac 47 4c e2 b9 ba 88 71 28 ce b5 5e 92 4b 9b b.GL....q(..^.K. 00:20:33.874 00000010 49 bf 3b ff 81 02 e9 9e 82 b4 e1 30 a5 b3 1d ff I.;........0.... 00:20:33.874 00000020 a6 60 55 e9 93 db b8 04 3e d2 c6 b8 6b 65 e4 48 .`U.....>...ke.H 00:20:33.874 00000030 43 55 1b 6e 3d 57 4c f5 04 d6 93 68 5f a9 d6 a1 CU.n=WL....h_... 00:20:33.874 00000040 e6 3c f4 2c f7 3d 22 ee 56 3f 24 4e 36 ea 26 06 .<.,.=".V?$N6.&. 00:20:33.874 00000050 a1 7a 51 aa 49 35 a9 9c af 1e 64 53 55 6a 21 1f .zQ.I5....dSUj!. 00:20:33.874 00000060 2b 7a 60 5f 6c 5a 66 ca 83 81 7d ce a3 13 36 b3 +z`_lZf...}...6. 
00:20:33.874 00000070 ef f3 e5 3b 07 11 0c 82 b3 85 84 f9 0e 34 95 93 ...;.........4.. 00:20:33.874 00000080 55 98 f0 89 15 4f f9 c5 ea 11 a1 37 89 3d 89 c0 U....O.....7.=.. 00:20:33.874 00000090 a8 4b c8 c9 13 c2 80 57 cd 18 0f a8 e9 ea 49 9e .K.....W......I. 00:20:33.874 000000a0 c4 9d e2 30 d0 51 6d 44 30 09 b0 d2 40 14 df e7 ...0.QmD0...@... 00:20:33.874 000000b0 fc 10 d5 a3 08 3b 81 ea 26 b1 41 a5 cc fd da 1f .....;..&.A..... 00:20:33.874 000000c0 ad 79 b7 41 4b 85 e5 a7 49 2a 27 5e 21 6d ed 77 .y.AK...I*'^!m.w 00:20:33.874 000000d0 fb dd fb 18 dd 50 a2 18 8b b8 fc 3d 55 e4 fc db .....P.....=U... 00:20:33.874 000000e0 2a 59 a8 d9 ea 9e 92 ed 25 a8 21 78 08 41 97 42 *Y......%.!x.A.B 00:20:33.874 000000f0 f1 60 83 2f 1c b6 90 72 07 f7 11 64 04 09 5e f8 .`./...r...d..^. 00:20:33.874 00000100 8b 92 6c 29 89 67 5f 72 20 d4 37 d6 d0 20 8c f2 ..l).g_r .7.. .. 00:20:33.874 00000110 84 5c ed 24 f2 1f 6b 95 76 de 90 d1 26 60 4a 57 .\.$..k.v...&`JW 00:20:33.874 00000120 4c 26 56 e9 b5 d5 a2 b8 4d 87 43 92 10 d1 79 8c L&V.....M.C...y. 00:20:33.874 00000130 08 cf d4 42 1c 7c bb a9 12 65 09 45 df 88 d5 b9 ...B.|...e.E.... 00:20:33.874 00000140 a3 ef c1 1a d0 f0 37 51 f1 aa 87 21 d9 5a 81 8d ......7Q...!.Z.. 00:20:33.874 00000150 9c 96 87 d3 69 43 44 de d2 0e 8f 69 0c 56 76 2b ....iCD....i.Vv+ 00:20:33.874 00000160 8a ef 9a c5 db 2c c7 5e 42 d5 41 5d 4e eb e0 93 .....,.^B.A]N... 00:20:33.874 00000170 69 98 71 c5 8a 0b d3 f4 b9 47 cd 98 fa 57 8c 6e i.q......G...W.n 00:20:33.874 00000180 3a 15 eb 91 68 f3 b7 5b 6f c8 0e 7e 17 d7 2b 9e :...h..[o..~..+. 00:20:33.874 00000190 ff fa 92 c7 01 cc f8 a5 04 e9 2f 13 99 57 35 62 ........../..W5b 00:20:33.874 000001a0 12 82 88 4b 45 83 2f 38 0b 6b c0 04 a0 b0 0c 56 ...KE./8.k.....V 00:20:33.874 000001b0 fc 20 55 f8 d5 46 55 06 ff 1e c7 6e a9 67 b0 6e . U..FU....n.g.n 00:20:33.874 000001c0 e3 f5 3d 25 e8 a8 2b d5 16 7b 32 37 f6 a8 97 1b ..=%..+..{27.... 00:20:33.874 000001d0 19 3b af df 01 99 8f 24 b3 2b e2 87 95 83 ec 13 .;.....$.+...... 00:20:33.874 000001e0 8d 88 83 1e 6d 31 6f d2 cc 56 f7 44 53 aa 3e 81 ....m1o..V.DS.>. 00:20:33.874 000001f0 27 ee 2f 0b 92 8c b4 5c 6b 99 5e 48 7c 81 91 ff './....\k.^H|... 00:20:33.874 00000200 e2 52 14 46 65 6d 47 da 1a 1b e0 58 75 32 67 ce .R.FemG....Xu2g. 00:20:33.874 00000210 ef 27 95 15 a8 78 6b 5d 28 c3 74 e1 1a 49 b3 72 .'...xk](.t..I.r 00:20:33.874 00000220 7d 45 f2 28 fd 7e 85 1d 98 42 1a 02 1c bc 3d f2 }E.(.~...B....=. 00:20:33.874 00000230 cd a2 2d d6 01 c5 7f 0c 33 e6 3a 16 15 a1 34 17 ..-.....3.:...4. 00:20:33.874 00000240 31 fb 99 7f 33 82 89 92 0c 7a 4e 6a 1f 8e 60 1e 1...3....zNj..`. 00:20:33.874 00000250 44 d4 c3 a8 9d 88 3c 43 0b 63 b6 81 90 15 ab 6b D..... 00:20:33.874 00000090 9f 42 2e 93 e6 f9 2a 39 35 b9 f1 69 95 57 63 e2 .B....*95..i.Wc. 00:20:33.874 000000a0 0d ed 07 b9 5b de 39 92 11 e3 d1 af be 6a 4d 75 ....[.9......jMu 00:20:33.874 000000b0 21 1d ab cc 4a b8 ed 32 c7 ef 79 59 88 4d 2c 9c !...J..2..yY.M,. 00:20:33.874 000000c0 dc 4f 16 45 d2 d5 ff 40 d5 10 7b a8 9d 71 1c a5 .O.E...@..{..q.. 00:20:33.874 000000d0 01 76 c1 b7 1c 0a e3 47 7d 27 33 5e 61 be c5 21 .v.....G}'3^a..! 00:20:33.874 000000e0 c0 e4 8a 78 e7 da 25 40 80 24 b5 5b 27 d4 61 78 ...x..%@.$.['.ax 00:20:33.874 000000f0 76 75 3a 3a 57 3c 6e 39 53 11 a9 88 0c fc c3 5c vu::W..a..[.?]. 00:20:33.874 00000170 ca b1 a9 5d 42 cb 76 9e 8a 5f ff 3a d4 c7 d8 2e ...]B.v.._.:.... 00:20:33.874 00000180 a3 f7 93 1e 9a 9d 11 f8 3a 47 52 82 7f 93 59 ef ........:GR...Y. 
00:20:33.874 00000190 c5 5a 85 b2 8d 86 e6 c5 3e 63 08 4c 95 49 a3 36 .Z......>c.L.I.6 00:20:33.874 000001a0 bf 9d de 0c 3f 95 0d 15 68 84 4d cf 75 4c 02 8d ....?...h.M.uL.. 00:20:33.874 000001b0 b5 83 94 f1 b2 f7 b6 57 69 f1 65 d8 95 c5 bd b3 .......Wi.e..... 00:20:33.874 000001c0 08 7e 99 b3 b8 96 9d 31 a7 4a bf 90 be f8 35 be .~.....1.J....5. 00:20:33.874 000001d0 a7 9d 81 a0 75 a9 21 b3 8a 66 10 2b 34 91 1f 06 ....u.!..f.+4... 00:20:33.874 000001e0 cc 56 a4 9d 7c b1 a7 7a 70 3b 8d 2f 72 70 41 42 .V..|..zp;./rpAB 00:20:33.874 000001f0 85 37 69 fa fc c8 1b 39 b1 b2 c7 5f cf 35 ce e6 .7i....9..._.5.. 00:20:33.874 00000200 df bf 66 4c b6 e7 bb 4a 77 b7 84 a4 8c 1b dc 57 ..fL...Jw......W 00:20:33.874 00000210 45 c5 59 ac be f2 a8 eb 8f a0 c4 cc 85 7e 0a 14 E.Y..........~.. 00:20:33.874 00000220 d1 45 94 42 33 7b 1d 8e 59 ec 17 1e de 97 83 70 .E.B3{..Y......p 00:20:33.874 00000230 9f 57 b0 70 d8 76 c0 23 c6 fb 46 16 67 97 c6 ac .W.p.v.#..F.g... 00:20:33.874 00000240 c9 a7 47 93 83 e8 7b eb f3 a1 b4 3e 75 07 ea ec ..G...{....>u... 00:20:33.874 00000250 da cb 8d 14 b6 41 49 4c f2 d9 df e0 7e 2c 88 03 .....AIL....~,.. 00:20:33.874 00000260 86 a4 f3 d3 24 6a 53 08 61 b1 69 9f f3 45 98 71 ....$jS.a.i..E.q 00:20:33.874 00000270 83 52 af b6 ad 8b a7 1a c9 45 38 78 5d c7 28 d0 .R.......E8x].(. 00:20:33.874 00000280 6b 8c 11 1f 44 3d d3 90 07 79 c2 12 85 8e 48 50 k...D=...y....HP 00:20:33.874 00000290 fc b0 59 1a b5 ac 8e 34 7a 18 98 ca 47 cc 8a 13 ..Y....4z...G... 00:20:33.874 000002a0 18 42 39 4a 1c 91 08 fa 54 3f c1 d2 17 65 b6 ba .B9J....T?...e.. 00:20:33.874 000002b0 72 70 9d c5 1b 42 09 47 d5 dd 2c 4f 38 f3 b2 f9 rp...B.G..,O8... 00:20:33.874 000002c0 81 59 79 04 76 95 09 cc c4 61 75 87 d0 7d 6c 38 .Yy.v....au..}l8 00:20:33.874 000002d0 70 e2 a3 37 32 87 57 c1 44 38 04 2c 20 32 e8 a1 p..72.W.D8., 2.. 00:20:33.874 000002e0 21 67 36 49 98 3e f9 28 bf 58 7f 2a e7 0e f6 33 !g6I.>.(.X.*...3 00:20:33.874 000002f0 d9 fe c0 a1 71 75 1a 9a a0 e6 66 df 8d dc ed 0c ....qu....f..... 00:20:33.874 dh secret: 00:20:33.874 00000000 b0 a3 de 9b a8 6f 2b d2 fb 06 0e 19 87 80 40 28 .....o+.......@( 00:20:33.874 00000010 02 b3 6f fc df 0c 09 69 85 63 92 ff 97 1c d2 68 ..o....i.c.....h 00:20:33.874 00000020 40 85 70 e2 e6 8b de c5 4c 98 83 f4 62 f9 44 1a @.p.....L...b.D. 00:20:33.874 00000030 55 7b 4b 9e ef 91 b3 f3 8c 78 79 b3 66 e3 9b 49 U{K......xy.f..I 00:20:33.874 00000040 af 5b 70 30 93 7b 7f 29 2a 67 b5 46 ca d2 88 1f .[p0.{.)*g.F.... 00:20:33.874 00000050 ab 7c 87 99 34 a4 39 36 cc 6d 7b ee 5e 54 4e 3c .|..4.96.m{.^TN< 00:20:33.874 00000060 3c 27 bf 1d 8f df 8a f1 f8 ad c4 48 a7 b1 8c 01 <'.........H.... 00:20:33.874 00000070 7f 8e a6 e0 09 ce ed 95 b9 1b 0b c4 ce 68 f9 95 .............h.. 00:20:33.874 00000080 47 c9 53 f2 74 37 c5 5c 30 0c 66 d9 2e 0e b1 c3 G.S.t7.\0.f..... 00:20:33.874 00000090 d7 04 e2 cb 84 c8 81 ba cb 70 63 2f a8 e7 04 95 .........pc/.... 00:20:33.874 000000a0 4c 2d 3f 47 f0 66 88 9e 28 4e 2f 72 18 3e 96 57 L-?G.f..(N/r.>.W 00:20:33.874 000000b0 d2 ac 18 96 21 69 d8 31 6e bd dc 47 9c 02 3e 9d ....!i.1n..G..>. 00:20:33.874 000000c0 ac f9 16 f5 02 6b 56 8b 82 13 ef 79 e9 7e 5d e7 .....kV....y.~]. 00:20:33.874 000000d0 41 5f a1 54 1b 1f 42 f4 ac 55 b0 f8 26 bf 4c 4c A_.T..B..U..&.LL 00:20:33.874 000000e0 4b b1 30 87 90 be 8f ed c9 7c a3 70 db 49 ad 1e K.0......|.p.I.. 00:20:33.874 000000f0 f8 3b 41 d3 f8 77 7b 4e 8e 11 be 31 04 3f 26 0e .;A..w{N...1.?&. 
00:20:33.874 00000100 d0 2c 35 16 3e 68 57 86 17 24 e7 b3 0c 14 d3 4b .,5.>hW..$.....K 00:20:33.874 00000110 57 db c6 6e 80 c7 04 e0 bc 8e 5a 28 08 b7 56 78 W..n......Z(..Vx 00:20:33.874 00000120 7c 8f 79 5d 72 38 08 9c 04 f1 ec 31 7c 20 c8 6a |.y]r8.....1| .j 00:20:33.874 00000130 70 da 8e 50 3b fe dd d1 89 26 50 22 68 ce f6 c4 p..P;....&P"h... 00:20:33.874 00000140 84 93 ca 04 e8 78 dc 6e 13 71 c1 15 da d8 2e 10 .....x.n.q...... 00:20:33.874 00000150 4f 42 c6 a8 7a f8 e7 21 b5 59 dc 94 6d b6 d1 9e OB..z..!.Y..m... 00:20:33.874 00000160 89 9c 5a fb 7c fe d7 1f cf 28 b3 53 5b f0 2b 76 ..Z.|....(.S[.+v 00:20:33.874 00000170 02 fe 35 85 75 15 8a ed 39 c3 00 d5 94 d5 9e 22 ..5.u...9......" 00:20:33.874 00000180 63 2d c2 68 c8 ce e4 4b 99 e1 7b ff 56 6b 79 e8 c-.h...K..{.Vky. 00:20:33.874 00000190 19 51 d6 80 b9 bd 98 65 3c 03 79 5d a1 9f 53 15 .Q.....e<.y]..S. 00:20:33.874 000001a0 3c d0 d3 87 b6 10 74 5c 30 b1 37 bc 09 32 1c 99 <.....t\0.7..2.. 00:20:33.874 000001b0 0c 57 22 25 2d 86 46 fe fb 39 a9 8e 0c 76 32 3f .W"%-.F..9...v2? 00:20:33.874 000001c0 8e 44 b2 a4 97 5e 64 b6 40 95 a2 bd 68 9a 9e dd .D...^d.@...h... 00:20:33.874 000001d0 bd fb b3 54 2c d5 47 d7 08 07 cc 42 cc ff a3 31 ...T,.G....B...1 00:20:33.874 000001e0 d2 29 11 1b 3e 5d ed 03 37 9c cc e0 8e 00 1a 77 .)..>]..7......w 00:20:33.874 000001f0 75 a1 e1 96 f0 26 18 a1 b5 58 e7 12 03 0a 7e af u....&...X....~. 00:20:33.874 00000200 d7 8c 6a 41 b8 0c b8 36 d4 dc 4f 74 23 1b 03 81 ..jA...6..Ot#... 00:20:33.875 00000210 37 08 88 db d5 e2 51 1d 50 1b ad 0f da 41 61 de 7.....Q.P....Aa. 00:20:33.875 00000220 f2 04 72 87 3c 80 cc d5 54 d5 bf d2 d3 c9 a3 65 ..r.<...T......e 00:20:33.875 00000230 57 fc d9 50 59 91 d1 83 d8 c1 04 a4 6b f9 ec 16 W..PY.......k... 00:20:33.875 00000240 d1 bb dd 72 71 5d 74 d9 21 fe fd b4 59 db c7 10 ...rq]t.!...Y... 00:20:33.875 00000250 41 49 db c0 6f 4d 27 90 2b 2d e7 cc 04 3e 22 f5 AI..oM'.+-...>". 00:20:33.875 00000260 97 e7 74 cf bf 81 42 6c b3 b4 0f 64 83 31 7e c3 ..t...Bl...d.1~. 00:20:33.875 00000270 a9 85 a7 b7 0e 60 b4 08 f1 ef 8b 5b ec d5 5c f8 .....`.....[..\. 00:20:33.875 00000280 f2 b2 37 6c f2 4f 35 f1 63 6b 4f ae d2 87 16 20 ..7l.O5.ckO.... 00:20:33.875 00000290 3a d0 74 3b 39 3d 72 50 9a 00 b3 21 5b ed 7f 04 :.t;9=rP...![... 00:20:33.875 000002a0 9f fe e9 e1 a2 20 3a da ca fb 2f 88 d4 6e fd 50 ..... :.../..n.P 00:20:33.875 000002b0 26 11 15 79 18 c4 c9 00 ac fa 7c b0 e9 62 63 3f &..y......|..bc? 00:20:33.875 000002c0 59 de 88 cf 60 79 b7 5a e5 b9 20 95 97 0c 46 5a Y...`y.Z.. ...FZ 00:20:33.875 000002d0 5c 2a c6 b1 00 5b bf 99 aa 01 d6 f8 4f 94 7c 69 \*...[......O.|i 00:20:33.875 000002e0 f9 a7 34 f3 05 be e7 02 23 1a ae 9c b7 8e e7 d9 ..4.....#....... 00:20:33.875 000002f0 a1 ff c0 74 02 88 36 2c 91 19 fd 23 a6 60 80 fa ...t..6,...#.`.. 
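The dumps above trace one DH-HMAC-CHAP pass per qpair: negotiate (digest 1 = sha256, dhgroup 4 = ffdhe6144), await-challenge, reply, success1/success2, done. The 768-byte "ctrlr pubkey", "host pubkey" and "dh secret" buffers are the finite-field Diffie-Hellman values for the 6144-bit group. Below is a minimal sketch of that exchange in Python, with a toy prime standing in for the RFC 7919 ffdhe6144 modulus (so the byte lengths differ); it is illustrative only and is not SPDK's nvme_auth.c.

import secrets

def dh_keypair(p: int, g: int):
    # private exponent and the public value that appears as "pubkey" above
    priv = secrets.randbelow(p - 2) + 1
    return priv, pow(g, priv, p)

def dh_shared_secret(peer_pub: int, priv: int, p: int) -> bytes:
    # the value dumped as "dh secret"; 768 bytes when p is the ffdhe6144 prime
    secret = pow(peer_pub, priv, p)
    return secret.to_bytes((p.bit_length() + 7) // 8, "big")

# toy demo numbers, NOT the real group parameters
p, g = 23, 5
host_priv, host_pub = dh_keypair(p, g)
ctrlr_priv, ctrlr_pub = dh_keypair(p, g)
assert dh_shared_secret(ctrlr_pub, host_priv, p) == dh_shared_secret(host_pub, ctrlr_priv, p)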
00:20:33.875 [2024-09-27 15:25:09.801995] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=1, dhgroup=4, seq=3428451729, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.875 [2024-09-27 15:25:09.836674] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.875 [2024-09-27 15:25:09.836714] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.875 [2024-09-27 15:25:09.836730] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.875 [2024-09-27 15:25:09.836750] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.875 [2024-09-27 15:25:09.836764] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.875 [2024-09-27 15:25:09.942684] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.875 [2024-09-27 15:25:09.942702] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.875 [2024-09-27 15:25:09.942709] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.875 [2024-09-27 15:25:09.942719] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.875 [2024-09-27 15:25:09.942776] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.875 ctrlr pubkey: 00:20:33.875 00000000 62 ac 47 4c e2 b9 ba 88 71 28 ce b5 5e 92 4b 9b b.GL....q(..^.K. 00:20:33.875 00000010 49 bf 3b ff 81 02 e9 9e 82 b4 e1 30 a5 b3 1d ff I.;........0.... 00:20:33.875 00000020 a6 60 55 e9 93 db b8 04 3e d2 c6 b8 6b 65 e4 48 .`U.....>...ke.H 00:20:33.875 00000030 43 55 1b 6e 3d 57 4c f5 04 d6 93 68 5f a9 d6 a1 CU.n=WL....h_... 00:20:33.875 00000040 e6 3c f4 2c f7 3d 22 ee 56 3f 24 4e 36 ea 26 06 .<.,.=".V?$N6.&. 00:20:33.875 00000050 a1 7a 51 aa 49 35 a9 9c af 1e 64 53 55 6a 21 1f .zQ.I5....dSUj!. 00:20:33.875 00000060 2b 7a 60 5f 6c 5a 66 ca 83 81 7d ce a3 13 36 b3 +z`_lZf...}...6. 00:20:33.875 00000070 ef f3 e5 3b 07 11 0c 82 b3 85 84 f9 0e 34 95 93 ...;.........4.. 00:20:33.875 00000080 55 98 f0 89 15 4f f9 c5 ea 11 a1 37 89 3d 89 c0 U....O.....7.=.. 00:20:33.875 00000090 a8 4b c8 c9 13 c2 80 57 cd 18 0f a8 e9 ea 49 9e .K.....W......I. 00:20:33.875 000000a0 c4 9d e2 30 d0 51 6d 44 30 09 b0 d2 40 14 df e7 ...0.QmD0...@... 00:20:33.875 000000b0 fc 10 d5 a3 08 3b 81 ea 26 b1 41 a5 cc fd da 1f .....;..&.A..... 00:20:33.875 000000c0 ad 79 b7 41 4b 85 e5 a7 49 2a 27 5e 21 6d ed 77 .y.AK...I*'^!m.w 00:20:33.875 000000d0 fb dd fb 18 dd 50 a2 18 8b b8 fc 3d 55 e4 fc db .....P.....=U... 00:20:33.875 000000e0 2a 59 a8 d9 ea 9e 92 ed 25 a8 21 78 08 41 97 42 *Y......%.!x.A.B 00:20:33.875 000000f0 f1 60 83 2f 1c b6 90 72 07 f7 11 64 04 09 5e f8 .`./...r...d..^. 00:20:33.875 00000100 8b 92 6c 29 89 67 5f 72 20 d4 37 d6 d0 20 8c f2 ..l).g_r .7.. .. 
00:20:33.875 00000110 84 5c ed 24 f2 1f 6b 95 76 de 90 d1 26 60 4a 57 .\.$..k.v...&`JW 00:20:33.875 00000120 4c 26 56 e9 b5 d5 a2 b8 4d 87 43 92 10 d1 79 8c L&V.....M.C...y. 00:20:33.875 00000130 08 cf d4 42 1c 7c bb a9 12 65 09 45 df 88 d5 b9 ...B.|...e.E.... 00:20:33.875 00000140 a3 ef c1 1a d0 f0 37 51 f1 aa 87 21 d9 5a 81 8d ......7Q...!.Z.. 00:20:33.875 00000150 9c 96 87 d3 69 43 44 de d2 0e 8f 69 0c 56 76 2b ....iCD....i.Vv+ 00:20:33.875 00000160 8a ef 9a c5 db 2c c7 5e 42 d5 41 5d 4e eb e0 93 .....,.^B.A]N... 00:20:33.875 00000170 69 98 71 c5 8a 0b d3 f4 b9 47 cd 98 fa 57 8c 6e i.q......G...W.n 00:20:33.875 00000180 3a 15 eb 91 68 f3 b7 5b 6f c8 0e 7e 17 d7 2b 9e :...h..[o..~..+. 00:20:33.875 00000190 ff fa 92 c7 01 cc f8 a5 04 e9 2f 13 99 57 35 62 ........../..W5b 00:20:33.875 000001a0 12 82 88 4b 45 83 2f 38 0b 6b c0 04 a0 b0 0c 56 ...KE./8.k.....V 00:20:33.875 000001b0 fc 20 55 f8 d5 46 55 06 ff 1e c7 6e a9 67 b0 6e . U..FU....n.g.n 00:20:33.875 000001c0 e3 f5 3d 25 e8 a8 2b d5 16 7b 32 37 f6 a8 97 1b ..=%..+..{27.... 00:20:33.875 000001d0 19 3b af df 01 99 8f 24 b3 2b e2 87 95 83 ec 13 .;.....$.+...... 00:20:33.875 000001e0 8d 88 83 1e 6d 31 6f d2 cc 56 f7 44 53 aa 3e 81 ....m1o..V.DS.>. 00:20:33.875 000001f0 27 ee 2f 0b 92 8c b4 5c 6b 99 5e 48 7c 81 91 ff './....\k.^H|... 00:20:33.875 00000200 e2 52 14 46 65 6d 47 da 1a 1b e0 58 75 32 67 ce .R.FemG....Xu2g. 00:20:33.875 00000210 ef 27 95 15 a8 78 6b 5d 28 c3 74 e1 1a 49 b3 72 .'...xk](.t..I.r 00:20:33.875 00000220 7d 45 f2 28 fd 7e 85 1d 98 42 1a 02 1c bc 3d f2 }E.(.~...B....=. 00:20:33.875 00000230 cd a2 2d d6 01 c5 7f 0c 33 e6 3a 16 15 a1 34 17 ..-.....3.:...4. 00:20:33.875 00000240 31 fb 99 7f 33 82 89 92 0c 7a 4e 6a 1f 8e 60 1e 1...3....zNj..`. 00:20:33.875 00000250 44 d4 c3 a8 9d 88 3c 43 0b 63 b6 81 90 15 ab 6b D......mB..IS_z.P,. 00:20:33.876 000002f0 3c 11 e4 67 46 0d f9 09 1e 6d f2 3f 40 94 4d fb <..gF....m.?@.M. 00:20:33.876 dh secret: 00:20:33.876 00000000 03 59 73 93 c8 ca ee 7f 6c 78 ab 6e 92 a6 ce 6c .Ys.....lx.n...l 00:20:33.876 00000010 1b 01 59 75 97 15 5a 76 f7 54 4f 65 18 81 3e 0b ..Yu..Zv.TOe..>. 00:20:33.876 00000020 20 36 77 d7 1a 02 16 e3 45 8d a3 a0 a3 0d fe 6e 6w.....E......n 00:20:33.876 00000030 64 a8 c0 53 f4 6c 3a ec 23 61 03 84 43 22 16 f5 d..S.l:.#a..C".. 00:20:33.876 00000040 15 ec 58 52 cd c7 05 7a 67 b4 99 8f 20 3c ae 76 ..XR...zg... <.v 00:20:33.876 00000050 b4 bf af 80 55 e2 90 06 5d 83 3c ba 64 60 61 56 ....U...].<.d`aV 00:20:33.876 00000060 27 a5 3d 6a b1 bc 98 98 b3 b7 73 fb d4 40 27 15 '.=j......s..@'. 00:20:33.876 00000070 f3 78 80 dc 1a 51 2f 72 d1 20 d6 ef 99 ae 4e d7 .x...Q/r. ....N. 00:20:33.876 00000080 e8 87 55 2d 21 3e 17 c3 78 e1 c2 40 ca 57 56 06 ..U-!>..x..@.WV. 00:20:33.876 00000090 33 53 40 c9 1f 35 3f 5b 81 15 21 58 57 9b 6f 5c 3S@..5?[..!XW.o\ 00:20:33.876 000000a0 fe b9 be bb bf 9a 34 2c 13 df 12 cf 83 fd 0d c9 ......4,........ 00:20:33.876 000000b0 f9 b9 40 9b 4a 46 18 0f cc c1 88 e8 b0 08 79 9c ..@.JF........y. 00:20:33.876 000000c0 91 e6 ad 8d 17 56 f5 4b b0 b3 2d 3d 17 96 d4 c1 .....V.K..-=.... 00:20:33.876 000000d0 18 fd df b0 a4 6b d1 a9 e6 36 1e 3c ec 57 c8 c1 .....k...6.<.W.. 00:20:33.876 000000e0 e7 39 f1 be df f6 89 10 89 ae 44 0c b1 27 64 fe .9........D..'d. 00:20:33.876 000000f0 84 7f b3 5e ee cc 3f 1d fa 78 54 88 b1 19 9f 44 ...^..?..xT....D 00:20:33.876 00000100 52 0c e9 e8 be 15 14 c5 ac b0 1b 52 77 e6 a3 07 R..........Rw... 
00:20:33.876 00000110 4b 67 b9 59 2e 7d 07 65 e0 7a 41 f3 d1 fe ce 26 Kg.Y.}.e.zA....& 00:20:33.876 00000120 ac 1c 47 70 bc 54 1e 98 96 d7 d4 90 eb 97 b5 e2 ..Gp.T.......... 00:20:33.876 00000130 c3 4b 0e d3 1c b9 5e 10 d6 87 a2 ec 82 06 4e 50 .K....^.......NP 00:20:33.876 00000140 7f b1 2d 4a 75 2c d1 e9 2d 68 19 4a 53 85 18 9e ..-Ju,..-h.JS... 00:20:33.876 00000150 19 c6 c5 a8 87 07 b7 67 7a 8f 94 71 7c 87 03 e3 .......gz..q|... 00:20:33.876 00000160 94 15 8a ec fd ce d1 89 72 b5 b1 3d 56 ee cc f8 ........r..=V... 00:20:33.876 00000170 ba 7f 0e 62 01 f7 54 cd 61 de e4 58 b3 0c 57 eb ...b..T.a..X..W. 00:20:33.876 00000180 d2 b7 36 e0 09 7e 7b 42 b9 02 6d 03 fb f4 cf fd ..6..~{B..m..... 00:20:33.876 00000190 db ee 90 28 e4 97 6c 3f bf 8d c3 54 2f b3 72 b9 ...(..l?...T/.r. 00:20:33.876 000001a0 fa 5e d1 54 fc e1 d6 9f fe c9 c3 a9 26 1b 12 2f .^.T........&../ 00:20:33.876 000001b0 d2 20 3c 5c 15 f1 27 ee 6a d8 85 c5 16 7e 1a d7 . <\..'.j....~.. 00:20:33.876 000001c0 8a 0f 32 23 a9 f5 9f 9c 88 68 1b 29 f7 69 58 5c ..2#.....h.).iX\ 00:20:33.876 000001d0 db 8b 96 70 23 a3 f8 7b 4b b9 14 ed ce 34 b3 c6 ...p#..{K....4.. 00:20:33.876 000001e0 fd 5d 44 1f 30 c2 b3 55 d7 42 4a 14 a2 3f b7 42 .]D.0..U.BJ..?.B 00:20:33.876 000001f0 f2 a8 40 3f 28 46 38 9a 76 90 79 01 f4 91 a4 fc ..@?(F8.v.y..... 00:20:33.876 00000200 aa 85 6e e1 63 f8 0d d8 9e f9 09 d3 01 ba c0 a2 ..n.c........... 00:20:33.876 00000210 8a f7 ee 83 42 8d 68 9d 6f 93 0c c4 ff bc d6 60 ....B.h.o......` 00:20:33.876 00000220 b0 e6 c7 68 37 3a 33 fd dc 73 75 14 61 76 1f c1 ...h7:3..su.av.. 00:20:33.876 00000230 9e 07 8e 86 fd 82 53 c0 6c 40 e4 90 61 2c ce d0 ......S.l@..a,.. 00:20:33.876 00000240 66 02 22 c7 d1 63 85 24 4f fa e2 d6 c9 e7 d1 bf f."..c.$O....... 00:20:33.876 00000250 e1 98 79 60 85 09 a0 9e c0 15 35 be 82 59 c7 a5 ..y`......5..Y.. 00:20:33.876 00000260 54 15 05 d6 cc 5c b8 a5 58 bc 08 c9 15 03 1c 7b T....\..X......{ 00:20:33.876 00000270 21 c5 55 8b 12 41 42 38 6a 17 ea a5 d3 31 da 3e !.U..AB8j....1.> 00:20:33.876 00000280 dc fd 64 f7 b7 53 82 d3 9d 93 5e 08 c9 91 ef af ..d..S....^..... 00:20:33.876 00000290 18 cc 51 f2 7e 98 91 84 10 4a 62 6a d1 6d ef 94 ..Q.~....Jbj.m.. 00:20:33.876 000002a0 ba 14 cc 2f ed 04 ee 2a 61 30 07 7c e8 2d 1c 50 .../...*a0.|.-.P 00:20:33.876 000002b0 1f 1b a4 d1 65 c6 da 53 59 05 aa 56 28 63 72 37 ....e..SY..V(cr7 00:20:33.876 000002c0 68 6d 8c 50 0b a1 2d bd b1 1e 02 fc da 6f f9 fb hm.P..-......o.. 00:20:33.876 000002d0 44 ac 69 c7 4f 73 a1 f7 9a 07 eb 42 7d 60 8e f5 D.i.Os.....B}`.. 00:20:33.876 000002e0 f4 c0 f8 31 0b 19 2c 9b fe 12 d1 4c 3e 99 8a 2f ...1..,....L>../ 00:20:33.876 000002f0 15 d5 a9 23 b4 b3 6f e1 43 39 dd 90 c6 59 cf b8 ...#..o.C9...Y.. 
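Each nvme_auth_send_reply record in this log (key=keyN, hash=1, dhgroup=4, len=32) is the host answering the controller's challenge with an HMAC whose length matches the negotiated hash, i.e. 32 bytes for sha256. The sketch below is a rough illustration of that step in Python; the real key derivation and message layout are defined by the NVMe DH-HMAC-CHAP spec and SPDK's nvme_auth.c, and the helper only approximates them (the function name and the concatenation order are assumptions).

import hashlib
import hmac

def illustrative_reply(chap_secret: bytes, dh_secret: bytes, challenge: bytes,
                       seqnum: int, subnqn: str, hostnqn: str) -> bytes:
    # with a DH group in use, the configured secret is mixed with the DH shared
    # secret before keying the HMAC (approximation of the augmented key)
    key = hashlib.sha256(chap_secret + hashlib.sha256(dh_secret).digest()).digest()
    msg = challenge + seqnum.to_bytes(4, "little") + subnqn.encode() + hostnqn.encode()
    return hmac.new(key, msg, hashlib.sha256).digest()  # 32 bytes -> len=32 in the log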
00:20:33.876 [2024-09-27 15:25:09.991406] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=1, dhgroup=4, seq=3428451730, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.876 [2024-09-27 15:25:09.991502] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.876 [2024-09-27 15:25:10.050688] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.876 [2024-09-27 15:25:10.050742] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.876 [2024-09-27 15:25:10.050753] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.876 [2024-09-27 15:25:10.050780] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.876 [2024-09-27 15:25:10.228693] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.876 [2024-09-27 15:25:10.228715] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.876 [2024-09-27 15:25:10.228728] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.876 [2024-09-27 15:25:10.228775] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.876 [2024-09-27 15:25:10.228800] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.876 ctrlr pubkey: 00:20:33.876 00000000 ae 18 8f 06 3a 2d 4b 82 cd 6f 18 ae 82 ca f9 83 ....:-K..o...... 00:20:33.876 00000010 8e 9f bb 75 53 13 76 62 1e e4 80 61 36 4f 47 73 ...uS.vb...a6OGs 00:20:33.876 00000020 df 3e 32 62 6a ba 6c ec 77 50 d1 8d e6 fa 55 ab .>2bj.l.wP....U. 00:20:33.876 00000030 99 ef 48 8f 53 77 01 a4 a1 39 13 1a 71 94 a9 de ..H.Sw...9..q... 00:20:33.876 00000040 da 08 99 9e 02 9d ce 85 7d 35 39 7e f1 1d 35 e9 ........}59~..5. 00:20:33.876 00000050 2b 3c 1c 85 62 7f 60 49 df d5 91 d6 cb 7c 2b b3 +<..b.`I.....|+. 00:20:33.876 00000060 3b fb ae e4 23 0d 96 9e 58 79 ed 6d 0b 44 06 fd ;...#...Xy.m.D.. 00:20:33.876 00000070 d0 fb d3 b4 58 0a 35 d1 de 08 77 00 96 ff 69 54 ....X.5...w...iT 00:20:33.876 00000080 fb 21 27 04 7a 27 09 9b 63 d5 bb 23 39 08 c9 00 .!'.z'..c..#9... 00:20:33.876 00000090 63 71 c8 8d 1b 5f f2 70 13 6d 37 23 cb 0d 6d 69 cq..._.p.m7#..mi 00:20:33.876 000000a0 4a 3c 06 e8 0c 41 23 22 5c 7f 37 09 f9 1b ec b1 J<...A#"\.7..... 00:20:33.876 000000b0 e9 df c3 a0 65 a3 65 89 4d f7 40 4b b3 d0 e3 87 ....e.e.M.@K.... 00:20:33.876 000000c0 1f ee bf cd d6 29 55 c3 1a f2 4f e4 81 90 47 97 .....)U...O...G. 00:20:33.876 000000d0 20 6f 51 88 eb 53 3c 93 31 93 50 f0 07 17 f1 42 oQ..S<.1.P....B 00:20:33.876 000000e0 f2 f6 5c bc 9f 36 7e 06 ca 33 dc 0b cd 67 5e aa ..\..6~..3...g^. 00:20:33.876 000000f0 5e 3d 5b 3f 5d 64 e6 9f 9d 67 de 72 5b b8 f1 94 ^=[?]d...g.r[... 00:20:33.876 00000100 d8 c8 05 9a 36 9a b6 f8 f6 00 15 d2 c1 07 a7 29 ....6..........) 
00:20:33.876 00000110 d0 03 6e 2a 0c fd 72 d1 db f8 23 61 8d db 7c b7 ..n*..r...#a..|. 00:20:33.876 00000120 94 08 d2 03 44 9a 72 26 dc b4 1a e4 b0 43 bc 0d ....D.r&.....C.. 00:20:33.876 00000130 ff f8 37 f1 d5 c7 5d 00 ef c5 16 d7 f9 52 96 98 ..7...]......R.. 00:20:33.876 00000140 61 14 21 d7 af c3 1f a1 8f dc d2 92 14 26 3a 2d a.!..........&:- 00:20:33.876 00000150 d4 76 49 c2 f7 93 0c 63 ef d0 36 6d 95 96 f2 f0 .vI....c..6m.... 00:20:33.876 00000160 76 58 98 98 52 aa 15 77 a9 a4 83 21 cf 05 3f 63 vX..R..w...!..?c 00:20:33.876 00000170 4c 52 74 29 82 d1 fb 05 33 ad 33 09 b4 fa 7c 0c LRt)....3.3...|. 00:20:33.876 00000180 c8 2a 24 42 c5 00 0e 38 51 ca 94 bf 63 ec 88 6a .*$B...8Q...c..j 00:20:33.876 00000190 76 d7 4e ea 4f 4c d5 5c 88 e0 cf 20 51 20 07 11 v.N.OL.\... Q .. 00:20:33.876 000001a0 3f d4 bf 65 9a 43 14 e9 90 16 23 35 cb e9 5c b6 ?..e.C....#5..\. 00:20:33.876 000001b0 cd 66 d5 e1 f0 02 d2 c5 9f 93 3d 5c 06 e3 19 ff .f........=\.... 00:20:33.876 000001c0 08 16 6b fc 86 a9 b1 e0 b4 31 c4 45 1f 30 07 5e ..k......1.E.0.^ 00:20:33.876 000001d0 96 ac 14 4b 7f 20 e0 69 b7 b9 7c be e3 22 9a 65 ...K. .i..|..".e 00:20:33.876 000001e0 6a a1 83 2e 83 64 72 84 41 83 28 5c 8a 81 92 83 j....dr.A.(\.... 00:20:33.876 000001f0 0f 64 00 fb 9a ab 86 6e 7d 7e 53 29 7d ca e8 ec .d.....n}~S)}... 00:20:33.876 00000200 d0 7b 3e 0e 30 c4 a1 f1 1c b3 74 a9 90 c3 93 63 .{>.0.....t....c 00:20:33.876 00000210 db c4 c2 e2 44 2f e5 43 53 66 ef 34 29 ac b9 73 ....D/.CSf.4)..s 00:20:33.876 00000220 f2 45 a5 85 b4 4d c3 3f f0 e7 bd 54 7c 08 58 10 .E...M.?...T|.X. 00:20:33.876 00000230 52 de ba 20 26 58 3e 0e db 88 55 f5 65 d2 2c df R.. &X>...U.e.,. 00:20:33.876 00000240 38 d1 aa 27 e1 e9 34 11 3b f2 47 94 2c 22 83 60 8..'..4.;.G.,".` 00:20:33.876 00000250 de 46 f1 85 12 21 3e 0d de 95 ef 67 73 1a 03 3c .F...!>....gs..< 00:20:33.876 00000260 29 2b e6 74 b5 e1 05 f6 47 4b 81 dd 02 d1 ba ef )+.t....GK...... 00:20:33.876 00000270 43 f9 d7 44 ea 35 ee 29 5f 62 e9 85 64 ba 21 7a C..D.5.)_b..d.!z 00:20:33.876 00000280 1d b0 4b cd b2 97 5a 36 ea 2d b2 33 36 be 22 78 ..K...Z6.-.36."x 00:20:33.876 00000290 ae ac d0 73 ca 1d 4b 7d df 2f 26 43 71 00 c0 eb ...s..K}./&Cq... 00:20:33.876 000002a0 48 fa 0d 18 18 47 17 08 5b c6 ea c2 71 b6 4e ab H....G..[...q.N. 00:20:33.876 000002b0 26 25 2f 18 04 58 c6 ed ca dd b8 d0 ef ca 23 12 &%/..X........#. 00:20:33.876 000002c0 64 0d 20 9e 3a 4f 5a 2e 74 8b df 33 a6 bb b6 13 d. .:OZ.t..3.... 00:20:33.876 000002d0 6b 3c a9 ba a0 cb e2 8c a5 3c 77 e4 13 03 1f 08 k<.......5;.V 00:20:33.877 00000140 e2 69 ba fb 9d 02 36 97 67 06 7c f5 e0 10 84 b9 .i....6.g.|..... 00:20:33.877 00000150 45 25 90 d0 32 8a ce 71 10 49 c0 dc 4e e3 c2 43 E%..2..q.I..N..C 00:20:33.877 00000160 a6 cd f6 3b 29 77 66 44 e6 c8 0a bb 52 79 98 8b ...;)wfD....Ry.. 00:20:33.877 00000170 76 a4 52 5c 2e 04 be 0f fe 78 ab 58 db ab 13 30 v.R\.....x.X...0 00:20:33.877 00000180 f3 32 79 82 54 e2 a4 1d 40 a6 95 4f 15 b2 c5 d3 .2y.T...@..O.... 00:20:33.877 00000190 94 f5 20 5d e5 ba 80 32 6b 04 ca 3a b6 7b 2d 59 .. ]...2k..:.{-Y 00:20:33.877 000001a0 2d 37 ec df 45 86 73 ea 4c 24 ab c3 0f 3a 76 4b -7..E.s.L$...:vK 00:20:33.877 000001b0 5d d9 35 d9 d7 48 9f 87 be 73 65 37 77 66 ea 60 ].5..H...se7wf.` 00:20:33.877 000001c0 f8 fc a0 f3 a3 9a ca 0f 45 b9 4f cf 40 4e fd e5 ........E.O.@N.. 00:20:33.877 000001d0 1a ed cb 03 c5 ba 23 8d f0 55 b3 14 81 dc c1 14 ......#..U...... 00:20:33.877 000001e0 a1 f9 6e 1a 01 91 a6 18 8a 05 16 fa e0 3b 42 b7 ..n..........;B. 
00:20:33.877 000001f0 01 bb 02 13 87 1d 51 65 ca 30 98 cc 16 5b 69 52 ......Qe.0...[iR 00:20:33.877 00000200 3a cb f6 d6 38 3d fd ad ca df 0c 06 b3 63 ab bf :...8=.......c.. 00:20:33.877 00000210 a5 34 c7 ea a7 43 e9 bd 3f 22 f9 b2 65 e9 7e 52 .4...C..?"..e.~R 00:20:33.877 00000220 e4 7d d4 5d 1a 7c a5 8a 35 58 10 ff 9d ac 3b 58 .}.].|..5X....;X 00:20:33.877 00000230 3b bc f8 bb a1 d0 23 32 1b ab 5c 18 e4 bb 09 b0 ;.....#2..\..... 00:20:33.877 00000240 8a 89 be 5b a3 9e ac ad 05 55 f9 07 60 c3 84 59 ...[.....U..`..Y 00:20:33.877 00000250 ef 9f b1 3b 97 99 07 e0 35 dc df e4 78 ca de e5 ...;....5...x... 00:20:33.877 00000260 b3 b0 55 33 e3 30 cb 23 ac 7a c6 99 43 b5 d7 01 ..U3.0.#.z..C... 00:20:33.877 00000270 5e d8 5f ec 62 7d b3 c6 17 b5 53 9f a0 5d 53 eb ^._.b}....S..]S. 00:20:33.877 00000280 2f 6b 9a fa 48 66 2e df c1 8a a1 0d 6a aa 93 ae /k..Hf......j... 00:20:33.877 00000290 69 f2 e8 49 14 f8 26 85 bb e8 63 54 ff b7 b9 69 i..I..&...cT...i 00:20:33.877 000002a0 27 eb 91 cc 5f c3 db 89 71 13 13 8b 48 6b a7 f0 '..._...q...Hk.. 00:20:33.877 000002b0 60 fd 3b bf ac 13 ea 98 b9 0c 42 3d 17 ff e0 a4 `.;.......B=.... 00:20:33.877 000002c0 8f 11 ac 93 b5 dd 4f de 9d 21 c8 9b 8a bf bf f2 ......O..!...... 00:20:33.877 000002d0 40 22 f8 9d 64 43 04 de 62 e9 b4 04 87 8e b6 a0 @"..dC..b....... 00:20:33.877 000002e0 7a 1c 90 8d f5 be 0d d1 eb 8e b6 64 ea d8 27 6d z..........d..'m 00:20:33.877 000002f0 89 1a 0c aa 8b e7 6d b4 1c c4 d9 b3 79 23 56 e5 ......m.....y#V. 00:20:33.877 dh secret: 00:20:33.877 00000000 bb aa ac 9f 3e 9d 9a b6 6c d5 36 ef 74 61 ae ac ....>...l.6.ta.. 00:20:33.877 00000010 67 ae 64 93 c9 7f 40 f7 2b b0 38 da 49 86 58 06 g.d...@.+.8.I.X. 00:20:33.877 00000020 ff 32 eb 1b 3f b2 06 fd 88 7e 02 66 29 00 eb 80 .2..?....~.f)... 00:20:33.877 00000030 d3 20 65 0b e7 0a bd 95 dc 1c 8e ac 24 ee 2f b3 . e.........$./. 00:20:33.877 00000040 5e 19 f3 ad 5c a3 6a ee b7 51 46 9f eb 0f 77 08 ^...\.j..QF...w. 00:20:33.877 00000050 c4 42 5b 3c ed f5 e8 e5 57 74 77 66 ef a2 f5 0b .B[<....Wtwf.... 00:20:33.877 00000060 f8 bc ab cd 81 47 40 b2 2f 35 33 8d f9 f1 b8 79 .....G@./53....y 00:20:33.877 00000070 07 ce 47 47 92 77 9e 49 63 2a 41 16 3a f0 7c 9b ..GG.w.Ic*A.:.|. 00:20:33.877 00000080 e1 fc b7 ac bd ef 7e 84 9a 10 7b 78 d7 f3 0c da ......~...{x.... 00:20:33.877 00000090 50 e8 f5 c5 82 5a 62 0a 8e c3 2f fa 96 9d f3 6d P....Zb.../....m 00:20:33.877 000000a0 45 c6 84 9f 68 b9 4d 8a 6b b7 5c 54 00 b0 97 75 E...h.M.k.\T...u 00:20:33.877 000000b0 68 bc 37 7c b8 f0 d7 2a 51 d1 c4 dc a6 7a 43 6e h.7|...*Q....zCn 00:20:33.877 000000c0 a6 d8 5b e7 50 ce f2 47 34 f1 c5 b5 a8 7f 16 fd ..[.P..G4....... 00:20:33.877 000000d0 a0 53 31 b1 4d c5 35 09 9d fb c2 37 6f fa 70 80 .S1.M.5....7o.p. 00:20:33.877 000000e0 40 65 24 fe a6 04 2e 2b a8 18 c8 78 1d cc ae 2c @e$....+...x..., 00:20:33.877 000000f0 29 84 b7 00 77 a7 bc 17 85 d2 29 5d fc 6d 89 f2 )...w.....)].m.. 00:20:33.877 00000100 41 68 f0 6f 37 68 42 d5 4e 4b f0 f6 ed f4 c3 35 Ah.o7hB.NK.....5 00:20:33.877 00000110 85 cb 18 98 ed 8c be 65 13 bd 93 0e 3b 60 66 46 .......e....;`fF 00:20:33.877 00000120 98 1a 77 34 02 51 92 3d c4 1f 42 7c 9b 2e 55 b8 ..w4.Q.=..B|..U. 00:20:33.877 00000130 d6 57 19 b5 11 11 50 c6 42 bd 42 bb 6e 1c a8 65 .W....P.B.B.n..e 00:20:33.877 00000140 4d ef 54 3e e3 e5 ba d7 1c 16 4f 7c c4 aa f2 a8 M.T>......O|.... 00:20:33.877 00000150 5c b7 39 0a 85 07 b5 f3 49 4e 4a 7e bf 71 3b fe \.9.....INJ~.q;. 
00:20:33.877 00000160 fb 14 75 9e 4d 6c 08 dc 5b 02 d5 3f 8a 8a 1b 74 ..u.Ml..[..?...t 00:20:33.877 00000170 e4 1f a3 41 c5 13 ee de 27 58 eb ca 4d 7d 09 a3 ...A....'X..M}.. 00:20:33.877 00000180 89 80 73 b6 88 b1 d2 c1 03 63 53 c8 8d 4d 16 30 ..s......cS..M.0 00:20:33.877 00000190 f5 ea a5 26 93 06 17 cb 4e 00 8e 83 37 ea c3 db ...&....N...7... 00:20:33.877 000001a0 5f c4 7d 94 15 ac 5a b3 2d bf 49 05 08 5e e2 8e _.}...Z.-.I..^.. 00:20:33.877 000001b0 74 7f e5 d9 09 87 f7 0f 2d 0f ff e7 0a 32 14 7f t.......-....2.. 00:20:33.877 000001c0 19 20 cb 0c 19 fb eb a2 08 0e 75 6b 88 3d 91 77 . ........uk.=.w 00:20:33.877 000001d0 12 4f 79 3b 83 14 0e 87 6a 0a 90 09 fb 4c 1b 79 .Oy;....j....L.y 00:20:33.877 000001e0 47 18 57 dc c7 cd 10 19 e8 89 2e 10 43 dc 1e 25 G.W.........C..% 00:20:33.877 000001f0 dc 2c 71 d0 6f 27 02 31 df c3 9d 6a 74 e8 df ab .,q.o'.1...jt... 00:20:33.877 00000200 f0 e0 77 c6 18 89 38 f9 a0 10 8b 18 3e ec 7d d4 ..w...8.....>.}. 00:20:33.877 00000210 61 e2 fb 7a de 79 b8 6b df d1 0c 54 c9 1f 14 ab a..z.y.k...T.... 00:20:33.877 00000220 72 4e a6 11 3b 59 92 c5 ca 5a c0 db 2a 38 8d 75 rN..;Y...Z..*8.u 00:20:33.877 00000230 52 c0 4d 72 60 88 f1 ab 3e 7e a3 b8 1b 46 74 1b R.Mr`...>~...Ft. 00:20:33.877 00000240 5e 07 5b 97 6b 96 40 01 27 55 54 a8 7a 8b 74 02 ^.[.k.@.'UT.z.t. 00:20:33.877 00000250 53 68 0b 37 c8 79 4a 5b 9f c4 51 43 ff 9e 66 41 Sh.7.yJ[..QC..fA 00:20:33.877 00000260 3e ac 24 ad c0 20 bd e2 58 9e e4 6a 63 e5 54 e7 >.$.. ..X..jc.T. 00:20:33.877 00000270 5e b8 3a 00 48 0c 52 61 81 bc ef 1d 7a bd dc 81 ^.:.H.Ra....z... 00:20:33.877 00000280 81 25 f6 55 eb 31 cc f9 e3 d4 66 52 0a c1 e1 aa .%.U.1....fR.... 00:20:33.877 00000290 56 4a 3f a5 94 8a cf 1b 7a 4e b4 99 af 98 6f 19 VJ?.....zN....o. 00:20:33.877 000002a0 90 d2 1d 0d 02 9e a2 c3 72 5f 29 49 ee 9b 8d bc ........r_)I.... 00:20:33.877 000002b0 52 fe 32 c4 bc ba 18 46 16 b8 14 a7 0d 35 9e 01 R.2....F.....5.. 00:20:33.877 000002c0 1a aa 76 60 ea 51 e6 57 f7 f6 04 ef 1f af 68 c6 ..v`.Q.W......h. 00:20:33.877 000002d0 95 9b 0c 10 05 31 17 3d 8a 4f 06 08 77 db 28 59 .....1.=.O..w.(Y 00:20:33.877 000002e0 27 c9 df f0 80 ac 3b 4f 1b b6 34 ad c3 f4 45 c2 '.....;O..4...E. 
00:20:33.877 000002f0 24 75 26 66 26 fb 34 a9 e2 c3 60 2e 72 19 ac 63 $u&f&.4...`.r..c 00:20:33.877 [2024-09-27 15:25:10.281213] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=1, dhgroup=4, seq=3428451731, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.877 [2024-09-27 15:25:10.319403] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.877 [2024-09-27 15:25:10.319445] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.877 [2024-09-27 15:25:10.319464] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.877 [2024-09-27 15:25:10.319493] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.877 [2024-09-27 15:25:10.319517] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.877 [2024-09-27 15:25:10.426128] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.877 [2024-09-27 15:25:10.426147] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.877 [2024-09-27 15:25:10.426155] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.877 [2024-09-27 15:25:10.426165] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.877 [2024-09-27 15:25:10.426219] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.877 ctrlr pubkey: 00:20:33.877 00000000 ae 18 8f 06 3a 2d 4b 82 cd 6f 18 ae 82 ca f9 83 ....:-K..o...... 00:20:33.877 00000010 8e 9f bb 75 53 13 76 62 1e e4 80 61 36 4f 47 73 ...uS.vb...a6OGs 00:20:33.877 00000020 df 3e 32 62 6a ba 6c ec 77 50 d1 8d e6 fa 55 ab .>2bj.l.wP....U. 00:20:33.877 00000030 99 ef 48 8f 53 77 01 a4 a1 39 13 1a 71 94 a9 de ..H.Sw...9..q... 00:20:33.877 00000040 da 08 99 9e 02 9d ce 85 7d 35 39 7e f1 1d 35 e9 ........}59~..5. 00:20:33.877 00000050 2b 3c 1c 85 62 7f 60 49 df d5 91 d6 cb 7c 2b b3 +<..b.`I.....|+. 00:20:33.877 00000060 3b fb ae e4 23 0d 96 9e 58 79 ed 6d 0b 44 06 fd ;...#...Xy.m.D.. 00:20:33.877 00000070 d0 fb d3 b4 58 0a 35 d1 de 08 77 00 96 ff 69 54 ....X.5...w...iT 00:20:33.877 00000080 fb 21 27 04 7a 27 09 9b 63 d5 bb 23 39 08 c9 00 .!'.z'..c..#9... 00:20:33.877 00000090 63 71 c8 8d 1b 5f f2 70 13 6d 37 23 cb 0d 6d 69 cq..._.p.m7#..mi 00:20:33.877 000000a0 4a 3c 06 e8 0c 41 23 22 5c 7f 37 09 f9 1b ec b1 J<...A#"\.7..... 00:20:33.877 000000b0 e9 df c3 a0 65 a3 65 89 4d f7 40 4b b3 d0 e3 87 ....e.e.M.@K.... 00:20:33.877 000000c0 1f ee bf cd d6 29 55 c3 1a f2 4f e4 81 90 47 97 .....)U...O...G. 00:20:33.877 000000d0 20 6f 51 88 eb 53 3c 93 31 93 50 f0 07 17 f1 42 oQ..S<.1.P....B 00:20:33.877 000000e0 f2 f6 5c bc 9f 36 7e 06 ca 33 dc 0b cd 67 5e aa ..\..6~..3...g^. 00:20:33.877 000000f0 5e 3d 5b 3f 5d 64 e6 9f 9d 67 de 72 5b b8 f1 94 ^=[?]d...g.r[... 
00:20:33.877 00000100 d8 c8 05 9a 36 9a b6 f8 f6 00 15 d2 c1 07 a7 29 ....6..........) 00:20:33.877 00000110 d0 03 6e 2a 0c fd 72 d1 db f8 23 61 8d db 7c b7 ..n*..r...#a..|. 00:20:33.877 00000120 94 08 d2 03 44 9a 72 26 dc b4 1a e4 b0 43 bc 0d ....D.r&.....C.. 00:20:33.877 00000130 ff f8 37 f1 d5 c7 5d 00 ef c5 16 d7 f9 52 96 98 ..7...]......R.. 00:20:33.877 00000140 61 14 21 d7 af c3 1f a1 8f dc d2 92 14 26 3a 2d a.!..........&:- 00:20:33.877 00000150 d4 76 49 c2 f7 93 0c 63 ef d0 36 6d 95 96 f2 f0 .vI....c..6m.... 00:20:33.877 00000160 76 58 98 98 52 aa 15 77 a9 a4 83 21 cf 05 3f 63 vX..R..w...!..?c 00:20:33.878 00000170 4c 52 74 29 82 d1 fb 05 33 ad 33 09 b4 fa 7c 0c LRt)....3.3...|. 00:20:33.878 00000180 c8 2a 24 42 c5 00 0e 38 51 ca 94 bf 63 ec 88 6a .*$B...8Q...c..j 00:20:33.878 00000190 76 d7 4e ea 4f 4c d5 5c 88 e0 cf 20 51 20 07 11 v.N.OL.\... Q .. 00:20:33.878 000001a0 3f d4 bf 65 9a 43 14 e9 90 16 23 35 cb e9 5c b6 ?..e.C....#5..\. 00:20:33.878 000001b0 cd 66 d5 e1 f0 02 d2 c5 9f 93 3d 5c 06 e3 19 ff .f........=\.... 00:20:33.878 000001c0 08 16 6b fc 86 a9 b1 e0 b4 31 c4 45 1f 30 07 5e ..k......1.E.0.^ 00:20:33.878 000001d0 96 ac 14 4b 7f 20 e0 69 b7 b9 7c be e3 22 9a 65 ...K. .i..|..".e 00:20:33.878 000001e0 6a a1 83 2e 83 64 72 84 41 83 28 5c 8a 81 92 83 j....dr.A.(\.... 00:20:33.878 000001f0 0f 64 00 fb 9a ab 86 6e 7d 7e 53 29 7d ca e8 ec .d.....n}~S)}... 00:20:33.878 00000200 d0 7b 3e 0e 30 c4 a1 f1 1c b3 74 a9 90 c3 93 63 .{>.0.....t....c 00:20:33.878 00000210 db c4 c2 e2 44 2f e5 43 53 66 ef 34 29 ac b9 73 ....D/.CSf.4)..s 00:20:33.878 00000220 f2 45 a5 85 b4 4d c3 3f f0 e7 bd 54 7c 08 58 10 .E...M.?...T|.X. 00:20:33.878 00000230 52 de ba 20 26 58 3e 0e db 88 55 f5 65 d2 2c df R.. &X>...U.e.,. 00:20:33.878 00000240 38 d1 aa 27 e1 e9 34 11 3b f2 47 94 2c 22 83 60 8..'..4.;.G.,".` 00:20:33.878 00000250 de 46 f1 85 12 21 3e 0d de 95 ef 67 73 1a 03 3c .F...!>....gs..< 00:20:33.878 00000260 29 2b e6 74 b5 e1 05 f6 47 4b 81 dd 02 d1 ba ef )+.t....GK...... 00:20:33.878 00000270 43 f9 d7 44 ea 35 ee 29 5f 62 e9 85 64 ba 21 7a C..D.5.)_b..d.!z 00:20:33.878 00000280 1d b0 4b cd b2 97 5a 36 ea 2d b2 33 36 be 22 78 ..K...Z6.-.36."x 00:20:33.878 00000290 ae ac d0 73 ca 1d 4b 7d df 2f 26 43 71 00 c0 eb ...s..K}./&Cq... 00:20:33.878 000002a0 48 fa 0d 18 18 47 17 08 5b c6 ea c2 71 b6 4e ab H....G..[...q.N. 00:20:33.878 000002b0 26 25 2f 18 04 58 c6 ed ca dd b8 d0 ef ca 23 12 &%/..X........#. 00:20:33.878 000002c0 64 0d 20 9e 3a 4f 5a 2e 74 8b df 33 a6 bb b6 13 d. .:OZ.t..3.... 00:20:33.878 000002d0 6b 3c a9 ba a0 cb e2 8c a5 3c 77 e4 13 03 1f 08 k<..........>#....P..n0 00:20:33.878 00000100 8b 3c cd e9 c8 19 96 1d 9f 75 8a 1e 65 85 5a 0c .<.......u..e.Z. 00:20:33.878 00000110 c2 7e 29 df f4 db db 37 fe cc c7 bb ec dc bf 5b .~)....7.......[ 00:20:33.878 00000120 03 b7 21 e4 95 cc de a4 b6 9f 50 46 d7 68 77 65 ..!.......PF.hwe 00:20:33.878 00000130 68 2f ee 91 f1 10 0e 2f 2d ca dc c2 c8 dc ff 09 h/...../-....... 00:20:33.878 00000140 0c c0 54 54 8c d0 2f 2e 06 59 dd 0b 1b 4c 7d 7f ..TT../..Y...L}. 00:20:33.878 00000150 0f ee 43 c9 56 a4 3c 70 aa 6a 4f a7 33 84 e4 58 ..C.V...j. 00:20:33.878 000001c0 3f 1c c0 20 e0 bf bd b7 16 d4 93 7b 92 02 b9 15 ?.. .......{.... 00:20:33.878 000001d0 0f ca 2b 8f f9 2a d2 4c 04 84 2b 34 b6 d2 d1 e1 ..+..*.L..+4.... 00:20:33.878 000001e0 88 e2 4c dd 9a 6f 73 6c 41 78 91 6c 07 3b 0f b2 ..L..oslAx.l.;.. 00:20:33.878 000001f0 cb b5 8a ad 33 30 cd ab 37 d6 f9 3f 66 36 9b ff ....30..7..?f6.. 
00:20:33.878 00000200 d7 d1 00 5d db 16 4c 95 b8 c6 c4 54 1a 49 e5 28 ...]..L....T.I.( 00:20:33.878 00000210 db 5a a2 9c c6 a6 30 28 36 a8 a4 13 f5 49 88 a5 .Z....0(6....I.. 00:20:33.879 00000220 d2 1a 8e 31 76 cb 8c 1b ff 31 bd b1 e3 c3 40 63 ...1v....1....@c 00:20:33.879 00000230 74 04 83 46 89 41 7d f3 d5 73 5a 76 51 24 9d 02 t..F.A}..sZvQ$.. 00:20:33.879 00000240 bc e9 48 0c 70 32 d8 eb 5c eb 1c 1a f1 3e 5a bf ..H.p2..\....>Z. 00:20:33.879 00000250 1b f3 fa 36 6c 91 c3 59 ed 08 dc b1 03 20 a8 a9 ...6l..Y..... .. 00:20:33.879 00000260 38 f1 0e 21 0b 12 11 76 af a2 a9 d8 f6 eb a9 77 8..!...v.......w 00:20:33.879 00000270 35 a3 22 bc 77 26 a4 bc 6e 60 5b 91 96 d0 55 4e 5.".w&..n`[...UN 00:20:33.879 00000280 82 60 fd d7 06 47 56 05 a9 ab d2 18 2e 16 88 d7 .`...GV......... 00:20:33.879 00000290 74 2f 72 84 74 de 91 f3 52 f4 1a 45 27 f2 09 22 t/r.t...R..E'.." 00:20:33.879 000002a0 d8 c1 53 b0 74 93 d0 7d d3 3d 35 36 07 c2 18 43 ..S.t..}.=56...C 00:20:33.879 000002b0 af 08 b4 be e6 33 2f dd c4 ff da 53 e4 83 99 8c .....3/....S.... 00:20:33.879 000002c0 32 b4 a7 3d 9a 81 08 f3 8f f3 18 0f bd 01 4f 71 2..=..........Oq 00:20:33.879 000002d0 80 14 d4 98 1c 49 02 6d 5d 43 52 a9 23 8c 13 39 .....I.m]CR.#..9 00:20:33.879 000002e0 ef 69 2c 10 17 75 6e f0 d1 94 1e 82 e5 f2 07 59 .i,..un........Y 00:20:33.879 000002f0 47 16 b0 2c 05 0f 73 87 eb f0 9e c0 59 e9 89 0d G..,..s.....Y... 00:20:33.879 [2024-09-27 15:25:10.475386] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=1, dhgroup=4, seq=3428451732, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.879 [2024-09-27 15:25:10.475502] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.879 [2024-09-27 15:25:10.532621] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.879 [2024-09-27 15:25:10.532659] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.879 [2024-09-27 15:25:10.532670] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.879 [2024-09-27 15:25:10.532696] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.879 [2024-09-27 15:25:10.702389] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.879 [2024-09-27 15:25:10.702408] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.879 [2024-09-27 15:25:10.702415] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.879 [2024-09-27 15:25:10.702466] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.879 [2024-09-27 15:25:10.702488] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.879 ctrlr pubkey: 00:20:33.879 00000000 74 6e b6 3d 60 4d c9 ab ef b8 14 0b fb 3d 91 2c tn.=`M.......=., 00:20:33.879 00000010 ea e3 88 25 
00:20:33.879 [remainder of ctrlr pubkey hex dump omitted; rows flattened and truncated in capture]
00:20:33.879 host pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.880 dh secret: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.880 [2024-09-27 15:25:10.750118] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=1, dhgroup=4, seq=3428451733, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32
00:20:33.880 [2024-09-27 15:25:10.787222] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply
00:20:33.880 [2024-09-27 15:25:10.787252] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1
00:20:33.880 [2024-09-27 15:25:10.787269] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully
00:20:33.880 [2024-09-27 15:25:10.787275] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done
00:20:33.880 [2024-09-27 15:25:10.893503] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate
00:20:33.880 [2024-09-27 15:25:10.893549] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256)
00:20:33.880 [2024-09-27 15:25:10.893574] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144)
00:20:33.880 [2024-09-27 15:25:10.893611] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate
00:20:33.880 [2024-09-27 15:25:10.893666] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge
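The negotiate entries above show the host offering digest sha256 and dhgroup ffdhe6144; the "ctrlr pubkey", "host pubkey" and "dh secret" dumps that follow record the public values exchanged for that group and the shared secret both sides derive. The following is a minimal Python sketch of that finite-field Diffie-Hellman relationship (public = g^x mod p, shared secret = peer_public^x mod p). It uses a small stand-in prime, not the real RFC 7919 ffdhe6144/ffdhe8192 parameters, and it is not SPDK's nvme_auth implementation.

# Illustrative finite-field Diffie-Hellman sketch; toy parameters only.
# P is a small stand-in modulus, NOT the ffdhe6144/ffdhe8192 primes used above.
import secrets

P = 0xFFFFFFFB   # 2**32 - 5, a small prime chosen only for demonstration
G = 2

def keypair():
    x = secrets.randbelow(P - 2) + 1      # private exponent
    return x, pow(G, x, P)                # (private, public = g^x mod p)

host_priv, host_pub = keypair()
ctrlr_priv, ctrlr_pub = keypair()

# Each side combines its own private key with the peer's public key;
# both arrive at the same value, the "dh secret" the log dumps.
host_secret = pow(ctrlr_pub, host_priv, P)
ctrlr_secret = pow(host_pub, ctrlr_priv, P)
assert host_secret == ctrlr_secret
print(hex(host_secret))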
00:20:33.880 ctrlr pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.880 host pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.881 dh secret: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.881 [remainder of dh secret hex dump omitted; the capture is truncated here and the intervening log lines are missing]
00:20:33.881 [unlabeled hex dump omitted; its rows match the ctrlr pubkey dumped again below]
00:20:33.882 host pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.883 [unlabeled hex dump omitted; label lost to truncation]
00:20:33.883 [2024-09-27 15:25:11.300626] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=1, dhgroup=5, seq=3428451735, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32
00:20:33.883 [2024-09-27 15:25:11.358612] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply
00:20:33.883 [2024-09-27 15:25:11.358661] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1
00:20:33.883 [2024-09-27 15:25:11.358678] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully
00:20:33.883 [2024-09-27 15:25:11.358693] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2
00:20:33.883 [2024-09-27 15:25:11.358711] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done
00:20:33.883 [2024-09-27 15:25:11.464633] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate
00:20:33.883 [2024-09-27 15:25:11.464650] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256)
00:20:33.883 [2024-09-27 15:25:11.464657] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192)
00:20:33.883 [2024-09-27 15:25:11.464667] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate
00:20:33.883 [2024-09-27 15:25:11.464721] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge
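Across the exchanges in this log, nvme_auth_set_state walks each qpair through the same sequence: negotiate, await-negotiate, await-challenge, await-reply, await-success1, await-success2 (sometimes skipped, as in the first block above), and done. The small Python sketch below validates an observed sequence against that ordering; the state names are taken from the log, while the transition table itself is inferred for illustration, not taken from nvme_auth.c.

# Auth states as printed by nvme_auth_set_state in this log. The allowed-next
# map is inferred from the sequences visible here, not from nvme_auth.c itself.
ALLOWED_NEXT = {
    "negotiate":       {"await-negotiate"},
    "await-negotiate": {"await-challenge"},
    "await-challenge": {"await-reply"},
    "await-reply":     {"await-success1"},
    "await-success1":  {"await-success2", "done"},  # success2 is sometimes skipped
    "await-success2":  {"done"},
    "done":            set(),
}

def follows_log_order(states):
    """True if a list of auth states follows the order observed in this log."""
    return all(nxt in ALLOWED_NEXT.get(cur, set())
               for cur, nxt in zip(states, states[1:]))

# Example: the qpair sequence from the block above.
print(follows_log_order(["negotiate", "await-negotiate", "await-challenge",
                         "await-reply", "await-success1", "await-success2",
                         "done"]))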
00:20:33.883 ctrlr pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.884 host pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.884 dh secret: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.885 [2024-09-27 15:25:11.578047] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=1, dhgroup=5, seq=3428451736, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32
00:20:33.885 [2024-09-27 15:25:11.578161] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply
00:20:33.885 [2024-09-27 15:25:11.658712] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1
00:20:33.885 [2024-09-27 15:25:11.658762] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully
00:20:33.885 [2024-09-27 15:25:11.658772] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2
00:20:33.885 [2024-09-27 15:25:11.658798] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done
00:20:33.885 [2024-09-27 15:25:11.852000] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate
00:20:33.885 [2024-09-27 15:25:11.852019] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256)
00:20:33.885 [2024-09-27 15:25:11.852026] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192)
00:20:33.885 [2024-09-27 15:25:11.852070] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate
00:20:33.885 [2024-09-27 15:25:11.852092] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge
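Each pubkey/secret dump in this log is printed as rows of "offset, up to 16 hex bytes, ASCII". When comparing dumps (for instance, checking that the host and controller report matching values), it helps to turn the rows back into bytes. A minimal sketch follows; it assumes one dump row per line and ignores label and ASCII columns, which matches the row format shown above but is otherwise an assumption, not an SPDK tool.

# Parse "offset  xx xx ... ascii" dump rows (as printed above) back into bytes.
# Assumes one row per line; rows may carry a leading console timestamp.
def parse_hexdump(text):
    hexdigits = set("0123456789abcdef")
    out = bytearray()
    for line in text.splitlines():
        toks = line.split()
        # locate the 8-hex-digit offset column
        starts = [i for i, t in enumerate(toks)
                  if len(t) == 8 and set(t.lower()) <= hexdigits]
        if not starts:
            continue
        for tok in toks[starts[0] + 1: starts[0] + 17]:   # at most 16 data bytes
            if len(tok) == 2 and set(tok.lower()) <= hexdigits:
                out.append(int(tok, 16))
            else:
                break                                     # reached the ASCII column
    return bytes(out)

sample = """\
00000000 33 74 7b 38 a2 f0 da a7 6d 50 36 d2 4b a8 7f 8e 3t{8....mP6.K...
00000010 6c 3c 7d 15 48 21 cb 8b bf 01 3e 0f c8 85 18 9e l<}.H!....>.....
"""
print(parse_hexdump(sample).hex())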
00:20:33.885 ctrlr pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.886 host pubkey: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.886 dh secret: [hex dump omitted; rows flattened and truncated in capture]
00:20:33.886 000002e0 48 83 f7 a4 ca 1b c4 44 7c 6c ce e1 1a 86 29 97 H......D|l....). 00:20:33.886 000002f0 ca 16 b1 65 a5 aa 9b 5a ea a2 86 a9 47 51 c3 60 ...e...Z....GQ.` 00:20:33.886 00000300 0e d4 df 8d 50 2c 8f 93 ed f9 66 e2 bb 2b ce 2c ....P,....f..+., 00:20:33.886 00000310 4f c3 4d 00 02 04 63 21 71 c4 1c 17 97 73 a5 76 O.M...c!q....s.v 00:20:33.886 00000320 51 c8 f9 a0 d6 f8 50 51 27 a4 cf bb d7 c2 bd f6 Q.....PQ'....... 00:20:33.886 00000330 cc 6d f2 31 0a 7a e5 c6 48 45 a1 86 ac 5f 10 da .m.1.z..HE..._.. 00:20:33.886 00000340 08 2b 93 ed 32 56 3f 05 1b 3a 65 37 92 f5 a5 bd .+..2V?..:e7.... 00:20:33.886 00000350 21 11 60 36 b6 3a 16 15 54 2c fa 49 b9 a7 55 86 !.`6.:..T,.I..U. 00:20:33.886 00000360 f1 1a 7b 81 1a 52 80 8d 64 c7 28 7d 40 48 68 a4 ..{..R..d.(}@Hh. 00:20:33.886 00000370 b9 ae f1 b4 5c 64 24 dd eb 70 bf 10 cc 3d 0b 52 ....\d$..p...=.R 00:20:33.886 00000380 08 d0 4e 42 25 bd 55 db cb e2 a0 24 e7 f8 d1 b0 ..NB%.U....$.... 00:20:33.886 00000390 d8 2f 4f 0c 5d 33 7b af cc db 90 6e 29 66 cc 4f ./O.]3{....n)f.O 00:20:33.886 000003a0 8b d8 c7 4e 10 ca 4a 9e 43 8a 91 b7 97 21 03 af ...N..J.C....!.. 00:20:33.886 000003b0 d2 0e a2 2d c9 ae 1f 14 0e a9 d4 8d c4 8c 21 1e ...-..........!. 00:20:33.886 000003c0 04 5b 50 46 d3 52 9b 3d 2f 4e 38 7d 66 fb da aa .[PF.R.=/N8}f... 00:20:33.886 000003d0 93 2d e0 02 42 29 cb c4 68 2b 9d db e0 2d b4 12 .-..B)..h+...-.. 00:20:33.886 000003e0 de be 04 68 71 cf f9 c3 ff 8a 7f 8d e6 27 50 07 ...hq........'P. 00:20:33.886 000003f0 84 c4 bd ed 1b 15 f8 a9 4c d4 4b f7 31 67 eb 04 ........L.K.1g.. 00:20:33.886 [2024-09-27 15:25:11.971198] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=5, seq=3428451737, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.886 [2024-09-27 15:25:12.031945] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.886 [2024-09-27 15:25:12.031997] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.886 [2024-09-27 15:25:12.032015] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.887 [2024-09-27 15:25:12.032035] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.887 [2024-09-27 15:25:12.032050] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.887 [2024-09-27 15:25:12.138350] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.887 [2024-09-27 15:25:12.138367] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.887 [2024-09-27 15:25:12.138375] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.887 [2024-09-27 15:25:12.138385] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.887 [2024-09-27 15:25:12.138442] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] 
auth state: await-challenge 00:20:33.887 ctrlr pubkey: 00:20:33.887 00000000 33 74 7b 38 a2 f0 da a7 6d 50 36 d2 4b a8 7f 8e 3t{8....mP6.K... 00:20:33.887 00000010 6c 3c 7d 15 48 21 cb 8b bf 01 3e 0f c8 85 18 9e l<}.H!....>..... 00:20:33.887 00000020 b7 f3 76 d7 70 a5 9e 27 11 49 f8 61 28 55 ff 58 ..v.p..'.I.a(U.X 00:20:33.887 00000030 cb d3 88 be 4c e5 05 b3 e3 19 d4 43 a9 56 e2 77 ....L......C.V.w 00:20:33.887 00000040 bd 80 5f a7 46 7b d4 98 1a 0e 80 63 d4 12 b7 b9 .._.F{.....c.... 00:20:33.887 00000050 1a a0 04 03 5e cb 53 47 a3 a7 54 45 93 e0 57 db ....^.SG..TE..W. 00:20:33.887 00000060 bd db 20 87 26 17 3f 8a 45 53 d9 d9 ca 30 c4 53 .. .&.?.ES...0.S 00:20:33.887 00000070 d2 33 92 fb 55 a5 35 58 af ab 6d ec 1b ee 4b 67 .3..U.5X..m...Kg 00:20:33.887 00000080 6e 4d 63 9a db ea 9f 71 a8 9d 68 3c 51 c2 ff e8 nMc....q..ht.LQ<..Ld`.*" 00:20:33.887 00000120 cb 31 19 32 b5 48 ed b9 66 3d 6d ed 81 f6 41 d4 .1.2.H..f=m...A. 00:20:33.887 00000130 1e c9 3e 4c e1 77 55 7c f2 59 f2 da 8d bb 41 e5 ..>L.wU|.Y....A. 00:20:33.887 00000140 fd a4 e8 28 6b 79 fd 91 23 91 65 f3 30 ca 33 c4 ...(ky..#.e.0.3. 00:20:33.887 00000150 59 77 fa b9 58 23 d5 c7 ee 16 4b d5 94 f2 5e 20 Yw..X#....K...^ 00:20:33.887 00000160 44 08 4c 78 2b 08 72 dd 10 47 35 b6 a5 ac a5 57 D.Lx+.r..G5....W 00:20:33.887 00000170 e7 45 ed d7 a7 6d 37 ab 50 53 c7 cf 80 8c 62 ed .E...m7.PS....b. 00:20:33.887 00000180 9f b2 c8 2b 8e 6d 5b f6 0f ef e0 cc 9c 9b 11 72 ...+.m[........r 00:20:33.887 00000190 80 2a 23 19 1c c6 07 02 7f ac 1e 8f 3c c2 a0 f7 .*#.........<... 00:20:33.887 000001a0 74 6c 64 b1 d4 fa 33 cb 2b e1 c0 9b 81 aa 12 b5 tld...3.+....... 00:20:33.887 000001b0 f3 1e 4c a0 06 f2 8e 9b 83 f9 b2 f0 3d ba 8d 05 ..L.........=... 00:20:33.887 000001c0 cb a3 e9 11 f0 87 44 17 b7 58 75 a4 01 4c d2 8e ......D..Xu..L.. 00:20:33.887 000001d0 ce b8 4d 22 f8 09 20 6b 94 fe 21 8b 17 4a fc a9 ..M".. k..!..J.. 00:20:33.887 000001e0 93 51 90 f6 89 9f c2 fd 61 e0 02 6f 84 42 41 bd .Q......a..o.BA. 00:20:33.887 000001f0 07 2f 29 3d 56 47 d8 5f e8 d6 b3 d8 08 40 d5 5e ./)=VG._.....@.^ 00:20:33.887 00000200 1e e2 dc bb aa b3 41 8b 86 b9 48 2f 33 04 ec 17 ......A...H/3... 00:20:33.887 00000210 9b a4 77 0f 2b 54 7b 3c e2 e3 6a a3 ae 31 2d c1 ..w.+T{<..j..1-. 00:20:33.887 00000220 73 93 bb 24 1c 4a f6 56 6e 2e fa 73 7f 8a 23 05 s..$.J.Vn..s..#. 00:20:33.887 00000230 d2 eb 46 25 1d 30 29 89 b2 1d 12 ce 18 9c 35 f8 ..F%.0).......5. 00:20:33.887 00000240 f4 7e 00 30 3f 20 ca 4e 04 5b 37 5f 57 c6 7b 8b .~.0? .N.[7_W.{. 00:20:33.887 00000250 fa 33 51 d6 28 98 33 3c e8 81 a4 c7 08 cf a2 44 .3Q.(.3<.......D 00:20:33.887 00000260 c6 74 21 ca e6 eb 33 f1 dc 7a 8b 01 49 e4 66 be .t!...3..z..I.f. 00:20:33.887 00000270 4c ee 92 42 dc 26 03 b2 3c 89 a3 80 ce 5c 88 f7 L..B.&..<....\.. 00:20:33.887 00000280 29 9f 18 52 63 43 d2 5d 94 5e 59 15 ce c4 26 df )..RcC.].^Y...&. 00:20:33.887 00000290 62 f5 8b 5a ee ce 82 f5 4b f8 96 8b 7d 9e ab a4 b..Z....K...}... 00:20:33.887 000002a0 51 33 c9 16 b5 83 d6 41 05 d4 7c a3 79 2e 16 f5 Q3.....A..|.y... 00:20:33.887 000002b0 e2 63 64 51 ff d8 63 26 d3 d4 63 fc 6a 92 93 76 .cdQ..c&..c.j..v 00:20:33.887 000002c0 7b c3 f8 1a d2 c0 37 b3 97 05 a2 6f 8d 26 9b 60 {.....7....o.&.` 00:20:33.887 000002d0 9d d6 ee 76 1c e8 6d 40 a9 ec 7c 06 ec fe ae cc ...v..m@..|..... 00:20:33.887 000002e0 1f 71 50 21 d7 0d 24 0c 95 3a 1b 42 54 cb f2 9e .qP!..$..:.BT... 00:20:33.887 000002f0 4f c9 d8 a7 a6 cb 2f 2b 9b ad a6 c5 ed 28 56 36 O...../+.....(V6 00:20:33.887 00000300 79 71 75 b0 3c 68 38 79 40 05 30 3b 41 64 db b9 yqu..K...... 
00:20:33.887 00000380 ec e4 d5 f0 ef 1a d3 60 3d 3c e4 a3 ea 25 2e d2 .......`=<...%.. 00:20:33.887 00000390 b5 99 fa 86 21 30 51 58 a6 92 09 37 b8 1f df b3 ....!0QX...7.... 00:20:33.887 000003a0 61 a5 8d 34 df 69 c1 a3 57 35 cf d0 f9 3c 3b 52 a..4.i..W5...<;R 00:20:33.887 000003b0 f8 17 83 52 4f 4e f7 e7 c5 5d 99 80 11 c5 e6 98 ...RON...]...... 00:20:33.887 000003c0 af a1 f9 98 2e e9 3a 2a 01 d1 d1 4f ed 8e 3d 12 ......:*...O..=. 00:20:33.887 000003d0 00 f8 c7 62 9a b0 9f db 67 11 9a b9 ca 5e 27 87 ...b....g....^'. 00:20:33.887 000003e0 0a 92 37 9c 80 1a ba e9 bb 90 80 b9 d1 40 20 01 ..7..........@ . 00:20:33.887 000003f0 df b3 38 b0 2d 3a 85 b8 d4 f7 8e 82 31 34 73 e8 ..8.-:......14s. 00:20:33.887 host pubkey: 00:20:33.887 00000000 dc ec ff 21 7c a1 0d f0 86 3c dd 52 8e 9e 44 86 ...!|....<.R..D. 00:20:33.887 00000010 0d 7c 25 42 23 8e 1d 68 6e e3 4a 5d 9c 11 a3 43 .|%B#..hn.J]...C 00:20:33.887 00000020 0c 4f 68 ba 94 bc 3d 72 7b 24 1b 54 36 95 48 c9 .Oh...=r{$.T6.H. 00:20:33.887 00000030 47 e2 a8 fb b4 1d c1 ce fa c9 ee c9 06 ed 5d 5b G.............][ 00:20:33.887 00000040 4f 3b 50 81 95 21 f2 4f 99 ff b7 52 95 52 f4 e1 O;P..!.O...R.R.. 00:20:33.887 00000050 03 83 ff 71 e5 76 17 a6 b9 91 a9 2c ed 1b 53 0b ...q.v.....,..S. 00:20:33.887 00000060 32 09 b6 4d e2 d6 f6 b8 da e6 6e 9d 20 ec b7 f8 2..M......n. ... 00:20:33.887 00000070 00 4c f4 92 31 db 82 cf ae 5f 5f 9f 54 5b 8c 35 .L..1....__.T[.5 00:20:33.887 00000080 c1 88 9d 0d be e0 0d bf 19 15 68 ab 8b c2 e8 68 ..........h....h 00:20:33.887 00000090 70 ec 98 b3 8b 37 89 39 e4 4e c8 d6 15 3a 47 eb p....7.9.N...:G. 00:20:33.887 000000a0 35 4c 7a eb 76 cb d1 7b c2 18 10 f2 16 60 f8 a5 5Lz.v..{.....`.. 00:20:33.887 000000b0 59 f4 b6 42 95 2b 11 87 7c 1e d9 a3 5f b8 c5 9c Y..B.+..|..._... 00:20:33.887 000000c0 9e 60 65 2b c3 90 70 d3 65 ad 3a 10 fd 28 ab 6a .`e+..p.e.:..(.j 00:20:33.887 000000d0 26 8f af 7e ef 3e 66 8d 98 1a bb 66 d2 9c 9b a8 &..~.>f....f.... 00:20:33.887 000000e0 7f 7d 39 7e df 2f e4 c0 17 46 f2 d0 d6 06 f3 d9 .}9~./...F...... 00:20:33.887 000000f0 b6 38 ce 0d 4a 86 c8 6d 96 7f 85 32 29 78 e1 9a .8..J..m...2)x.. 00:20:33.887 00000100 bb d7 f4 8d 75 89 d3 96 a3 13 8e 41 92 cc 5f 3e ....u......A.._> 00:20:33.887 00000110 f2 4c d5 8c e9 34 55 ab 97 37 19 cb af 99 49 29 .L...4U..7....I) 00:20:33.887 00000120 67 96 d7 df 49 a6 71 01 a9 06 92 dc a5 38 4b 94 g...I.q......8K. 00:20:33.887 00000130 15 9f 6c dc e7 ab f4 54 0e 2b f4 37 9c d6 a9 f6 ..l....T.+.7.... 00:20:33.887 00000140 00 2b 40 4d b7 27 82 e3 e0 b0 34 dc a0 5b 56 a8 .+@M.'....4..[V. 00:20:33.887 00000150 be 8c e7 59 d9 72 00 7a a8 66 90 f0 c5 81 ef 1b ...Y.r.z.f...... 00:20:33.887 00000160 5a c6 b7 0f b3 b8 32 3a 33 1a ce 09 70 19 b6 99 Z.....2:3...p... 00:20:33.887 00000170 fd 63 a7 ca 9e 36 6e 70 33 73 39 26 2b ba c9 57 .c...6np3s9&+..W 00:20:33.887 00000180 71 84 b4 99 95 34 e3 07 71 a0 9c ec c6 71 ce df q....4..q....q.. 00:20:33.887 00000190 b0 ba 8f 44 20 84 4e 8d b3 92 0a 49 d9 21 9f d0 ...D .N....I.!.. 00:20:33.887 000001a0 53 35 d7 8d c7 55 f6 ed 61 d1 d4 96 89 ca 10 6b S5...U..a......k 00:20:33.887 000001b0 4b b2 c4 94 18 67 ff e8 5d 20 3c 9c 90 ba b0 83 K....g..] <..... 00:20:33.887 000001c0 e0 4d 5f 51 e7 c5 99 db 78 aa fa 9b af 03 2c c8 .M_Q....x.....,. 00:20:33.887 000001d0 bc e0 3a 14 5f 10 1c 6d f0 c4 6e 4b a4 56 26 c9 ..:._..m..nK.V&. 00:20:33.887 000001e0 83 f8 23 3b 0e f7 3c 67 68 ce 24 60 c4 7c 62 2f ..#;.... 00:20:33.887 000002b0 31 9f 2d c5 32 b3 95 d1 60 cb f8 71 89 de 3a bb 1.-.2...`..q..:. 
00:20:33.887 000002c0 64 dd 36 9a e3 fa 06 87 26 6e b8 56 89 88 a5 b6 d.6.....&n.V.... 00:20:33.887 000002d0 4f 1a b3 33 e5 4e ff 1d 6b 86 82 af 65 2f 57 cf O..3.N..k...e/W. 00:20:33.887 000002e0 9b ad 62 5b 05 49 0b 88 c3 53 41 b0 89 57 16 60 ..b[.I...SA..W.` 00:20:33.887 000002f0 b3 eb 28 6b 29 2b 92 79 f0 bd 47 9f c5 7c 1c 81 ..(k)+.y..G..|.. 00:20:33.887 00000300 1c 47 df a3 b5 59 e2 a4 17 10 fa e8 9a c8 cf 0b .G...Y.......... 00:20:33.887 00000310 27 c3 77 07 05 1f cb 55 b7 59 e5 88 9f 97 23 0d '.w....U.Y....#. 00:20:33.887 00000320 34 3f 22 27 d8 df fe c2 aa 01 29 f4 e2 49 88 4b 4?"'......)..I.K 00:20:33.887 00000330 1b 3c 02 43 9a 69 e3 d4 e9 21 7f 41 dd fa 83 ea .<.C.i...!.A.... 00:20:33.887 00000340 9c d9 b4 18 ee be 68 4f cf 35 d7 0a 2b af a8 22 ......hO.5..+.." 00:20:33.887 00000350 ff e8 bb 7f b1 36 12 8e e2 a8 ab 72 78 00 2c 2b .....6.....rx.,+ 00:20:33.887 00000360 09 5c 28 c3 9c a2 01 04 8a de e1 25 9e 2c dd 48 .\(........%.,.H 00:20:33.887 00000370 55 68 6f 1a 9a 9b 01 2a 73 6b 80 54 6c 70 f3 42 Uho....*sk.Tlp.B 00:20:33.887 00000380 3e 0b 29 55 0c 42 82 39 07 96 a7 44 af 72 6e d7 >.)U.B.9...D.rn. 00:20:33.887 00000390 15 d3 3c dd a4 29 76 1e 8d e9 77 f6 96 f1 c9 0b ..<..)v...w..... 00:20:33.887 000003a0 2e f2 aa ce c2 9c 79 bc 53 1f f6 41 11 ca 21 0c ......y.S..A..!. 00:20:33.887 000003b0 0e f8 cf 0c 1d 15 b3 9a cb de 49 eb 14 68 34 2c ..........I..h4, 00:20:33.887 000003c0 9b fd 43 16 6f d9 77 f0 fb fa 0d a4 e5 ed 27 2f ..C.o.w.......'/ 00:20:33.887 000003d0 6b 7b 46 f8 20 c6 19 04 5e 8d ac 97 17 d0 66 c3 k{F. ...^.....f. 00:20:33.887 000003e0 11 3c c9 ee 55 81 9c ba 24 fc 46 da 20 cb 00 b1 .<..U...$.F. ... 00:20:33.887 000003f0 be c7 9a ec 0b c0 95 4f 32 b5 aa 48 3b 7b c6 14 .......O2..H;{.. 00:20:33.888 dh secret: 00:20:33.888 00000000 f2 f9 46 4c 69 02 3a b8 c7 a7 26 4a 41 4c 66 00 ..FLi.:...&JALf. 00:20:33.888 00000010 65 c0 0d 07 22 9b ea d7 a7 58 6e 1d 44 d1 dc 67 e..."....Xn.D..g 00:20:33.888 00000020 10 e2 15 08 65 25 d8 4b b7 39 b1 a9 d6 be 33 82 ....e%.K.9....3. 00:20:33.888 00000030 fd 0a f3 6e b3 68 f4 8f bb b4 0a 40 45 50 13 c1 ...n.h.....@EP.. 00:20:33.888 00000040 91 15 e8 54 9c c1 3d 56 d1 50 d1 d3 a3 18 89 cd ...T..=V.P...... 00:20:33.888 00000050 c4 78 8d e3 68 73 95 55 ab d8 c2 be 92 8a 1a de .x..hs.U........ 00:20:33.888 00000060 1e c7 80 67 c7 39 3f 08 19 ce 2b 50 a3 7b 02 83 ...g.9?...+P.{.. 00:20:33.888 00000070 77 ca a6 a1 cf 66 3b 98 73 38 37 df 66 57 0e c2 w....f;.s87.fW.. 00:20:33.888 00000080 2c cf 5b 66 66 6a ec 91 ec 90 4d 70 b3 0a db 0b ,.[ffj....Mp.... 00:20:33.888 00000090 80 7a a5 c4 01 ca 89 14 19 73 20 58 14 e1 70 6f .z.......s X..po 00:20:33.888 000000a0 74 e3 a2 c8 7b f4 60 79 c4 b8 1b bd d3 cf d0 b9 t...{.`y........ 00:20:33.888 000000b0 a9 3f 74 53 04 d7 7b b0 3e ff 23 f3 e0 56 45 d4 .?tS..{.>.#..VE. 00:20:33.888 000000c0 a5 05 fe 04 9c 50 25 0d fe 04 0d 65 fd c6 43 cf .....P%....e..C. 00:20:33.888 000000d0 22 ca a2 7e 18 cb f0 59 17 5f 73 98 3b f8 50 ad "..~...Y._s.;.P. 00:20:33.888 000000e0 c6 a1 ce e9 82 96 83 db de 47 f7 be 1d 73 7e ec .........G...s~. 00:20:33.888 000000f0 49 5d e5 d1 ae 5c 77 c0 f8 d3 0b 22 9e d3 44 22 I]...\w...."..D" 00:20:33.888 00000100 df 1f b4 c6 68 aa 15 88 0c a8 84 39 2a f3 64 90 ....h......9*.d. 00:20:33.888 00000110 35 2c 99 97 2b 0b 0c 32 97 25 60 c8 b3 6c 21 41 5,..+..2.%`..l!A 00:20:33.888 00000120 b9 93 87 5e 9c 2e ef ff be a6 b9 47 db cc 73 7f ...^.......G..s. 00:20:33.888 00000130 ba 53 a6 6a c9 ef 45 18 40 ed 34 f9 d5 1d 48 fe .S.j..E.@.4...H. 
00:20:33.888 00000140 63 71 82 ec ef a1 8c 25 98 d8 5a e4 df 1d bd 54 cq.....%..Z....T 00:20:33.888 00000150 01 4f 59 4b aa 36 b0 61 79 9e bf 21 a5 0b 35 76 .OYK.6.ay..!..5v 00:20:33.888 00000160 b9 a0 b2 6f fb 91 27 35 22 7b 53 34 a9 51 95 ad ...o..'5"{S4.Q.. 00:20:33.888 00000170 62 c3 4d b0 7d b1 5f d3 ef 6d b8 c6 2c 9a fc 8b b.M.}._..m..,... 00:20:33.888 00000180 47 be 89 35 74 36 19 2a 39 c6 0c 6c d4 49 c4 90 G..5t6.*9..l.I.. 00:20:33.888 00000190 ad bb 99 b8 0b 50 42 89 bc 2e 99 c6 db 88 93 0f .....PB......... 00:20:33.888 000001a0 e2 2e 24 a5 41 b7 78 73 ce e1 20 6f eb f6 03 95 ..$.A.xs.. o.... 00:20:33.888 000001b0 36 f8 c6 58 62 0e 7b da e3 d2 b2 8a a2 70 82 a4 6..Xb.{......p.. 00:20:33.888 000001c0 f2 f9 a0 0d 66 e4 a5 f9 73 94 02 43 26 7a b3 e2 ....f...s..C&z.. 00:20:33.888 000001d0 fb 9b 2f 56 21 27 15 7b fe f3 db 60 fd 3e ed 75 ../V!'.{...`.>.u 00:20:33.888 000001e0 05 a0 6f bb 22 bb 02 61 ec 4b c6 64 86 c6 8a e2 ..o."..a.K.d.... 00:20:33.888 000001f0 74 b3 26 de 5d 2f b1 df 7f 47 20 74 c4 f7 6b f9 t.&.]/...G t..k. 00:20:33.888 00000200 62 91 d8 3d 51 9b 56 96 a6 d0 fa 81 c3 1a a8 88 b..=Q.V......... 00:20:33.888 00000210 3e 68 1b 58 91 2e 95 df f8 96 3c a5 9e a9 b8 fa >h.X......<..... 00:20:33.888 00000220 ec b6 19 1e 99 31 87 91 1d f2 56 1c 12 8b 23 1f .....1....V...#. 00:20:33.888 00000230 12 76 0d 32 bb a4 12 b1 91 d8 4f d0 e2 c7 24 c5 .v.2......O...$. 00:20:33.888 00000240 5a 60 32 72 7f 12 0d 56 ff 84 21 45 73 39 db 30 Z`2r...V..!Es9.0 00:20:33.888 00000250 fe 24 3c c0 2e 7c 61 e8 43 26 c1 b8 f8 36 e4 c9 .$<..|a.C&...6.. 00:20:33.888 00000260 0e 99 1d ee 96 d6 42 5f 86 2f 5b 2a 0c 68 8d e9 ......B_./[*.h.. 00:20:33.888 00000270 02 ad 16 ba 65 68 fb 5d c5 a2 90 56 97 6e a8 58 ....eh.]...V.n.X 00:20:33.888 00000280 0d a4 ad 5c f1 3c d1 16 b6 98 65 e7 e2 b1 30 ec ...\.<....e...0. 00:20:33.888 00000290 e9 2a 3e 23 b6 74 fd b8 4f ae d0 a3 42 9d 8c 40 .*>#.t..O...B..@ 00:20:33.888 000002a0 e0 43 e7 a3 ca 25 02 e1 32 ec 53 bf 05 48 dc a1 .C...%..2.S..H.. 00:20:33.888 000002b0 8d ec 3a b6 49 2b ef aa cd b3 84 1e 29 19 f2 0e ..:.I+......)... 00:20:33.888 000002c0 cd b7 df c2 17 c5 20 0c 9e 63 93 1d 0c ae 40 ca ...... ..c....@. 00:20:33.888 000002d0 9d 23 66 cd 06 f9 2d 24 61 9a 51 52 aa e5 5c 9a .#f...-$a.QR..\. 00:20:33.888 000002e0 75 ed 48 ef 60 0c dd ce 7c fb 1d d9 88 c7 b2 04 u.H.`...|....... 00:20:33.888 000002f0 2d 0a ce bc 0b fe 0e e5 a2 19 92 3c 64 3f 5b ff -..........MuL 00:20:33.888 00000360 ba 28 1d 41 93 24 b0 b4 a9 85 bb 84 fc 6e b9 17 .(.A.$.......n.. 00:20:33.888 00000370 d3 7c d8 22 45 28 b1 c7 13 80 d4 1d e7 e9 d3 b2 .|."E(.......... 00:20:33.888 00000380 3a a5 89 53 20 1c f1 da 63 b2 75 95 0c 5b eb e1 :..S ...c.u..[.. 00:20:33.888 00000390 b7 3f e0 5e ca ef 5f c0 11 e8 cd e0 47 b2 ce 39 .?.^.._.....G..9 00:20:33.888 000003a0 77 e6 a3 99 36 94 72 38 85 21 c4 63 02 cd 05 49 w...6.r8.!.c...I 00:20:33.888 000003b0 e1 48 d1 1e 1b f4 28 e1 0c 19 80 7a 63 33 40 3b .H....(....zc3@; 00:20:33.888 000003c0 94 18 e6 da 42 87 18 2b a4 09 34 7e 0f a2 76 20 ....B..+..4~..v 00:20:33.888 000003d0 aa dc af 1e 51 37 6a 44 f1 ee a6 b7 4e bb bd f6 ....Q7jD....N... 
00:20:33.888 000003e0 16 a2 c0 84 24 25 23 08 d4 6a f0 aa 69 53 2c 42 ....$%#..j..iS,B 00:20:33.888 000003f0 ff d1 9b f2 0e e9 87 51 83 2e 9e 05 ab e5 b9 4e .......Q.......N 00:20:33.888 [2024-09-27 15:25:12.249424] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=1, dhgroup=5, seq=3428451738, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.888 [2024-09-27 15:25:12.249538] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.888 [2024-09-27 15:25:12.332663] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.888 [2024-09-27 15:25:12.332710] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.888 [2024-09-27 15:25:12.332720] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.888 [2024-09-27 15:25:12.332746] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.888 [2024-09-27 15:25:12.522600] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.888 [2024-09-27 15:25:12.522620] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.888 [2024-09-27 15:25:12.522628] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.888 [2024-09-27 15:25:12.522671] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.888 [2024-09-27 15:25:12.522698] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.888 ctrlr pubkey: 00:20:33.888 00000000 1d b7 d9 5c 24 d3 b9 8d 18 8e ea 89 fe c0 b5 f7 ...\$........... 00:20:33.888 00000010 a8 ef 85 c4 f7 23 11 13 6a 0b b3 0e 29 e8 f4 fd .....#..j...)... 00:20:33.888 00000020 c3 d0 7b 1b a3 f0 bc 4d 76 be 28 e8 c6 ac b3 1a ..{....Mv.(..... 00:20:33.888 00000030 15 5e 3e dd ec 96 60 c4 e2 17 ca f1 e3 6c ca 3b .^>...`......l.; 00:20:33.888 00000040 b0 50 f0 29 a9 cb 1e a7 1d 16 c5 b3 74 63 33 41 .P.)........tc3A 00:20:33.888 00000050 2c 54 b4 82 5c e0 23 78 42 18 8f 36 d2 76 b6 0f ,T..\.#xB..6.v.. 00:20:33.888 00000060 22 a4 9c b7 e1 02 52 e4 df 40 f7 d6 b9 2b 61 07 ".....R..@...+a. 00:20:33.888 00000070 f5 26 45 a2 e9 26 76 2c 82 dc 8c e2 ac 73 1d 86 .&E..&v,.....s.. 00:20:33.888 00000080 e1 75 19 cd ed a5 be 4a 1c ca 4f 95 f7 75 02 cb .u.....J..O..u.. 00:20:33.888 00000090 b3 7c 8e 7b 99 11 4e 86 8d b0 83 e0 ad 37 ec f4 .|.{..N......7.. 00:20:33.888 000000a0 b1 92 31 14 9b a7 32 3e ca e5 cd 37 b5 f8 8b d7 ..1...2>...7.... 00:20:33.888 000000b0 58 99 c3 80 9a ea c4 db a8 0f 29 05 47 44 16 2b X.........).GD.+ 00:20:33.888 000000c0 c5 90 04 24 99 cd 2b fd 00 7e 76 0f e4 cb eb 10 ...$..+..~v..... 
00:20:33.888 000000d0 0c a1 85 97 2b ef b2 dc d6 03 d7 57 79 3f 60 45 ....+......Wy?`E 00:20:33.888 000000e0 7e af 7c cd 4f b4 f1 9f cd d2 5c e2 6e 5c ed 60 ~.|.O.....\.n\.` 00:20:33.888 000000f0 b1 34 7b 3a 89 c6 1a eb 13 b3 ec 1e 8d ad d5 10 .4{:............ 00:20:33.888 00000100 47 03 1a b2 5d 5d 13 c7 6a f9 0c 4b 86 72 1e 04 G...]]..j..K.r.. 00:20:33.888 00000110 1a dd 63 bc 9e 5c d4 69 7b 2f ab 9e 87 05 4c f5 ..c..\.i{/....L. 00:20:33.888 00000120 e9 e8 f4 84 08 a7 8d e5 74 00 e8 92 e8 63 17 37 ........t....c.7 00:20:33.888 00000130 2a 83 60 48 af 06 9d 06 1c 78 2c 59 60 93 18 91 *.`H.....x,Y`... 00:20:33.888 00000140 4f 14 69 e7 fa 06 6f 01 5e 3c 8a da 55 84 4e 52 O.i...o.^<..U.NR 00:20:33.888 00000150 17 78 4d ef 3b bc e3 d2 dc 24 52 ea 41 18 c1 a8 .xM.;....$R.A... 00:20:33.888 00000160 c6 a3 bb 2a 27 fd 11 e5 61 57 8d 22 f8 9d e0 85 ...*'...aW.".... 00:20:33.888 00000170 63 f1 8e 2a 9b 10 7b 54 56 db 6d 28 60 08 26 d1 c..*..{TV.m(`.&. 00:20:33.888 00000180 72 fd e4 da 16 f6 f4 4c e2 4a 54 c5 2b 41 ca eb r......L.JT.+A.. 00:20:33.888 00000190 45 5e a8 25 ae 5b 3f 03 62 33 6b c5 bf 53 35 e9 E^.%.[?.b3k..S5. 00:20:33.888 000001a0 e9 f7 9a 61 41 dc 4e 37 4d ea 1a f4 c0 e6 4f b6 ...aA.N7M.....O. 00:20:33.888 000001b0 c5 c5 b7 97 ee 60 21 4a 35 e4 89 a8 43 57 4a 21 .....`!J5...CWJ! 00:20:33.888 000001c0 38 c5 bc 11 43 64 8f 30 ab e1 87 52 db ec 3d 1c 8...Cd.0...R..=. 00:20:33.888 000001d0 7b 90 2c 51 d6 56 21 b3 96 59 ac 15 fe 1b 01 76 {.,Q.V!..Y.....v 00:20:33.888 000001e0 df 06 49 4b 1a 3e 60 1b 45 fa 01 81 12 d2 f9 b8 ..IK.>`.E....... 00:20:33.888 000001f0 c5 44 fa 2f ad 73 b2 5d a2 3b b5 a2 e6 17 29 31 .D./.s.].;....)1 00:20:33.888 00000200 ce 8e 15 7c 0b c4 bc 12 c8 25 e3 ae 0b 6e 88 fb ...|.....%...n.. 00:20:33.888 00000210 2a 03 23 17 6e f7 eb ce 40 08 37 49 68 04 c0 2a *.#.n...@.7Ih..* 00:20:33.888 00000220 ec 28 91 fc 20 16 eb 40 7b bc 92 f0 58 44 b3 86 .(.. ..@{...XD.. 00:20:33.889 00000230 7c e5 80 b3 61 13 d0 b7 8b 62 9e ff 9a e2 03 81 |...a....b...... 00:20:33.889 00000240 c0 7b df 33 64 bb 7c 2b b3 02 4a cf 3e 4d 97 ae .{.3d.|+..J.>M.. 00:20:33.889 00000250 7c 73 40 7a 80 b2 6c f5 55 41 0e 50 ca ea 22 94 |s@z..l.UA.P..". 00:20:33.889 00000260 33 99 76 fa 95 48 90 1f 39 3c 76 53 e5 3f 12 ee 3.v..H..9.. 00:20:33.889 00000340 04 ae 49 2b b5 9f 3f a2 18 ac e3 79 2f 7c 0c 20 ..I+..?....y/|. 00:20:33.889 00000350 c7 b3 ef e5 44 dc 24 34 a9 f9 af c0 52 98 a5 70 ....D.$4....R..p 00:20:33.889 00000360 b2 01 87 9d 5a c6 09 58 d1 19 5f f0 a4 74 a6 01 ....Z..X.._..t.. 00:20:33.889 00000370 22 1f 75 d4 bf 11 a6 c6 7a 11 e0 99 1d 0f 18 54 ".u.....z......T 00:20:33.889 00000380 13 d8 16 46 88 56 94 4d 22 ea d5 93 42 2a 2a 2c ...F.V.M"...B**, 00:20:33.889 00000390 e7 b0 29 77 1b 69 ab a8 a6 41 a0 fa e1 b7 bf ba ..)w.i...A...... 00:20:33.889 000003a0 24 00 a7 ce af 27 6e 04 34 1c 5b 2a 67 49 9d f8 $....'n.4.[*gI.. 00:20:33.889 000003b0 c2 e1 aa 21 6a 20 f8 90 37 e9 67 fa e0 98 ca 66 ...!j ..7.g....f 00:20:33.889 000003c0 98 4b 9b 33 e1 e1 f7 6a 63 4d 1e 9d 44 76 e8 52 .K.3...jcM..Dv.R 00:20:33.889 000003d0 22 46 5b a5 3f fd 23 23 4f bc 10 d8 6b cf 7b 5c "F[.?.##O...k.{\ 00:20:33.889 000003e0 6d 10 09 bf b5 39 f9 68 4b fe 49 ea 56 e6 05 91 m....9.hK.I.V... 00:20:33.889 000003f0 44 36 b0 6b 21 31 ef b9 b3 ef f4 7a a3 51 33 3a D6.k!1.....z.Q3: 00:20:33.889 host pubkey: 00:20:33.889 00000000 53 58 b3 48 49 32 07 a8 6a d4 06 53 d9 d2 4d c1 SX.HI2..j..S..M. 
00:20:33.889 00000010 63 7c 14 4e 8a 76 8e 1c b0 0f e9 ac 77 e4 01 6b c|.N.v......w..k 00:20:33.889 00000020 10 53 fe 8a 7b 58 fe 95 3c 4a a0 1a c0 f4 d5 1b .S..{X......c.g 00:20:33.889 00000170 9c 67 ec bc 06 25 fe 5f 05 fd a6 ee bd 2e d0 24 .g...%._.......$ 00:20:33.889 00000180 c0 ff 37 88 76 44 9a 6e ae fd 3d 16 00 60 67 4e ..7.vD.n..=..`gN 00:20:33.889 00000190 05 0d 13 ef e0 36 83 3a 5f f7 1c cb 88 2e b2 cb .....6.:_....... 00:20:33.889 000001a0 b1 6e aa f7 47 9b 36 91 d9 06 bb 8e 55 78 8f 51 .n..G.6.....Ux.Q 00:20:33.889 000001b0 07 0c 90 d0 a4 15 53 7a 73 17 67 42 0c e2 e1 99 ......Szs.gB.... 00:20:33.889 000001c0 c9 a4 9c 24 9f 27 14 11 a5 ce 9a 2c ab 33 cf 5e ...$.'.....,.3.^ 00:20:33.889 000001d0 a2 71 7e 5d 02 1b 10 2b 0e 84 47 0e 8a 77 5f e2 .q~]...+..G..w_. 00:20:33.889 000001e0 5a 6d e2 fe d8 a5 73 63 76 87 6f ac 2c 3f ea 91 Zm....scv.o.,?.. 00:20:33.889 000001f0 b7 b8 89 ed 59 13 e6 c5 44 b7 a3 79 f6 8a ca 49 ....Y...D..y...I 00:20:33.889 00000200 ac 6e c0 82 7a bb bb 25 d2 ca 92 7d f8 96 56 f6 .n..z..%...}..V. 00:20:33.889 00000210 19 4a 71 32 a2 50 9c 20 07 83 33 c1 09 ed a5 04 .Jq2.P. ..3..... 00:20:33.889 00000220 0e 0f 50 03 6b 9f 13 b7 f0 43 ee ed 4f 7a d6 1f ..P.k....C..Oz.. 00:20:33.889 00000230 f3 ee 05 a1 23 ee 0c 10 6f 65 89 4a 7f a1 82 af ....#...oe.J.... 00:20:33.889 00000240 3f a0 c5 c9 9c 16 4c bd 1d b9 85 1b 2b 88 0d e0 ?.....L.....+... 00:20:33.889 00000250 31 e2 52 b8 1b 63 65 5b 87 7b 97 25 dd 03 7a 96 1.R..ce[.{.%..z. 00:20:33.889 00000260 8b 76 83 32 fb d2 dd 86 29 e9 2c 46 9e cc 57 0b .v.2....).,F..W. 00:20:33.889 00000270 ba e9 18 a4 24 2f 6e b0 9b 00 84 3c 94 c1 0b 4e ....$/n....<...N 00:20:33.889 00000280 fe 58 78 1e f8 d6 f0 d5 fe 21 fc 0c 18 f4 a4 80 .Xx......!...... 00:20:33.889 00000290 e2 0c 8f cf 11 e2 80 ed 14 91 11 fa 69 1c d3 a7 ............i... 00:20:33.889 000002a0 76 1a 23 52 1f df 46 ca cd 8b 8e 83 a9 a1 09 3e v.#R..F........> 00:20:33.889 000002b0 47 a9 f1 f6 40 c0 76 fe ec 3b 34 c1 b2 75 31 85 G...@.v..;4..u1. 00:20:33.889 000002c0 e7 6e 80 8d b9 a6 16 18 f1 d3 dc e8 db 72 f1 31 .n...........r.1 00:20:33.889 000002d0 1b 5d 9a 1a e4 0a c5 cd b8 19 3d 2e 9f 96 51 0e .]........=...Q. 00:20:33.889 000002e0 24 ae d7 d8 91 e1 bb 60 ac e5 20 df a7 4a 60 6e $......`.. ..J`n 00:20:33.889 000002f0 eb 43 46 64 d4 3d b1 8d f1 83 44 eb 3a a7 89 82 .CFd.=....D.:... 00:20:33.889 00000300 9e 7e bf 67 5b 82 db 0f 98 19 4f 6e 9a 1a c4 07 .~.g[.....On.... 00:20:33.889 00000310 0e 89 66 cc ec 90 45 af b8 1b 66 e2 b0 f7 35 26 ..f...E...f...5& 00:20:33.889 00000320 7b be e6 9b 34 3c 9c 54 5a b6 3e 50 fa 87 c8 11 {...4<.TZ.>P.... 00:20:33.889 00000330 20 7c fd c6 93 1c 2a 72 f1 f0 40 a0 5a 87 be d8 |....*r..@.Z... 00:20:33.889 00000340 fb f9 cb 22 cd 2f a4 b5 05 2f 90 df 6d 7f 74 20 ..."./.../..m.t 00:20:33.889 00000350 bd 6e 98 e6 2a 6d ed 72 ec 63 d8 5d b9 7b 04 1e .n..*m.r.c.].{.. 00:20:33.889 00000360 f6 94 58 f1 17 03 e6 bd 68 ac c3 9b 1c 31 ec e5 ..X.....h....1.. 00:20:33.889 00000370 e5 85 52 6d 12 bc bf b5 2b 4e 8b c3 9d 2c 7c c9 ..Rm....+N...,|. 00:20:33.889 00000380 96 ee ed 78 26 3d 02 85 68 9b 75 51 50 a8 3e 15 ...x&=..h.uQP.>. 00:20:33.889 00000390 db 43 21 54 f0 c1 01 0c d2 c5 0a 16 36 01 9c b2 .C!T........6... 00:20:33.889 000003a0 59 97 c1 71 7a bd 20 53 2b f5 83 9b f2 90 82 3b Y..qz. S+......; 00:20:33.889 000003b0 54 98 41 08 a0 80 48 86 68 6a 3a 0e ce d3 76 c6 T.A...H.hj:...v. 00:20:33.889 000003c0 ba 2a 53 a9 f5 89 9d 9a 27 d3 30 ab 51 84 c7 e4 .*S.....'.0.Q... 
00:20:33.889 000003d0 3f c8 8e f0 67 f0 22 42 c1 5a 78 62 28 53 f4 bb ?...g."B.Zxb(S.. 00:20:33.889 000003e0 a2 3b c1 ec 09 c4 58 5b f7 0b 33 25 7c 5f c6 82 .;....X[..3%|_.. 00:20:33.889 000003f0 6a ef c6 b9 23 7a b1 f3 2d de 1c ad 14 82 9b c8 j...#z..-....... 00:20:33.889 dh secret: 00:20:33.889 00000000 82 c8 22 28 bf a6 fc 5e 88 00 8d b5 fd 11 31 6f .."(...^......1o 00:20:33.889 00000010 23 cb 4a 53 64 3c 6c 42 df 07 e3 e3 ba 75 d3 97 #.JSd 00:20:33.889 00000310 0a d3 03 54 18 b1 a9 fe 33 06 3e de cf 24 0f e8 ...T....3.>..$.. 00:20:33.889 00000320 51 14 0e 25 f7 08 92 ec d6 65 3e ae a1 1d f8 f1 Q..%.....e>..... 00:20:33.889 00000330 ad 74 7b 7c 61 91 a2 78 21 96 d1 17 fa 16 b9 df .t{|a..x!....... 00:20:33.889 00000340 31 fe 4c 2e c4 f0 13 20 d0 a0 a1 1d 69 85 f1 09 1.L.... ....i... 00:20:33.889 00000350 82 06 af bd b4 ec 47 15 d6 a2 a0 b1 44 b2 57 9e ......G.....D.W. 00:20:33.889 00000360 da c6 df 68 c7 57 25 b5 fe 22 17 2d 94 ec da ff ...h.W%..".-.... 00:20:33.889 00000370 ed 9d 2e cc 72 73 0b dc 4d fe da 1c cd c1 f2 49 ....rs..M......I 00:20:33.889 00000380 68 fb 47 83 f5 5b 89 2a 3c b7 40 51 e5 7a 6e 15 h.G..[.*<.@Q.zn. 00:20:33.889 00000390 c4 5f 63 9a 7d c4 9a 16 c6 e9 f9 5d db 82 6a c5 ._c.}......]..j. 00:20:33.889 000003a0 3a 3b 23 a9 65 ac ee 52 4a 7c 7a dc e3 49 be 71 :;#.e..RJ|z..I.q 00:20:33.890 000003b0 fc d1 3b 2e 6c 68 1c 0f 3c 1c 12 6e 40 c2 a8 a6 ..;.lh..<..n@... 00:20:33.890 000003c0 a8 4b 6f 36 0f 76 5a c4 25 aa 8f cb f4 50 2c 52 .Ko6.vZ.%....P,R 00:20:33.890 000003d0 69 14 dd d0 97 db 07 49 14 54 14 89 ff 37 ce 27 i......I.T...7.' 00:20:33.890 000003e0 44 d1 be cc 08 0e 9b 99 17 29 49 9a e4 92 48 8d D........)I...H. 00:20:33.890 000003f0 34 27 ee b0 bf 2b 5d 97 3d e6 e7 74 d5 ea 52 43 4'...+].=..t..RC 00:20:33.890 [2024-09-27 15:25:12.633748] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=1, dhgroup=5, seq=3428451739, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.890 [2024-09-27 15:25:12.691911] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.890 [2024-09-27 15:25:12.691958] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.890 [2024-09-27 15:25:12.691976] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.890 [2024-09-27 15:25:12.692001] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.890 [2024-09-27 15:25:12.692012] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.890 [2024-09-27 15:25:12.798285] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.890 [2024-09-27 15:25:12.798303] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.890 [2024-09-27 15:25:12.798314] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.890 [2024-09-27 15:25:12.798324] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: 
[nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.890 [2024-09-27 15:25:12.798395] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.890 ctrlr pubkey: 00:20:33.890 00000000 1d b7 d9 5c 24 d3 b9 8d 18 8e ea 89 fe c0 b5 f7 ...\$........... 00:20:33.890 00000010 a8 ef 85 c4 f7 23 11 13 6a 0b b3 0e 29 e8 f4 fd .....#..j...)... 00:20:33.890 00000020 c3 d0 7b 1b a3 f0 bc 4d 76 be 28 e8 c6 ac b3 1a ..{....Mv.(..... 00:20:33.890 00000030 15 5e 3e dd ec 96 60 c4 e2 17 ca f1 e3 6c ca 3b .^>...`......l.; 00:20:33.890 00000040 b0 50 f0 29 a9 cb 1e a7 1d 16 c5 b3 74 63 33 41 .P.)........tc3A 00:20:33.890 00000050 2c 54 b4 82 5c e0 23 78 42 18 8f 36 d2 76 b6 0f ,T..\.#xB..6.v.. 00:20:33.890 00000060 22 a4 9c b7 e1 02 52 e4 df 40 f7 d6 b9 2b 61 07 ".....R..@...+a. 00:20:33.890 00000070 f5 26 45 a2 e9 26 76 2c 82 dc 8c e2 ac 73 1d 86 .&E..&v,.....s.. 00:20:33.890 00000080 e1 75 19 cd ed a5 be 4a 1c ca 4f 95 f7 75 02 cb .u.....J..O..u.. 00:20:33.890 00000090 b3 7c 8e 7b 99 11 4e 86 8d b0 83 e0 ad 37 ec f4 .|.{..N......7.. 00:20:33.890 000000a0 b1 92 31 14 9b a7 32 3e ca e5 cd 37 b5 f8 8b d7 ..1...2>...7.... 00:20:33.890 000000b0 58 99 c3 80 9a ea c4 db a8 0f 29 05 47 44 16 2b X.........).GD.+ 00:20:33.890 000000c0 c5 90 04 24 99 cd 2b fd 00 7e 76 0f e4 cb eb 10 ...$..+..~v..... 00:20:33.890 000000d0 0c a1 85 97 2b ef b2 dc d6 03 d7 57 79 3f 60 45 ....+......Wy?`E 00:20:33.890 000000e0 7e af 7c cd 4f b4 f1 9f cd d2 5c e2 6e 5c ed 60 ~.|.O.....\.n\.` 00:20:33.890 000000f0 b1 34 7b 3a 89 c6 1a eb 13 b3 ec 1e 8d ad d5 10 .4{:............ 00:20:33.890 00000100 47 03 1a b2 5d 5d 13 c7 6a f9 0c 4b 86 72 1e 04 G...]]..j..K.r.. 00:20:33.890 00000110 1a dd 63 bc 9e 5c d4 69 7b 2f ab 9e 87 05 4c f5 ..c..\.i{/....L. 00:20:33.890 00000120 e9 e8 f4 84 08 a7 8d e5 74 00 e8 92 e8 63 17 37 ........t....c.7 00:20:33.890 00000130 2a 83 60 48 af 06 9d 06 1c 78 2c 59 60 93 18 91 *.`H.....x,Y`... 00:20:33.890 00000140 4f 14 69 e7 fa 06 6f 01 5e 3c 8a da 55 84 4e 52 O.i...o.^<..U.NR 00:20:33.890 00000150 17 78 4d ef 3b bc e3 d2 dc 24 52 ea 41 18 c1 a8 .xM.;....$R.A... 00:20:33.890 00000160 c6 a3 bb 2a 27 fd 11 e5 61 57 8d 22 f8 9d e0 85 ...*'...aW.".... 00:20:33.890 00000170 63 f1 8e 2a 9b 10 7b 54 56 db 6d 28 60 08 26 d1 c..*..{TV.m(`.&. 00:20:33.890 00000180 72 fd e4 da 16 f6 f4 4c e2 4a 54 c5 2b 41 ca eb r......L.JT.+A.. 00:20:33.890 00000190 45 5e a8 25 ae 5b 3f 03 62 33 6b c5 bf 53 35 e9 E^.%.[?.b3k..S5. 00:20:33.890 000001a0 e9 f7 9a 61 41 dc 4e 37 4d ea 1a f4 c0 e6 4f b6 ...aA.N7M.....O. 00:20:33.890 000001b0 c5 c5 b7 97 ee 60 21 4a 35 e4 89 a8 43 57 4a 21 .....`!J5...CWJ! 00:20:33.890 000001c0 38 c5 bc 11 43 64 8f 30 ab e1 87 52 db ec 3d 1c 8...Cd.0...R..=. 00:20:33.890 000001d0 7b 90 2c 51 d6 56 21 b3 96 59 ac 15 fe 1b 01 76 {.,Q.V!..Y.....v 00:20:33.890 000001e0 df 06 49 4b 1a 3e 60 1b 45 fa 01 81 12 d2 f9 b8 ..IK.>`.E....... 00:20:33.890 000001f0 c5 44 fa 2f ad 73 b2 5d a2 3b b5 a2 e6 17 29 31 .D./.s.].;....)1 00:20:33.890 00000200 ce 8e 15 7c 0b c4 bc 12 c8 25 e3 ae 0b 6e 88 fb ...|.....%...n.. 00:20:33.890 00000210 2a 03 23 17 6e f7 eb ce 40 08 37 49 68 04 c0 2a *.#.n...@.7Ih..* 00:20:33.890 00000220 ec 28 91 fc 20 16 eb 40 7b bc 92 f0 58 44 b3 86 .(.. ..@{...XD.. 00:20:33.890 00000230 7c e5 80 b3 61 13 d0 b7 8b 62 9e ff 9a e2 03 81 |...a....b...... 00:20:33.890 00000240 c0 7b df 33 64 bb 7c 2b b3 02 4a cf 3e 4d 97 ae .{.3d.|+..J.>M.. 
00:20:33.890 00000250 7c 73 40 7a 80 b2 6c f5 55 41 0e 50 ca ea 22 94 |s@z..l.UA.P..". 00:20:33.890 00000260 33 99 76 fa 95 48 90 1f 39 3c 76 53 e5 3f 12 ee 3.v..H..9.. 00:20:33.890 00000340 04 ae 49 2b b5 9f 3f a2 18 ac e3 79 2f 7c 0c 20 ..I+..?....y/|. 00:20:33.890 00000350 c7 b3 ef e5 44 dc 24 34 a9 f9 af c0 52 98 a5 70 ....D.$4....R..p 00:20:33.890 00000360 b2 01 87 9d 5a c6 09 58 d1 19 5f f0 a4 74 a6 01 ....Z..X.._..t.. 00:20:33.890 00000370 22 1f 75 d4 bf 11 a6 c6 7a 11 e0 99 1d 0f 18 54 ".u.....z......T 00:20:33.890 00000380 13 d8 16 46 88 56 94 4d 22 ea d5 93 42 2a 2a 2c ...F.V.M"...B**, 00:20:33.890 00000390 e7 b0 29 77 1b 69 ab a8 a6 41 a0 fa e1 b7 bf ba ..)w.i...A...... 00:20:33.890 000003a0 24 00 a7 ce af 27 6e 04 34 1c 5b 2a 67 49 9d f8 $....'n.4.[*gI.. 00:20:33.890 000003b0 c2 e1 aa 21 6a 20 f8 90 37 e9 67 fa e0 98 ca 66 ...!j ..7.g....f 00:20:33.890 000003c0 98 4b 9b 33 e1 e1 f7 6a 63 4d 1e 9d 44 76 e8 52 .K.3...jcM..Dv.R 00:20:33.890 000003d0 22 46 5b a5 3f fd 23 23 4f bc 10 d8 6b cf 7b 5c "F[.?.##O...k.{\ 00:20:33.890 000003e0 6d 10 09 bf b5 39 f9 68 4b fe 49 ea 56 e6 05 91 m....9.hK.I.V... 00:20:33.890 000003f0 44 36 b0 6b 21 31 ef b9 b3 ef f4 7a a3 51 33 3a D6.k!1.....z.Q3: 00:20:33.890 host pubkey: 00:20:33.890 00000000 6c 5f a6 8e 93 42 28 90 22 2d 60 85 1e 25 66 71 l_...B(."-`..%fq 00:20:33.890 00000010 cf a4 3b 46 3e bf 4b 7a 94 4e 60 e4 c2 dd 7b c2 ..;F>.Kz.N`...{. 00:20:33.890 00000020 ab c5 d9 36 b1 99 0c 22 13 6b f3 84 b8 f1 a8 53 ...6...".k.....S 00:20:33.890 00000030 1a 5f 05 32 2c 8c 77 ca c0 e6 da b1 08 43 a6 bb ._.2,.w......C.. 00:20:33.890 00000040 d1 4c 0d cf 7a 96 cb 83 b1 04 35 f4 e1 6a 1c fc .L..z.....5..j.. 00:20:33.890 00000050 66 f7 37 8a 1d f3 b2 33 cf d4 bf e9 b4 1e fa 40 f.7....3.......@ 00:20:33.890 00000060 7b 25 ec db c9 47 6c 9f f0 fe 37 56 fe ed 46 31 {%...Gl...7V..F1 00:20:33.890 00000070 64 7e 25 ab 87 6e 1d cb 5d 91 8b df 16 4d 3a 1d d~%..n..]....M:. 00:20:33.890 00000080 a7 4f 86 30 19 01 5e 0b 8d 1c 09 c6 0d f7 e8 3b .O.0..^........; 00:20:33.890 00000090 7b 04 4e 51 b2 0b 05 c7 cb 30 1b c9 95 7f 30 a5 {.NQ.....0....0. 00:20:33.890 000000a0 90 f2 20 62 12 28 80 b0 21 10 dc 39 a4 70 37 99 .. b.(..!..9.p7. 00:20:33.890 000000b0 a3 6c 91 d1 2f 81 82 d8 46 25 54 54 84 d2 0c 37 .l../...F%TT...7 00:20:33.890 000000c0 8b b7 d1 ad 8b 72 88 cd 55 fb e2 67 de 64 2b 3f .....r..U..g.d+? 00:20:33.890 000000d0 69 35 7a 43 36 13 12 28 c4 e7 02 2e e7 81 3d cf i5zC6..(......=. 00:20:33.890 000000e0 14 68 33 87 e8 98 9b 4c 4d 3a 7b ad a1 d7 4e 94 .h3....LM:{...N. 00:20:33.890 000000f0 7b c5 63 3c e6 d5 f2 4d 8e d5 89 02 be c9 08 4e {.c<...M.......N 00:20:33.890 00000100 4b 49 f1 84 63 9c 25 9f 16 2e 15 39 23 18 ca 4d KI..c.%....9#..M 00:20:33.890 00000110 d0 17 53 d3 b8 ec 11 f1 7f 68 11 ab 1f fe 77 93 ..S......h....w. 00:20:33.890 00000120 8e 78 05 10 8c 3e d2 b8 cf 29 e4 44 e0 64 fb fb .x...>...).D.d.. 00:20:33.890 00000130 7d 4b dd d8 7a 8b f3 3c ba 59 64 ca 9e fe ae 63 }K..z..<.Yd....c 00:20:33.890 00000140 5e d2 67 62 9b 21 4b cb 2a 18 86 48 ce bd 52 c5 ^.gb.!K.*..H..R. 00:20:33.890 00000150 e8 78 3b 03 48 13 b8 20 11 77 55 35 28 83 6d 9d .x;.H.. .wU5(.m. 00:20:33.890 00000160 21 54 82 5c 21 c0 9d f2 ec 35 b0 7d 89 1a 73 8e !T.\!....5.}..s. 00:20:33.890 00000170 27 8a 3a 02 5a 16 c5 68 8a c1 b7 c4 f2 0a 5d fc '.:.Z..h......]. 00:20:33.890 00000180 2c 47 07 4c d2 ea fd 5b ae 05 5c c2 82 82 ce 1b ,G.L...[..\..... 00:20:33.890 00000190 41 bb b8 c0 3d e9 94 b7 d7 a5 b9 2e 22 0c 1d a6 A...=......."... 
00:20:33.890 000001a0 4c ae b1 c8 2f d9 61 4f 77 51 48 83 d0 37 cb f4 L.../.aOwQH..7.. 00:20:33.890 000001b0 3d 45 ec 6b 7f 5f ba 17 03 ac be 9d 22 e2 14 39 =E.k._......"..9 00:20:33.890 000001c0 7b 76 17 ff 46 d9 a6 9d 1b 47 c0 3b b1 a3 2e 3a {v..F....G.;...: 00:20:33.890 000001d0 e6 24 84 30 d5 90 e2 e9 5e d7 ef 56 f1 57 6a 9b .$.0....^..V.Wj. 00:20:33.890 000001e0 5e 0f 53 40 0c 9a eb 08 e1 22 32 41 0f 3d 5e 4d ^.S@....."2A.=^M 00:20:33.890 000001f0 b3 20 db f7 fe 81 d8 42 da 6d 31 ef f8 1e e1 f0 . .....B.m1..... 00:20:33.890 00000200 25 dc fc b7 a9 0e 37 5a 23 75 cb 90 6d 02 95 51 %.....7Z#u..m..Q 00:20:33.890 00000210 5b f2 4b 7f 65 dc ae 78 f6 b8 cf 12 45 01 7a 6d [.K.e..x....E.zm 00:20:33.890 00000220 c4 4f ce 2d 47 de 66 ad 87 56 69 eb 05 27 8a 05 .O.-G.f..Vi..'.. 00:20:33.890 00000230 3f 93 53 60 f7 f3 00 98 f5 d2 a7 8d 70 b2 11 06 ?.S`........p... 00:20:33.890 00000240 db c0 61 0f 9d 46 77 4e 27 78 df 6b 7b b9 e3 6a ..a..FwN'x.k{..j 00:20:33.890 00000250 37 8c aa ce d0 bf ca 71 16 4e 52 9a 3d 73 a6 24 7......q.NR.=s.$ 00:20:33.890 00000260 6d 25 7f 79 8c 18 87 47 f3 da d9 9a ba e2 10 00 m%.y...G........ 00:20:33.890 00000270 31 2f b4 24 49 a0 47 0b 67 c9 20 0e 06 2f a3 14 1/.$I.G.g. ../.. 00:20:33.890 00000280 e1 bd 4b 39 5d 04 2d 86 19 ee b9 13 91 6b b1 2f ..K9].-......k./ 00:20:33.890 00000290 94 36 b6 18 5b 39 a3 28 b1 d9 07 f7 db 98 6d 9e .6..[9.(......m. 00:20:33.890 000002a0 4e 96 cb 16 8d 3d 86 2e ae 5f 8b 46 7e 51 b4 06 N....=..._.F~Q.. 00:20:33.890 000002b0 49 62 ed f3 94 ec 9a 2f 00 18 5a a2 49 25 2d b9 Ib...../..Z.I%-. 00:20:33.890 000002c0 4b 49 86 c1 cd fb 42 ea d4 5e 2a ed 6c 4f 0a 38 KI....B..^*.lO.8 00:20:33.890 000002d0 df c3 5c 3d a4 f0 85 4a 31 d3 c1 f8 c9 7e 32 e6 ..\=...J1....~2. 00:20:33.890 000002e0 23 4e da 1b 29 fe 1e d4 3f d2 2e c3 80 ba 09 ac #N..)...?....... 00:20:33.890 000002f0 72 85 7c 51 95 40 1b 98 96 a6 03 5b 01 ce 64 d7 r.|Q.@.....[..d. 00:20:33.890 00000300 99 0f 9f 3d f5 bd a2 c8 83 a4 12 72 f4 9d 59 3c ...=.......r..Y< 00:20:33.890 00000310 cc b2 0f 4c 54 23 1d dc 6f 36 05 97 78 f5 37 e6 ...LT#..o6..x.7. 00:20:33.890 00000320 a1 60 0e 2d 67 bc 8a 60 1e 66 64 a5 da 71 ce 70 .`.-g..`.fd..q.p 00:20:33.890 00000330 93 53 fe 43 d3 41 93 81 61 df 6f 3f 88 fb 55 da .S.C.A..a.o?..U. 00:20:33.890 00000340 e7 b2 c0 0d 09 3b 76 0c b6 85 7d 80 ac 9e 58 e5 .....;v...}...X. 00:20:33.890 00000350 ae 1c a5 0e 7e 7c d2 00 59 d7 88 e1 b9 24 c0 75 ....~|..Y....$.u 00:20:33.890 00000360 4a 33 54 74 fa 4a 25 31 9b 24 e5 f8 96 6a 25 b1 J3Tt.J%1.$...j%. 00:20:33.890 00000370 12 9e e8 40 89 95 70 07 00 f9 c7 72 e5 1b 89 75 ...@..p....r...u 00:20:33.890 00000380 0b f5 3f 1e f2 e3 c6 60 a1 f6 c8 17 14 52 3b 1a ..?....`.....R;. 00:20:33.890 00000390 b3 43 5d 3c dd 77 08 f9 b7 16 d6 6c a4 82 02 99 .C]<.w.....l.... 00:20:33.890 000003a0 96 c4 2d a4 65 8b 9b 7d 3d a1 5d 0e 59 9b 40 2f ..-.e..}=.].Y.@/ 00:20:33.890 000003b0 69 f6 5c 97 24 7f cf 98 dc ea 9b ac 96 4d 60 3e i.\.$........M`> 00:20:33.890 000003c0 d0 71 84 68 ff d8 64 fc 34 e3 14 1d ee 3c 37 5b .q.h..d.4....<7[ 00:20:33.890 000003d0 ba 97 60 f1 fd e5 4e f0 cd 8e c0 44 cb d3 fe 29 ..`...N....D...) 00:20:33.890 000003e0 d9 7f 39 82 9d ef d8 01 a4 de a3 9a 8b 23 ce 33 ..9..........#.3 00:20:33.891 000003f0 58 45 23 51 a3 e9 fc cf b7 26 91 77 6c 09 fe 00 XE#Q.....&.wl... 00:20:33.891 dh secret: 00:20:33.891 00000000 a5 8a 04 e2 55 14 a5 07 88 f3 df 04 38 9f f2 5d ....U.......8..] 
00:20:33.891 00000010 4d 73 82 74 61 bf cb 6c 26 e7 9d 40 4b 1e 2f 6a Ms.ta..l&..@K./j 00:20:33.891 00000020 5f fe e3 02 03 a0 03 2b 73 65 9f cd 3a ad 37 85 _......+se..:.7. 00:20:33.891 00000030 22 3a 12 16 ef f0 b4 14 1e 3a f4 dc 2b 4f f2 54 ":.......:..+O.T 00:20:33.891 00000040 e8 48 5b f3 4e de 3b ad 25 a6 7f 3b 81 99 4f 56 .H[.N.;.%..;..OV 00:20:33.891 00000050 3f 6d b0 ea 6f 7e 02 f9 56 6e 63 0a 44 79 39 9a ?m..o~..Vnc.Dy9. 00:20:33.891 00000060 56 83 d0 ae b8 4b a9 1c 3b 4d 95 db fd 47 8e 00 V....K..;M...G.. 00:20:33.891 00000070 c2 c6 37 f5 3f 7a 31 c0 50 c8 4a 1b 0e 33 6f cb ..7.?z1.P.J..3o. 00:20:33.891 00000080 47 c8 6c 7b 03 b7 78 28 c1 56 e1 6d ae 39 c1 b2 G.l{..x(.V.m.9.. 00:20:33.891 00000090 b9 c2 d3 1b 80 f8 0e 98 4f 12 59 ff b7 00 ea b2 ........O.Y..... 00:20:33.891 000000a0 c7 f2 0c c9 ed e2 ba 46 e8 a9 7a b8 93 33 de 74 .......F..z..3.t 00:20:33.891 000000b0 f8 f8 85 57 2b c3 af fa e3 96 e4 62 80 4a d2 05 ...W+......b.J.. 00:20:33.891 000000c0 3c 40 f8 19 74 5b 4a 36 be 7f 2e 15 45 05 df 44 <@..t[J6....E..D 00:20:33.891 000000d0 b0 6d 1a c8 78 5d d7 6b 85 07 f1 6a a6 f7 a1 ec .m..x].k...j.... 00:20:33.891 000000e0 a2 85 06 3c ba b5 65 8f c5 dc 55 8b 4f c9 49 81 ...<..e...U.O.I. 00:20:33.891 000000f0 7b 99 44 79 17 15 ba c3 5f 9c fa 7d 7d 11 87 f7 {.Dy...._..}}... 00:20:33.891 00000100 a4 2b 82 36 2f 22 f6 44 f8 13 ae 7e c1 ec 8b 61 .+.6/".D...~...a 00:20:33.891 00000110 40 f8 79 d6 a9 77 80 6b 2c 55 70 f3 b1 0d b9 fd @.y..w.k,Up..... 00:20:33.891 00000120 61 b7 67 36 d7 e1 3a 1d 83 05 26 25 15 83 13 7d a.g6..:...&%...} 00:20:33.891 00000130 60 a6 f3 11 c3 4f bb 17 22 a9 d3 49 8c fb f0 48 `....O.."..I...H 00:20:33.891 00000140 fd 45 8b 5e e1 74 ea 19 b5 22 d5 38 7d 48 8d ef .E.^.t...".8}H.. 00:20:33.891 00000150 0e 91 d7 f7 79 5e ab 09 06 b7 dc d9 ec 8f e8 7d ....y^.........} 00:20:33.891 00000160 d2 a3 ea 41 9d 7b bb af 24 a2 de 0a 41 1d 0d ba ...A.{..$...A... 00:20:33.891 00000170 0f ee fd d8 b0 d8 4d 43 14 11 dc 30 cf 18 d1 d5 ......MC...0.... 00:20:33.891 00000180 32 66 6f 94 9d ed c4 19 30 e5 3f 2f ef a1 51 e5 2fo.....0.?/..Q. 00:20:33.891 00000190 53 a0 c3 46 a9 cf 1d e6 77 09 ef 08 e9 37 2f 20 S..F....w....7/ 00:20:33.891 000001a0 6e e8 a1 a1 f6 67 89 82 c4 61 91 4c 43 03 c2 fd n....g...a.LC... 00:20:33.891 000001b0 70 8f cc 73 56 24 11 8b 83 80 e9 56 b5 41 d1 2c p..sV$.....V.A., 00:20:33.891 000001c0 30 47 fd ea 52 3b f6 2e 67 6a 5d f5 ce 8e 61 68 0G..R;..gj]...ah 00:20:33.891 000001d0 4e 07 e1 d2 e6 69 0f 23 61 66 76 23 4f 48 05 98 N....i.#afv#OH.. 00:20:33.891 000001e0 83 15 fa 7c b6 7e 97 7b ff 66 c0 66 2d f1 16 9f ...|.~.{.f.f-... 00:20:33.891 000001f0 7c ed 17 bb 4b 00 54 ba f4 64 36 d2 69 f3 a0 a2 |...K.T..d6.i... 00:20:33.891 00000200 59 56 6a 22 e6 84 ec 3e c3 9a 9a f4 27 f5 03 bb YVj"...>....'... 00:20:33.891 00000210 e7 13 3d ac 4d 18 a3 f9 87 b2 2d 7e b9 39 32 2b ..=.M.....-~.92+ 00:20:33.891 00000220 0b 57 6e e2 43 f9 97 58 57 eb e3 8f 61 9f c8 dd .Wn.C..XW...a... 00:20:33.891 00000230 71 a3 99 11 75 13 67 22 77 a0 dc ef 0f 91 9e 7a q...u.g"w......z 00:20:33.891 00000240 4b 3b ae 81 51 13 db 0f 2f 59 4c 8c 1a c2 cf 9c K;..Q.../YL..... 00:20:33.891 00000250 e9 6a 22 b4 4c 08 3d e4 5b 8f 56 4f 41 ba 1a 8b .j".L.=.[.VOA... 00:20:33.891 00000260 94 f1 98 0d e2 e8 cd 8f 4c be 78 b3 93 a7 ae b0 ........L.x..... 00:20:33.891 00000270 8a 21 cc f4 a4 49 d9 a8 57 3d c3 69 b1 9a 82 7e .!...I..W=.i...~ 00:20:33.891 00000280 82 64 2d b2 01 9c 84 07 5b 34 21 d5 0b 95 7a af .d-.....[4!...z. 
00:20:33.891 00000290 5f ab 26 a7 3b 9c 4b ad 5f 65 98 95 3d 28 5d 89 _.&.;.K._e..=(]. 00:20:33.891 000002a0 87 38 c3 52 bb dd 2a 2d e0 ca 4f 9c f4 bf 75 27 .8.R..*-..O...u' 00:20:33.891 000002b0 62 a2 b0 6b a2 bf 72 6e a6 6e 44 b7 f3 dc ca dd b..k..rn.nD..... 00:20:33.891 000002c0 36 3f e2 ca 45 9f 09 83 08 4e 6c 27 a7 15 41 c1 6?..E....Nl'..A. 00:20:33.891 000002d0 38 60 1b c0 e2 a9 17 bc fd 14 42 28 9c 9c 97 af 8`........B(.... 00:20:33.891 000002e0 8f c2 17 90 e7 75 10 3d 7a 89 d9 ee 08 76 6a 7d .....u.=z....vj} 00:20:33.891 000002f0 40 2e e7 c1 12 86 e0 50 21 f4 37 da 4f f7 d2 b1 @......P!.7.O... 00:20:33.891 00000300 e6 d9 10 98 be 96 eb ea 68 c7 09 bf 72 15 f9 23 ........h...r..# 00:20:33.891 00000310 3f 7b ca 34 8d 2a 76 81 8e a7 0e e7 2b bf 00 4f ?{.4.*v.....+..O 00:20:33.891 00000320 3c 33 77 fe 08 4b f9 21 86 2f 88 8d 25 66 04 7f <3w..K.!./..%f.. 00:20:33.891 00000330 f1 f2 cf 9b 12 b2 63 f6 9c d5 79 b7 61 d6 a4 2f ......c...y.a../ 00:20:33.891 00000340 a3 e8 b7 23 75 b7 c0 2d 8a 81 7d ae 19 23 b3 62 ...#u..-..}..#.b 00:20:33.891 00000350 da 21 a9 98 df 52 e4 2c bd 0d 01 6c 52 0b c9 3c .!...R.,...lR..< 00:20:33.891 00000360 35 5d e8 80 d0 61 64 d3 2d 13 bf e3 96 33 e7 76 5]...ad.-....3.v 00:20:33.891 00000370 b3 0f 83 9c 50 23 4d f1 1c 29 b7 cd 2e 74 17 ae ....P#M..)...t.. 00:20:33.891 00000380 aa d4 34 bc b0 56 c7 2f 38 5f 8e 20 5b 45 25 7e ..4..V./8_. [E%~ 00:20:33.891 00000390 19 ad 52 b4 1d bc ef 39 99 83 f2 37 ce 94 e2 7a ..R....9...7...z 00:20:33.891 000003a0 ab b9 bb 3a 72 01 42 06 5e b4 16 b9 07 5f b3 2e ...:r.B.^...._.. 00:20:33.891 000003b0 93 ad 25 e9 ee 57 44 11 4d f3 79 a1 31 25 b2 d9 ..%..WD.M.y.1%.. 00:20:33.891 000003c0 1c 25 6e ff 29 b7 30 0b 0e 9a b6 d5 1f 9c df c2 .%n.).0......... 00:20:33.891 000003d0 88 6d 4d 3f 86 93 b9 b0 a6 e2 23 b8 2f 4b 76 bb .mM?......#./Kv. 00:20:33.891 000003e0 07 37 5c b6 7b 76 61 51 f3 da 5c ae e5 20 1c 33 .7\.{vaQ..\.. 
.3 00:20:33.891 000003f0 68 03 9f 42 8e 82 11 bd 74 d4 81 75 d2 5e e8 42 h..B....t..u.^.B 00:20:33.891 [2024-09-27 15:25:12.910624] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=1, dhgroup=5, seq=3428451740, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.891 [2024-09-27 15:25:12.910734] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.891 [2024-09-27 15:25:12.989529] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.891 [2024-09-27 15:25:12.989573] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.891 [2024-09-27 15:25:12.989584] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.891 [2024-09-27 15:25:12.989611] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.891 [2024-09-27 15:25:13.180293] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.891 [2024-09-27 15:25:13.180313] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.891 [2024-09-27 15:25:13.180320] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.891 [2024-09-27 15:25:13.180336] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.891 [2024-09-27 15:25:13.184320] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.891 ctrlr pubkey: 00:20:33.891 00000000 38 b1 a5 3d d9 95 d9 7e c3 4a f8 29 87 b0 c9 e4 8..=...~.J.).... 00:20:33.891 00000010 86 c5 2a 90 b7 b0 1d 9e aa 07 69 cd d1 6f 6a c2 ..*.......i..oj. 00:20:33.891 00000020 33 4c f0 61 b6 87 dd 16 ff e2 aa a2 0f ea cc b1 3L.a............ 00:20:33.891 00000030 82 59 90 eb 75 87 62 6d e5 a9 10 b7 9e 65 bc 8f .Y..u.bm.....e.. 00:20:33.891 00000040 6e 98 59 b7 76 8a 50 84 a5 84 c2 d1 c3 d1 8a c0 n.Y.v.P......... 00:20:33.891 00000050 a6 aa a0 6e c4 5f fe 67 66 d3 12 23 81 49 e1 3b ...n._.gf..#.I.; 00:20:33.891 00000060 e1 da ea a0 0f 3a 5b 7c 2a 5e 66 b5 57 26 c3 1d .....:[|*^f.W&.. 00:20:33.891 00000070 d7 46 c8 69 8a 21 f2 0d 29 be 98 74 2d f0 ad 3b .F.i.!..)..t-..; 00:20:33.891 00000080 87 b3 d6 af a2 9f b8 fb 57 7b 4c 36 76 99 62 af ........W{L6v.b. 00:20:33.891 00000090 6f b1 71 f2 41 c0 ec fe 0d 1c 18 9e 7b f4 a6 c4 o.q.A.......{... 00:20:33.891 000000a0 73 20 bc 9b 75 8b a6 17 7b 8d 37 34 65 41 66 b3 s ..u...{.74eAf. 00:20:33.891 000000b0 8e 91 5d f6 20 04 3a 38 34 b0 a8 21 73 a4 9e 61 ..]. .:84..!s..a 00:20:33.891 000000c0 15 0d 56 7f 26 a5 7a da 31 16 26 17 ab cc b6 4c ..V.&.z.1.&....L 00:20:33.891 000000d0 9e 22 84 a3 57 f3 d9 2b 0f 89 2a 66 2f 44 1d e6 ."..W..+..*f/D.. 00:20:33.891 000000e0 d2 7f 62 06 94 38 90 70 8f 6e 8b da 7e f9 4e 81 ..b..8.p.n..~.N. 00:20:33.891 000000f0 f5 5c ce 36 b9 a1 43 a9 52 79 45 51 23 15 f7 18 .\.6..C.RyEQ#... 
00:20:33.891 00000100 e8 4c 46 69 01 99 01 49 1b 93 9f 3e c7 65 ee 5a .LFi...I...>.e.Z 00:20:33.891 00000110 8d 80 28 74 98 39 16 01 75 80 c4 d0 c4 b7 8f ee ..(t.9..u....... 00:20:33.891 00000120 c1 e2 bf 8a 69 c3 d4 f3 7d 5a dd 60 1c a3 5a a0 ....i...}Z.`..Z. 00:20:33.891 00000130 f7 0c 34 34 0e 34 9e 5b 4c c4 cb 38 c3 fd e7 d4 ..44.4.[L..8.... 00:20:33.891 00000140 99 f8 03 47 07 f6 1f b8 fb 1c 8a 7e d3 57 6a 3b ...G.......~.Wj; 00:20:33.891 00000150 f9 ef 91 39 bb b7 14 ef 75 3d 12 07 47 37 eb f3 ...9....u=..G7.. 00:20:33.891 00000160 06 bf 4b 3a 22 be d5 1a 2b e4 68 a9 0a c9 92 22 ..K:"...+.h...." 00:20:33.891 00000170 f7 1e c2 c2 60 77 37 70 e9 ee 22 3e ae ef c4 98 ....`w7p..">.... 00:20:33.891 00000180 44 24 94 e8 74 1f 6e b1 bd 1d cf 2b 39 d3 63 f5 D$..t.n....+9.c. 00:20:33.891 00000190 a7 4b 6f fe 5c 10 ac 36 22 86 27 e9 22 b6 29 68 .Ko.\..6".'.".)h 00:20:33.891 000001a0 23 3a c2 93 a6 dd 43 d5 3a 70 f8 d0 04 fd 63 e7 #:....C.:p....c. 00:20:33.891 000001b0 19 32 a2 f2 b2 15 e6 00 89 65 bb 89 76 0c 19 76 .2.......e..v..v 00:20:33.891 000001c0 e0 a9 3a 56 55 03 9f 31 7d 6f d6 67 1e 5c 04 b3 ..:VU..1}o.g.\.. 00:20:33.891 000001d0 ba 61 be f2 0f ed 1b 18 c4 c9 eb aa 9e 26 02 4e .a...........&.N 00:20:33.891 000001e0 58 b2 d5 27 cb 81 1b e3 07 8c 50 40 da be d4 73 X..'......P@...s 00:20:33.891 000001f0 cb 0c 36 48 5e cc 2f 62 ee 67 b9 37 83 83 66 45 ..6H^./b.g.7..fE 00:20:33.891 00000200 9b e1 6e d6 c7 11 6f 3a 4b e0 f3 29 c9 1f f8 b2 ..n...o:K..).... 00:20:33.891 00000210 bf 15 74 9f ce 3c a6 1b cc 98 f2 64 02 28 5e 6a ..t..<.....d.(^j 00:20:33.891 00000220 c8 d5 a3 93 fe e3 31 c6 ed 83 37 ba 97 5f 1d 0b ......1...7.._.. 00:20:33.891 00000230 7a ae 6a 01 46 12 6f 52 18 36 99 33 b6 00 e8 8c z.j.F.oR.6.3.... 00:20:33.891 00000240 c2 1f ea 0b 0a 7f 64 4e 73 3d d5 ef 16 b6 d0 ab ......dNs=...... 00:20:33.891 00000250 26 42 5c d8 40 94 29 dd 9a fa bb 54 b3 33 b2 55 &B\.@.)....T.3.U 00:20:33.891 00000260 5f 99 f1 93 40 28 8b 5b 60 07 06 48 2a a9 60 ac _...@(.[`..H*.`. 00:20:33.891 00000270 2d 0a d2 72 65 73 fe 4f 04 c4 d8 4b 06 48 73 9f -..res.O...K.Hs. 00:20:33.891 00000280 c2 6a 0c c0 0a 1d d6 c5 82 0a b0 3f 08 63 d2 71 .j.........?.c.q 00:20:33.891 00000290 32 13 84 ff cb df 86 a9 2b 15 f9 1c e6 df c3 b4 2.......+....... 00:20:33.891 000002a0 da 90 e6 eb 31 c2 cd 1c 0a 2b 49 40 05 31 98 82 ....1....+I@.1.. 00:20:33.891 000002b0 e9 6c 04 78 3c f4 aa 8b af 13 3b be de 53 29 f9 .l.x<.....;..S). 00:20:33.891 000002c0 14 da 96 8c 14 9e d3 32 aa 85 92 c4 da 74 92 e2 .......2.....t.. 00:20:33.891 000002d0 c9 1a 0e 3e 9e cf 17 eb b4 ea 30 c1 1f 77 c8 7a ...>......0..w.z 00:20:33.891 000002e0 7b 88 0f 9f 9b 47 a1 19 85 dc 31 4a 53 aa 1f cd {....G....1JS... 00:20:33.891 000002f0 8a 83 c1 ca d5 86 da 96 b5 42 19 ca eb 9e e0 f5 .........B...... 00:20:33.891 00000300 f9 72 ec ee ff b6 d8 4e 44 ed 01 3d ee e9 80 f0 .r.....ND..=.... 00:20:33.891 00000310 2a fa 24 6b 46 5c 2a 15 66 2c 37 08 0c 3f af 2b *.$kF\*.f,7..?.+ 00:20:33.891 00000320 bb 12 b0 e6 d3 45 14 d6 8d 9f 5e 40 51 5c 3f 49 .....E....^@Q\?I 00:20:33.891 00000330 c8 31 f0 10 17 84 d9 f5 73 98 93 fe 8c 35 81 ed .1......s....5.. 00:20:33.891 00000340 bc a2 9f ba b7 48 37 b8 e2 06 3f e1 de ae c4 8e .....H7...?..... 00:20:33.891 00000350 94 73 8f 11 8d c8 38 08 06 8d af f1 84 d4 37 c9 .s....8.......7. 00:20:33.891 00000360 69 7a 63 be 7f 43 87 b8 40 20 65 5e 5d 7a 46 20 izc..C..@ e^]zF 00:20:33.891 00000370 5d 59 6e 80 1c 10 12 96 cf d1 7f a5 30 58 9c ec ]Yn.........0X.. 
00:20:33.891 00000380 1f 5b 4a ff 70 82 7e 5d 69 9c c9 90 43 46 45 4d .[J.p.~]i...CFEM 00:20:33.891 00000390 ca 65 ce fa d4 df dc 10 fc 19 8e 16 04 0f 54 2b .e............T+ 00:20:33.891 000003a0 73 33 cf 47 d6 14 c1 ca 79 98 70 4f 46 86 26 74 s3.G....y.pOF.&t 00:20:33.891 000003b0 08 1b be 4a 4c 54 f6 e7 c1 71 6e 42 b0 c6 6c d1 ...JLT...qnB..l. 00:20:33.891 000003c0 15 af da 1f b7 9d 8c 00 0c 80 ca 99 cf ee b2 e7 ................ 00:20:33.891 000003d0 ed 30 69 c5 25 e2 af 98 e2 8f 04 55 ba 55 1b 2d .0i.%......U.U.- 00:20:33.891 000003e0 5d 89 4f e1 fc 66 5a db 3a ab 90 52 5c f7 ee 44 ].O..fZ.:..R\..D 00:20:33.891 000003f0 a6 13 a6 5e c8 02 81 65 9f 65 82 9d 9e 59 87 b5 ...^...e.e...Y.. 00:20:33.891 host pubkey: 00:20:33.891 00000000 16 9a 33 e3 ca 33 9d 2d ce 9a f8 a2 b4 80 28 4b ..3..3.-......(K 00:20:33.892 00000010 8d 8d 55 fd 04 d4 7d 02 1b 84 cf 68 45 91 c0 f2 ..U...}....hE... 00:20:33.892 00000020 39 68 0f 05 0e 27 35 e9 05 37 cf 81 66 03 8d ba 9h...'5..7..f... 00:20:33.892 00000030 16 84 cc d5 27 5b 8f 85 59 1b 04 bf 61 81 ff 6b ....'[..Y...a..k 00:20:33.892 00000040 25 29 9f a1 b5 70 15 7d 05 e1 92 f2 21 78 06 c9 %)...p.}....!x.. 00:20:33.892 00000050 00 3a d0 4f 4d 8f f3 87 64 d4 a6 d3 a0 d0 dc 62 .:.OM...d......b 00:20:33.892 00000060 3c ea 60 cb 66 4f e3 b3 26 e4 86 1a df 09 8d f9 <.`.fO..&....... 00:20:33.892 00000070 50 e3 87 46 17 aa fc 7d da ce 36 93 80 c9 8f 5f P..F...}..6...._ 00:20:33.892 00000080 d2 6d 06 ec f3 c0 6d 9f 31 cc 26 0e 4b 72 cc 1e .m....m.1.&.Kr.. 00:20:33.892 00000090 95 55 04 ce 6d cb 93 5c 2c 4c 85 b7 0f 07 f5 0c .U..m..\,L...... 00:20:33.892 000000a0 ec b0 89 49 55 61 ac 90 97 75 b5 ef 49 f5 ee 15 ...IUa...u..I... 00:20:33.892 000000b0 76 53 5d 91 10 77 d9 dd 00 9f 49 3e 4b af 7c 95 vS]..w....I>K.|. 00:20:33.892 000000c0 23 67 fd 7f c4 a5 d0 a1 4e cf 0a f1 cf 27 c8 44 #g......N....'.D 00:20:33.892 000000d0 cb d4 9f 0f de e1 5c 49 ff 65 bb a1 21 52 b1 0b ......\I.e..!R.. 00:20:33.892 000000e0 26 1c b5 1f fb fa b9 4a 0b a1 e4 d5 37 6a 3d d6 &......J....7j=. 00:20:33.892 000000f0 42 6d 43 6c 41 da 23 ad 60 e3 09 6e e5 1d ed ba BmClA.#.`..n.... 00:20:33.892 00000100 b8 2d 9b 5c 95 1c ca c6 07 10 7f 6e a5 03 08 35 .-.\.......n...5 00:20:33.892 00000110 04 7f f6 f8 f8 9d bd 84 b6 a1 0f 4d a6 5b 65 23 ...........M.[e# 00:20:33.892 00000120 0d 15 bc 38 2a ae 46 69 69 b8 e9 fe 5b 04 a1 41 ...8*.Fii...[..A 00:20:33.892 00000130 52 6f 34 67 89 c3 8f 0a 13 1b 39 ec 26 53 6b 11 Ro4g......9.&Sk. 00:20:33.892 00000140 d5 2b b8 9b 28 a7 a4 e1 8e 3a fb fa 1f d0 2a 13 .+..(....:....*. 00:20:33.892 00000150 10 a1 89 35 9d 23 94 72 b7 82 b3 f2 12 9b 52 31 ...5.#.r......R1 00:20:33.892 00000160 08 dc 21 a1 d9 ce 82 4b 45 be 89 31 74 ee 67 cf ..!....KE..1t.g. 00:20:33.892 00000170 54 e0 6c 6f af 1c 13 28 a3 1e ab f7 31 46 82 6c T.lo...(....1F.l 00:20:33.892 00000180 49 da ee e3 2b 20 63 bb 98 01 05 88 8b 6a e0 ae I...+ c......j.. 00:20:33.892 00000190 3c d8 fb 04 b5 5b 63 d5 fd ba b7 cc d0 13 45 ea <....[c.......E. 00:20:33.892 000001a0 d0 51 f3 77 c8 5b 2b 0a d9 a8 7e bb 2c a7 b8 9c .Q.w.[+...~.,... 00:20:33.892 000001b0 be 94 84 ca c2 a9 2a e1 a8 db d0 72 de 0c 39 8e ......*....r..9. 00:20:33.892 000001c0 df 00 0a 25 da 1b 70 a8 b4 0e 03 7a 33 d1 3a 77 ...%..p....z3.:w 00:20:33.892 000001d0 08 43 2e 6d 82 92 6a c1 d6 a8 82 29 6e 17 f5 84 .C.m..j....)n... 00:20:33.892 000001e0 16 84 00 2d 88 08 7e 6b de 0a 74 99 95 fd b3 8d ...-..~k..t..... 
00:20:33.892 000001f0 84 d5 92 35 27 17 3f 91 ad 31 7a c0 4c ae 15 35 ...5'.?..1z.L..5 00:20:33.892 00000200 4b ee 38 7d b6 b0 38 77 a2 f1 b9 35 38 8d 73 d4 K.8}..8w...58.s. 00:20:33.892 00000210 99 b0 00 21 22 50 fb 2f 81 59 61 c4 b8 99 44 87 ...!"P./.Ya...D. 00:20:33.892 00000220 87 b1 85 e5 ca bf 5e 22 ab 08 c4 1a c8 d9 9c 40 ......^".......@ 00:20:33.892 00000230 71 3b 91 01 8c a8 93 20 31 d4 80 a2 71 4d 1c e8 q;..... 1...qM.. 00:20:33.892 00000240 3d 4e 90 fe d6 94 9c a4 7f a5 b3 18 a0 cf e5 96 =N.............. 00:20:33.892 00000250 ab 24 ce 3d 1a de 2c 92 80 cb 5d 8c 07 2e 73 07 .$.=..,...]...s. 00:20:33.892 00000260 c1 c0 00 6c c0 7c 74 2a bb 29 a7 be e8 94 c1 0e ...l.|t*.)...... 00:20:33.892 00000270 06 04 59 27 77 ae 91 44 76 ff 9f f3 98 f2 b0 e1 ..Y'w..Dv....... 00:20:33.892 00000280 1e 34 7a cd 73 47 2b 2c 61 20 6a be b9 5b 85 48 .4z.sG+,a j..[.H 00:20:33.892 00000290 69 41 82 90 b2 a9 33 df fc ac 59 6c c7 26 16 f9 iA....3...Yl.&.. 00:20:33.892 000002a0 5d 52 f1 7b 8a 6e 68 dd 3c 20 74 3e 58 fd 5f ea ]R.{.nh.< t>X._. 00:20:33.892 000002b0 4f d0 a9 26 9e c0 ad d8 b0 e1 ae 33 a8 0c 78 6f O..&.......3..xo 00:20:33.892 000002c0 f8 0e 45 de b8 a1 37 7f ae 1e ba 87 2e 01 d3 9c ..E...7......... 00:20:33.892 000002d0 f4 32 a7 73 a6 69 85 62 9e ca 4b 5e df 49 21 9e .2.s.i.b..K^.I!. 00:20:33.892 000002e0 0e ab b9 78 29 53 21 36 d0 3c 5b ad 78 da 49 62 ...x)S!6.<[.x.Ib 00:20:33.892 000002f0 a0 f5 f7 48 eb 75 05 7e 7e 4b f2 1f 91 5f 37 6e ...H.u.~~K..._7n 00:20:33.892 00000300 87 72 65 68 80 03 fd bf f8 dc 2d d7 e4 6f 0a c1 .reh......-..o.. 00:20:33.892 00000310 b4 24 03 24 14 35 94 1f 68 12 45 ee fb 19 ca 51 .$.$.5..h.E....Q 00:20:33.892 00000320 83 94 45 4d 00 bf 21 eb 2c ca c9 4b e5 5f ad 62 ..EM..!.,..K._.b 00:20:33.892 00000330 53 fe 0a c0 74 c4 c5 32 a8 69 4d a1 bd dd 98 2d S...t..2.iM....- 00:20:33.892 00000340 24 26 dc 12 76 91 25 5e c3 67 c1 28 53 03 76 fb $&..v.%^.g.(S.v. 00:20:33.892 00000350 49 a3 ae de 21 bc a2 69 d3 b2 22 07 ea ca 4d 47 I...!..i.."...MG 00:20:33.892 00000360 07 9f 78 f4 b6 30 7f 52 27 e8 9e 2f fa 1f fe 49 ..x..0.R'../...I 00:20:33.892 00000370 fd 2f c3 f8 c2 b7 9f 2e b3 1e 1d 22 af 61 b1 92 ./.........".a.. 00:20:33.892 00000380 c8 f5 6e 08 c7 dd 06 4c c1 76 a4 c6 db 87 34 76 ..n....L.v....4v 00:20:33.892 00000390 5f a8 e0 5d 73 14 83 df e2 e1 cc 80 d2 5b 9e a2 _..]s........[.. 00:20:33.892 000003a0 9b de e4 7f a4 b3 26 96 e1 0a 4b 4f 38 fc 36 2e ......&...KO8.6. 00:20:33.892 000003b0 81 4b 8a 71 f4 6b fe cd 71 13 68 40 92 40 8f 1b .K.q.k..q.h@.@.. 00:20:33.892 000003c0 b9 8f 10 81 f4 2d e3 1b 02 88 03 58 32 82 1a 89 .....-.....X2... 00:20:33.892 000003d0 f5 36 4e f7 9d a3 e0 eb 70 82 2a 42 2c a3 4a f7 .6N.....p.*B,.J. 00:20:33.892 000003e0 7d fb 4f 85 a2 ea d7 14 5b 2e 09 cb 67 b9 2c 16 }.O.....[...g.,. 00:20:33.892 000003f0 75 af 2d d2 c5 66 c2 7b ee 4e ba 0d 5a 09 9c b3 u.-..f.{.N..Z... 00:20:33.892 dh secret: 00:20:33.892 00000000 b8 e1 2d 9c a1 0d 49 e8 92 41 7f 37 ef 94 31 13 ..-...I..A.7..1. 00:20:33.892 00000010 a6 43 0d 56 11 de 21 bf 2d 75 35 e5 b3 21 8d eb .C.V..!.-u5..!.. 00:20:33.892 00000020 96 1b 1c 35 d0 35 c3 c7 35 9a 64 bc 56 cf 1f b0 ...5.5..5.d.V... 00:20:33.892 00000030 75 19 75 58 c8 ce 9a e0 1c 2b dd 70 95 6f ec 52 u.uX.....+.p.o.R 00:20:33.892 00000040 6c 65 14 cf 52 9b 3a 6d ab d4 d4 66 f9 06 5c be le..R.:m...f..\. 00:20:33.892 00000050 b0 0b 69 ee ef f1 6c 51 af 5b 6b 37 47 8c 26 e9 ..i...lQ.[k7G.&. 00:20:33.892 00000060 c0 0b 31 66 58 30 8c 30 40 17 d4 46 2e b6 3b 12 ..1fX0.0@..F..;. 
00:20:33.892 00000070 6d 48 56 bc 9e 81 42 6c 2b d5 44 42 98 a3 89 9c mHV...Bl+.DB.... 00:20:33.892 00000080 4e e8 ff 0b cd b2 de 28 ae f3 f0 13 e0 60 7f af N......(.....`.. 00:20:33.892 00000090 5a 81 66 d3 27 98 8e 7c 3e ee 81 58 e6 e1 d0 cc Z.f.'..|>..X.... 00:20:33.892 000000a0 6d 6a 90 29 b4 dc 4a db d6 a8 bf d7 4c 46 c8 88 mj.)..J.....LF.. 00:20:33.892 000000b0 be 96 c7 6d 5e 65 56 e8 e4 7b 0b 25 5d 53 3c 11 ...m^eV..{.%]S<. 00:20:33.892 000000c0 e6 3e 0c 64 5c d3 d6 39 e8 35 e1 c8 0d d2 71 62 .>.d\..9.5....qb 00:20:33.892 000000d0 91 d7 2f 0c 46 4e de 89 99 9a df ef 65 d9 d6 45 ../.FN......e..E 00:20:33.892 000000e0 97 20 2b dd 2e e1 fc 8c fd 7b d8 04 f0 f3 54 10 . +......{....T. 00:20:33.892 000000f0 fc db 78 d2 47 e9 4b c1 58 85 81 1d c1 26 ae d4 ..x.G.K.X....&.. 00:20:33.892 00000100 d3 1a c6 71 27 5e 31 bd a5 5d f8 d7 2c 50 da 1a ...q'^1..]..,P.. 00:20:33.892 00000110 22 ab 08 a7 3b a6 79 0f ed 87 ef 76 d4 f9 ff 1e "...;.y....v.... 00:20:33.892 00000120 6d 9e 41 f4 53 cd 83 ef 8f 7a 74 f1 6f c0 22 68 m.A.S....zt.o."h 00:20:33.892 00000130 07 e7 6a 4a bf ca 88 dc 42 87 6a 7f db 8a a8 d2 ..jJ....B.j..... 00:20:33.892 00000140 cf bd 53 a2 ff 5b 9d 74 2c 58 d2 93 8c 97 56 34 ..S..[.t,X....V4 00:20:33.892 00000150 56 72 e7 ef 78 58 27 9a 45 06 8d 12 a5 67 7d 98 Vr..xX'.E....g}. 00:20:33.892 00000160 6d a2 f4 78 54 1a 03 9a 5e c1 4c 4d dd e4 68 df m..xT...^.LM..h. 00:20:33.892 00000170 f6 96 38 55 99 88 e7 2f 3c e8 c4 e8 17 39 f7 36 ..8U.../<....9.6 00:20:33.892 00000180 dd 39 08 03 62 25 d8 89 30 2d b9 9a fd b8 9b 35 .9..b%..0-.....5 00:20:33.892 00000190 75 09 1e 8a 2c 64 ed 74 60 b6 69 d7 dd 1a fa 49 u...,d.t`.i....I 00:20:33.892 000001a0 c7 82 a3 3f 43 ae 7d cf c6 16 c2 80 b2 42 c4 6c ...?C.}......B.l 00:20:33.892 000001b0 87 e3 5b 85 9a 60 c6 c9 2b 23 9e 3a de 7b 3e 4f ..[..`..+#.:.{>O 00:20:33.892 000001c0 fd 54 13 f5 7d ba a4 c5 34 05 08 0e 01 54 41 0f .T..}...4....TA. 00:20:33.892 000001d0 69 6f ce 1d 4c ed ce ac 57 ce e9 ee 58 2c f4 39 io..L...W...X,.9 00:20:33.892 000001e0 19 86 45 9d 1c 67 e1 86 38 a6 d5 38 5a 5c f2 8c ..E..g..8..8Z\.. 00:20:33.892 000001f0 4c 71 f1 3a 01 4e 68 84 1e bc 47 1c c1 00 e0 92 Lq.:.Nh...G..... 00:20:33.892 00000200 14 fa 0c 30 c5 49 23 bc e9 22 1c c9 ab 62 cf 11 ...0.I#.."...b.. 00:20:33.892 00000210 cd d3 94 a5 94 44 87 d8 5e fc 5a cf a6 45 15 b5 .....D..^.Z..E.. 00:20:33.892 00000220 3c 3a 1b 98 fb e3 ed 52 5c 32 b8 b7 94 2c 00 6c <:.....R\2...,.l 00:20:33.892 00000230 46 dc 09 12 da 6f 8d 9f 63 15 8a 4a 4b a1 1a 87 F....o..c..JK... 00:20:33.892 00000240 7b ca b1 82 a7 d9 c8 60 0e 44 ac 9f a8 bd 0c 3a {......`.D.....: 00:20:33.892 00000250 76 ca e5 a1 be 83 5c bd 4d 7c 0e bd 7a 84 2e 99 v.....\.M|..z... 00:20:33.892 00000260 14 26 1e 3a 88 9c 91 e4 ea a5 59 3a 6e b1 b5 15 .&.:......Y:n... 00:20:33.892 00000270 89 7f ae bf 56 2e 5c 5f 9f 90 13 9e e0 32 26 6b ....V.\_.....2&k 00:20:33.892 00000280 95 81 8f 81 94 d8 52 82 0d 70 6f 9b f9 39 5e 47 ......R..po..9^G 00:20:33.892 00000290 53 a9 85 df cb a1 f1 57 d9 31 39 98 2d 24 5f ed S......W.19.-$_. 00:20:33.892 000002a0 9d 1e 42 bc 3e f9 3b c9 50 3f d5 26 99 ca 8c 39 ..B.>.;.P?.&...9 00:20:33.892 000002b0 91 1d 46 0a f7 71 f1 ab f2 eb 38 2e c0 68 08 a2 ..F..q....8..h.. 
00:20:33.892 000002c0 44 58 c0 a5 f6 58 1d 2b 6f c4 dc ca 49 80 b1 64 DX...X.+o...I..d 00:20:33.892 000002d0 89 f1 99 a2 a2 6e 19 bb 5b 5d 29 9d 4a f4 bd 7c .....n..[]).J..| 00:20:33.892 000002e0 37 5f c1 e5 c6 a0 09 b4 b2 35 f0 e5 70 74 c4 56 7_.......5..pt.V 00:20:33.892 000002f0 09 ef 5b d1 79 43 96 3a 51 22 b6 0c f5 56 7e a4 ..[.yC.:Q"...V~. 00:20:33.892 00000300 5d a4 10 a1 1d df ee 6b 8e cd 61 bd 47 74 3a e3 ]......k..a.Gt:. 00:20:33.892 00000310 ab 5b 8a 4b b5 81 d3 69 b9 98 86 69 b3 dd b5 0c .[.K...i...i.... 00:20:33.892 00000320 33 79 ce 1f 04 a8 a7 ed b5 02 55 a7 f6 7f 92 17 3y........U..... 00:20:33.892 00000330 d3 b9 0e 80 05 4f ad b5 3f b9 14 c3 73 99 d3 21 .....O..?...s..! 00:20:33.892 00000340 63 79 4b 63 c9 e9 02 ff 08 78 78 1a 49 77 44 f6 cyKc.....xx.IwD. 00:20:33.892 00000350 fe 34 58 1b ce 94 89 8b 7e e9 f5 f3 d1 8b 5c 00 .4X.....~.....\. 00:20:33.892 00000360 ae 69 6e 0b c5 7b 94 67 bc e2 cb 12 be 46 6c 5f .in..{.g.....Fl_ 00:20:33.892 00000370 ec 79 8d b3 5e 44 4c 50 09 6d 8b b2 5d a0 a1 f3 .y..^DLP.m..]... 00:20:33.892 00000380 1d 4f 40 ec 9b e9 ae 12 d1 65 a9 ca 4a d5 90 92 .O@......e..J... 00:20:33.892 00000390 74 d0 b8 4d ac 51 6d 0f 81 21 88 82 fa 05 b8 37 t..M.Qm..!.....7 00:20:33.892 000003a0 8c 0e 6e ef 3f b7 76 45 6d 12 5e f7 b0 d8 08 0b ..n.?.vEm.^..... 00:20:33.892 000003b0 b2 36 2a a7 3e 07 a8 da 96 8e f3 b6 f3 f8 7f 6a .6*.>..........j 00:20:33.892 000003c0 b6 c3 e3 99 4d 9f 66 19 fa ec bd 7e 31 3e 52 67 ....M.f....~1>Rg 00:20:33.892 000003d0 f0 21 56 25 0f 29 ee 51 f1 b9 21 12 33 27 c7 e2 .!V%.).Q..!.3'.. 00:20:33.892 000003e0 08 72 f8 a1 6a 59 7a 0c 20 04 4e 71 0b 8c cc 51 .r..jYz. .Nq...Q 00:20:33.892 000003f0 96 d2 58 30 02 89 08 43 db 13 61 a1 d7 71 0c 84 ..X0...C..a..q.. 00:20:33.892 [2024-09-27 15:25:13.297204] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=1, dhgroup=5, seq=3428451741, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.892 [2024-09-27 15:25:13.358397] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.892 [2024-09-27 15:25:13.358442] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.892 [2024-09-27 15:25:13.358460] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.892 [2024-09-27 15:25:13.358491] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.892 [2024-09-27 15:25:13.358502] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.892 [2024-09-27 15:25:13.463669] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.892 [2024-09-27 15:25:13.463687] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.892 [2024-09-27 15:25:13.463694] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.892 [2024-09-27 15:25:13.463704] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] 
auth state: await-negotiate 00:20:33.892 [2024-09-27 15:25:13.463758] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.892 ctrlr pubkey: 00:20:33.892 00000000 38 b1 a5 3d d9 95 d9 7e c3 4a f8 29 87 b0 c9 e4 8..=...~.J.).... 00:20:33.892 00000010 86 c5 2a 90 b7 b0 1d 9e aa 07 69 cd d1 6f 6a c2 ..*.......i..oj. 00:20:33.892 00000020 33 4c f0 61 b6 87 dd 16 ff e2 aa a2 0f ea cc b1 3L.a............ 00:20:33.892 00000030 82 59 90 eb 75 87 62 6d e5 a9 10 b7 9e 65 bc 8f .Y..u.bm.....e.. 00:20:33.892 00000040 6e 98 59 b7 76 8a 50 84 a5 84 c2 d1 c3 d1 8a c0 n.Y.v.P......... 00:20:33.893 00000050 a6 aa a0 6e c4 5f fe 67 66 d3 12 23 81 49 e1 3b ...n._.gf..#.I.; 00:20:33.893 00000060 e1 da ea a0 0f 3a 5b 7c 2a 5e 66 b5 57 26 c3 1d .....:[|*^f.W&.. 00:20:33.893 00000070 d7 46 c8 69 8a 21 f2 0d 29 be 98 74 2d f0 ad 3b .F.i.!..)..t-..; 00:20:33.893 00000080 87 b3 d6 af a2 9f b8 fb 57 7b 4c 36 76 99 62 af ........W{L6v.b. 00:20:33.893 00000090 6f b1 71 f2 41 c0 ec fe 0d 1c 18 9e 7b f4 a6 c4 o.q.A.......{... 00:20:33.893 000000a0 73 20 bc 9b 75 8b a6 17 7b 8d 37 34 65 41 66 b3 s ..u...{.74eAf. 00:20:33.893 000000b0 8e 91 5d f6 20 04 3a 38 34 b0 a8 21 73 a4 9e 61 ..]. .:84..!s..a 00:20:33.893 000000c0 15 0d 56 7f 26 a5 7a da 31 16 26 17 ab cc b6 4c ..V.&.z.1.&....L 00:20:33.893 000000d0 9e 22 84 a3 57 f3 d9 2b 0f 89 2a 66 2f 44 1d e6 ."..W..+..*f/D.. 00:20:33.893 000000e0 d2 7f 62 06 94 38 90 70 8f 6e 8b da 7e f9 4e 81 ..b..8.p.n..~.N. 00:20:33.893 000000f0 f5 5c ce 36 b9 a1 43 a9 52 79 45 51 23 15 f7 18 .\.6..C.RyEQ#... 00:20:33.893 00000100 e8 4c 46 69 01 99 01 49 1b 93 9f 3e c7 65 ee 5a .LFi...I...>.e.Z 00:20:33.893 00000110 8d 80 28 74 98 39 16 01 75 80 c4 d0 c4 b7 8f ee ..(t.9..u....... 00:20:33.893 00000120 c1 e2 bf 8a 69 c3 d4 f3 7d 5a dd 60 1c a3 5a a0 ....i...}Z.`..Z. 00:20:33.893 00000130 f7 0c 34 34 0e 34 9e 5b 4c c4 cb 38 c3 fd e7 d4 ..44.4.[L..8.... 00:20:33.893 00000140 99 f8 03 47 07 f6 1f b8 fb 1c 8a 7e d3 57 6a 3b ...G.......~.Wj; 00:20:33.893 00000150 f9 ef 91 39 bb b7 14 ef 75 3d 12 07 47 37 eb f3 ...9....u=..G7.. 00:20:33.893 00000160 06 bf 4b 3a 22 be d5 1a 2b e4 68 a9 0a c9 92 22 ..K:"...+.h...." 00:20:33.893 00000170 f7 1e c2 c2 60 77 37 70 e9 ee 22 3e ae ef c4 98 ....`w7p..">.... 00:20:33.893 00000180 44 24 94 e8 74 1f 6e b1 bd 1d cf 2b 39 d3 63 f5 D$..t.n....+9.c. 00:20:33.893 00000190 a7 4b 6f fe 5c 10 ac 36 22 86 27 e9 22 b6 29 68 .Ko.\..6".'.".)h 00:20:33.893 000001a0 23 3a c2 93 a6 dd 43 d5 3a 70 f8 d0 04 fd 63 e7 #:....C.:p....c. 00:20:33.893 000001b0 19 32 a2 f2 b2 15 e6 00 89 65 bb 89 76 0c 19 76 .2.......e..v..v 00:20:33.893 000001c0 e0 a9 3a 56 55 03 9f 31 7d 6f d6 67 1e 5c 04 b3 ..:VU..1}o.g.\.. 00:20:33.893 000001d0 ba 61 be f2 0f ed 1b 18 c4 c9 eb aa 9e 26 02 4e .a...........&.N 00:20:33.893 000001e0 58 b2 d5 27 cb 81 1b e3 07 8c 50 40 da be d4 73 X..'......P@...s 00:20:33.893 000001f0 cb 0c 36 48 5e cc 2f 62 ee 67 b9 37 83 83 66 45 ..6H^./b.g.7..fE 00:20:33.893 00000200 9b e1 6e d6 c7 11 6f 3a 4b e0 f3 29 c9 1f f8 b2 ..n...o:K..).... 00:20:33.893 00000210 bf 15 74 9f ce 3c a6 1b cc 98 f2 64 02 28 5e 6a ..t..<.....d.(^j 00:20:33.893 00000220 c8 d5 a3 93 fe e3 31 c6 ed 83 37 ba 97 5f 1d 0b ......1...7.._.. 00:20:33.893 00000230 7a ae 6a 01 46 12 6f 52 18 36 99 33 b6 00 e8 8c z.j.F.oR.6.3.... 00:20:33.893 00000240 c2 1f ea 0b 0a 7f 64 4e 73 3d d5 ef 16 b6 d0 ab ......dNs=...... 
00:20:33.893 00000250 26 42 5c d8 40 94 29 dd 9a fa bb 54 b3 33 b2 55 &B\.@.)....T.3.U 00:20:33.893 00000260 5f 99 f1 93 40 28 8b 5b 60 07 06 48 2a a9 60 ac _...@(.[`..H*.`. 00:20:33.893 00000270 2d 0a d2 72 65 73 fe 4f 04 c4 d8 4b 06 48 73 9f -..res.O...K.Hs. 00:20:33.893 00000280 c2 6a 0c c0 0a 1d d6 c5 82 0a b0 3f 08 63 d2 71 .j.........?.c.q 00:20:33.893 00000290 32 13 84 ff cb df 86 a9 2b 15 f9 1c e6 df c3 b4 2.......+....... 00:20:33.893 000002a0 da 90 e6 eb 31 c2 cd 1c 0a 2b 49 40 05 31 98 82 ....1....+I@.1.. 00:20:33.893 000002b0 e9 6c 04 78 3c f4 aa 8b af 13 3b be de 53 29 f9 .l.x<.....;..S). 00:20:33.893 000002c0 14 da 96 8c 14 9e d3 32 aa 85 92 c4 da 74 92 e2 .......2.....t.. 00:20:33.893 000002d0 c9 1a 0e 3e 9e cf 17 eb b4 ea 30 c1 1f 77 c8 7a ...>......0..w.z 00:20:33.893 000002e0 7b 88 0f 9f 9b 47 a1 19 85 dc 31 4a 53 aa 1f cd {....G....1JS... 00:20:33.893 000002f0 8a 83 c1 ca d5 86 da 96 b5 42 19 ca eb 9e e0 f5 .........B...... 00:20:33.893 00000300 f9 72 ec ee ff b6 d8 4e 44 ed 01 3d ee e9 80 f0 .r.....ND..=.... 00:20:33.893 00000310 2a fa 24 6b 46 5c 2a 15 66 2c 37 08 0c 3f af 2b *.$kF\*.f,7..?.+ 00:20:33.893 00000320 bb 12 b0 e6 d3 45 14 d6 8d 9f 5e 40 51 5c 3f 49 .....E....^@Q\?I 00:20:33.893 00000330 c8 31 f0 10 17 84 d9 f5 73 98 93 fe 8c 35 81 ed .1......s....5.. 00:20:33.893 00000340 bc a2 9f ba b7 48 37 b8 e2 06 3f e1 de ae c4 8e .....H7...?..... 00:20:33.893 00000350 94 73 8f 11 8d c8 38 08 06 8d af f1 84 d4 37 c9 .s....8.......7. 00:20:33.893 00000360 69 7a 63 be 7f 43 87 b8 40 20 65 5e 5d 7a 46 20 izc..C..@ e^]zF 00:20:33.893 00000370 5d 59 6e 80 1c 10 12 96 cf d1 7f a5 30 58 9c ec ]Yn.........0X.. 00:20:33.893 00000380 1f 5b 4a ff 70 82 7e 5d 69 9c c9 90 43 46 45 4d .[J.p.~]i...CFEM 00:20:33.893 00000390 ca 65 ce fa d4 df dc 10 fc 19 8e 16 04 0f 54 2b .e............T+ 00:20:33.893 000003a0 73 33 cf 47 d6 14 c1 ca 79 98 70 4f 46 86 26 74 s3.G....y.pOF.&t 00:20:33.893 000003b0 08 1b be 4a 4c 54 f6 e7 c1 71 6e 42 b0 c6 6c d1 ...JLT...qnB..l. 00:20:33.893 000003c0 15 af da 1f b7 9d 8c 00 0c 80 ca 99 cf ee b2 e7 ................ 00:20:33.893 000003d0 ed 30 69 c5 25 e2 af 98 e2 8f 04 55 ba 55 1b 2d .0i.%......U.U.- 00:20:33.893 000003e0 5d 89 4f e1 fc 66 5a db 3a ab 90 52 5c f7 ee 44 ].O..fZ.:..R\..D 00:20:33.893 000003f0 a6 13 a6 5e c8 02 81 65 9f 65 82 9d 9e 59 87 b5 ...^...e.e...Y.. 00:20:33.893 host pubkey: 00:20:33.893 00000000 66 d5 dc 84 e6 b2 a8 d0 e5 8c 68 94 d8 ac 90 55 f.........h....U 00:20:33.893 00000010 77 70 17 66 58 65 f8 dd d1 77 08 55 de 46 b0 3c wp.fXe...w.U.F.< 00:20:33.893 00000020 a7 f9 4b 99 d4 16 39 74 ca 9d d2 f0 2a b4 01 2e ..K...9t....*... 00:20:33.893 00000030 95 09 27 db b4 82 38 6a 60 2a 4c b5 f2 fd 8f 4f ..'...8j`*L....O 00:20:33.893 00000040 9f 29 12 ae 8b 25 ca d0 45 44 d1 00 4b 6f 94 e4 .)...%..ED..Ko.. 00:20:33.893 00000050 72 e2 4a b1 71 e1 4d e1 3b 5f 9a a1 53 6a 08 bf r.J.q.M.;_..Sj.. 00:20:33.893 00000060 83 f6 6d db c8 40 67 28 b5 c2 56 28 76 26 83 14 ..m..@g(..V(v&.. 00:20:33.893 00000070 2b 02 3d 8c ae 3b 8f 2a e4 0a 1c 33 96 91 06 7c +.=..;.*...3...| 00:20:33.893 00000080 4e b2 f7 50 bd de 84 8b 5f 15 61 eb 7f c3 48 d4 N..P...._.a...H. 00:20:33.893 00000090 08 82 67 86 d9 20 54 17 1c 65 ad 95 cc 0e 9d 37 ..g.. T..e.....7 00:20:33.893 000000a0 ee ca e4 8a 44 ee 35 d0 46 6c ae 89 98 f1 e5 e8 ....D.5.Fl...... 00:20:33.893 000000b0 22 de b4 e5 31 9a bc b3 61 9a d6 10 8e a8 64 c3 "...1...a.....d. 
00:20:33.893 000000c0 34 6c 9b c0 cd 9e ea 19 15 96 8d 46 d7 d4 76 24 4l.........F..v$ 00:20:33.893 000000d0 6c 14 15 f7 dd 92 64 6f 73 a8 4a 49 af 7e 58 f0 l.....dos.JI.~X. 00:20:33.893 000000e0 bc 10 7a 49 92 c1 93 09 94 32 5e ee 6e d5 78 2c ..zI.....2^.n.x, 00:20:33.893 000000f0 64 73 38 cf b5 77 a1 e4 16 af 7e 1c d2 11 1a 3b ds8..w....~....; 00:20:33.893 00000100 9e da f8 43 78 c5 07 eb f2 ed e9 79 b2 24 8a ef ...Cx......y.$.. 00:20:33.893 00000110 a2 e8 2f 54 7c 3b 3a 3b 36 45 2c 21 79 c9 08 96 ../T|;:;6E,!y... 00:20:33.893 00000120 1c 9c be c5 37 49 fa 86 bf a4 a2 e7 a1 6c 69 3f ....7I.......li? 00:20:33.893 00000130 e1 61 26 03 58 3e b1 e3 2b d6 1e 2b be 76 32 a2 .a&.X>..+..+.v2. 00:20:33.893 00000140 31 36 da e4 e7 4e 41 ae d5 34 15 b8 ff b8 58 0f 16...NA..4....X. 00:20:33.893 00000150 f3 0a 0a 61 eb 58 4f 75 a0 2b f6 ae 3e db a9 9a ...a.XOu.+..>... 00:20:33.893 00000160 96 7f 54 67 ff 41 f3 f0 6f af e3 7d a5 cf 98 e5 ..Tg.A..o..}.... 00:20:33.893 00000170 e7 0b 84 ef 3e 56 78 aa 67 f6 63 c3 22 d7 1a 0f ....>Vx.g.c."... 00:20:33.893 00000180 a0 a5 23 06 62 6d 04 c2 00 ec 17 f9 63 d1 d3 20 ..#.bm......c.. 00:20:33.893 00000190 68 45 ef 6c 65 65 a2 6e a5 61 c2 8b 84 e9 7f b9 hE.lee.n.a...... 00:20:33.893 000001a0 9f 4b 47 0d bd 44 6c ea a4 57 dd fc 18 fa 18 73 .KG..Dl..W.....s 00:20:33.893 000001b0 d2 54 12 8d 56 98 a7 ce 56 00 ce e5 3c e6 28 76 .T..V...V...<.(v 00:20:33.893 000001c0 5b b7 14 f9 0b 80 9a 72 dd ce ba 29 8b 35 42 ab [......r...).5B. 00:20:33.893 000001d0 06 be 29 bd 88 e3 94 d8 a6 88 55 c9 10 71 cc 4f ..).......U..q.O 00:20:33.893 000001e0 6d ae 23 ee ad d0 49 b4 f6 18 d2 e2 6f 11 96 c3 m.#...I.....o... 00:20:33.893 000001f0 7c 0f a5 e9 4f 85 92 87 36 e7 d3 1a 67 80 7e d4 |...O...6...g.~. 00:20:33.893 00000200 cb f3 ee b0 20 b3 c2 78 87 55 6e 6c 94 06 b8 c9 .... ..x.Unl.... 00:20:33.893 00000210 d1 09 ba f7 ea 2f 82 fb f5 f7 e4 ca 2d 2f 3e 1f ...../......-/>. 00:20:33.893 00000220 c4 20 6b 93 e3 92 04 b5 eb 5f ce ed 4e ab b9 a9 . k......_..N... 00:20:33.893 00000230 ea 3b 37 c8 9a 42 eb 34 72 d5 37 22 45 57 c4 33 .;7..B.4r.7"EW.3 00:20:33.893 00000240 bf b6 35 ee 81 4c 0d 78 80 b2 38 ab 90 8c ea 79 ..5..L.x..8....y 00:20:33.893 00000250 90 f1 e2 fc 55 2d db 33 f4 63 d5 29 47 5a 7e 75 ....U-.3.c.)GZ~u 00:20:33.893 00000260 78 64 c5 3d 17 9a 1a 0d 5f 0e 08 d9 20 56 e4 1d xd.=...._... V.. 00:20:33.893 00000270 c6 05 b1 bb f0 dc cc c1 d0 77 73 72 97 68 c1 b8 .........wsr.h.. 00:20:33.893 00000280 f1 99 6d 20 d2 9d e7 c1 d8 06 82 3b 72 1c 71 a4 ..m .......;r.q. 00:20:33.893 00000290 a2 15 6c 86 1d 7e 7c 5e 2f c4 c1 9e f9 b9 99 e6 ..l..~|^/....... 00:20:33.893 000002a0 6e 43 d5 cb 17 fb 80 29 c4 35 5b 2f b1 3e de a2 nC.....).5[/.>.. 00:20:33.893 000002b0 07 c2 08 65 1d a1 bc 8f 54 50 21 2a 4d 84 3b fb ...e....TP!*M.;. 00:20:33.893 000002c0 2c 81 01 7f 2e 38 dd 3d 9f b5 bd be 25 47 a5 e9 ,....8.=....%G.. 00:20:33.893 000002d0 a8 e8 ab a0 9a 1f a2 73 19 f1 bf 5c 2e 3a c6 f2 .......s...\.:.. 00:20:33.893 000002e0 93 f6 55 e5 4c d9 aa c6 3a b1 5e c4 e7 cc b5 82 ..U.L...:.^..... 00:20:33.893 000002f0 60 0f f7 6e 0e 08 1c 89 9f 1b 22 9f 78 af 00 db `..n......".x... 00:20:33.893 00000300 2a 28 41 ab 90 9f 57 3e 56 44 6c 18 e1 f5 ce c0 *(A...W>VDl..... 00:20:33.893 00000310 83 76 2e 5c cf fa f2 08 3d 58 6c 3d be f3 92 75 .v.\....=Xl=...u 00:20:33.893 00000320 72 fc de bf fd 94 03 11 be 5f 1f 11 66 05 6a ef r........_..f.j. 00:20:33.893 00000330 2e 54 bc cb f4 41 f9 4e 2f a7 50 0a df eb 75 9a .T...A.N/.P...u. 
00:20:33.893 00000340 a5 17 fb f2 fd d2 63 be 73 7b 8b 8d 55 f2 1a 88 ......c.s{..U... 00:20:33.893 00000350 d1 7c 50 4a 96 bf 12 e2 87 39 b4 de 7e 9b 85 bd .|PJ.....9..~... 00:20:33.893 00000360 29 9f 3c 07 05 90 29 bb e3 d0 71 ef 6d 74 f9 ce ).<...)...q.mt.. 00:20:33.893 00000370 2c 96 7e d6 1c 20 18 f3 39 20 de 95 b2 fe 45 8d ,.~.. ..9 ....E. 00:20:33.893 00000380 1a 5c c1 77 83 c6 f9 db 55 69 de dc 54 51 a0 3c .\.w....Ui..TQ.< 00:20:33.893 00000390 6c 46 50 fe 1f 12 b8 73 e3 ef 04 2b d7 ca c8 f9 lFP....s...+.... 00:20:33.893 000003a0 c0 b3 4e 39 2a 33 a4 b8 41 8c 8f 47 91 fd 19 bf ..N9*3..A..G.... 00:20:33.893 000003b0 b8 85 31 03 f6 c9 89 11 eb 9d 40 de 1b d8 c2 69 ..1.......@....i 00:20:33.893 000003c0 3e bc 69 d5 7b 28 9e 03 4f eb ab 7b c4 d8 a0 f0 >.i.{(..O..{.... 00:20:33.893 000003d0 12 b1 d2 49 f4 a7 c5 a0 0e e8 11 29 ec ab 9e b5 ...I.......).... 00:20:33.893 000003e0 fb 20 06 8d 53 29 9d ff 4e a2 5f 58 e6 88 eb ce . ..S)..N._X.... 00:20:33.893 000003f0 59 6c 95 32 f1 db 10 38 17 44 65 ee 5c 3c bf 83 Yl.2...8.De.\<.. 00:20:33.893 dh secret: 00:20:33.893 00000000 fe 9e 8b 03 80 5c cb 01 db c6 b1 75 bd b4 ca 06 .....\.....u.... 00:20:33.893 00000010 7d 7d 94 13 f1 0d 9b 55 33 ae 2d 5e 43 50 60 bd }}.....U3.-^CP`. 00:20:33.893 00000020 c8 f9 b5 8e c4 0f 2f 79 05 a3 26 7e e2 ed 98 63 ....../y..&~...c 00:20:33.893 00000030 11 b8 e6 34 4b f1 c9 13 e1 fa 3a c3 2b b6 18 1e ...4K.....:.+... 00:20:33.893 00000040 dd 70 d5 6b 13 8d e0 14 08 17 9b c5 c1 de b0 13 .p.k............ 00:20:33.893 00000050 cb db b4 68 3a 42 99 1c 15 d1 1e f0 fa 11 e6 1c ...h:B.......... 00:20:33.893 00000060 48 00 1f 9a 52 5e bc 59 21 1e be df e6 5b a5 c7 H...R^.Y!....[.. 00:20:33.893 00000070 f5 4c cb ca a2 cb c9 34 85 5a ce 57 91 ef be 96 .L.....4.Z.W.... 00:20:33.893 00000080 43 81 3b 17 84 3e 6a 9d b6 3c 6d 6b c2 7c d4 22 C.;..>j......p.40.p 00:20:33.894 000001a0 92 7b 62 ad ab 3e e4 a1 e0 c0 88 5a 51 25 8c a6 .{b..>.....ZQ%.. 00:20:33.894 000001b0 1d 3f 49 95 6e 4f d6 90 34 a7 40 a6 f0 9b dc d1 .?I.nO..4.@..... 00:20:33.894 000001c0 dd 36 ce f3 e4 7e 95 1a c8 8d b3 16 a9 77 e3 31 .6...~.......w.1 00:20:33.894 000001d0 4a 11 5f a6 93 ae 8a 5f 00 f1 38 83 ac 1e 9c 45 J._...._..8....E 00:20:33.894 000001e0 e1 84 66 fc 4e 52 c0 f6 63 2a 9e c2 a4 7f 66 66 ..f.NR..c*....ff 00:20:33.894 000001f0 09 c2 89 cf 1c 6b 2d 29 6c 9d 15 d5 85 2f 76 94 .....k-)l..../v. 00:20:33.894 00000200 54 4f 26 b1 ac cd 39 2e 03 46 d0 5e bf 3a 83 15 TO&...9..F.^.:.. 00:20:33.894 00000210 21 4c 91 cd ab 82 6b a3 16 cb ba d3 47 af 78 0a !L....k.....G.x. 00:20:33.894 00000220 80 f6 ea 90 51 e0 ee 43 98 03 26 a4 8a 25 40 54 ....Q..C..&..%@T 00:20:33.894 00000230 f1 40 49 4a 67 5f 67 02 5e c4 99 87 93 a4 cc a7 .@IJg_g.^....... 00:20:33.894 00000240 1c fe 5e 7b 5c ad e8 6f 18 bc 2a 5a 41 ec f8 f2 ..^{\..o..*ZA... 00:20:33.894 00000250 9b 73 d2 f3 94 5e e6 12 0a 23 45 dc b5 00 25 3e .s...^...#E...%> 00:20:33.894 00000260 3e 93 c7 02 6e 43 25 b0 ee ff 6d 39 63 fc ee 67 >...nC%...m9c..g 00:20:33.894 00000270 cd c5 06 ec 1f 39 33 97 06 b3 c1 1b a6 a9 2b 3c .....93.......+< 00:20:33.894 00000280 fe b6 9b d8 de d8 e7 b7 9a 58 f0 5f 42 a4 39 a9 .........X._B.9. 00:20:33.894 00000290 54 85 c9 3a 22 e1 e2 e2 fe 55 00 af 31 32 62 ea T..:"....U..12b. 00:20:33.894 000002a0 ce 03 54 e6 bb c1 ca e4 1c 18 35 3b fe fe e4 d1 ..T.......5;.... 00:20:33.894 000002b0 e9 b9 56 8c 24 7e 3e b1 bc af b0 08 e9 8d a3 5b ..V.$~>........[ 00:20:33.894 000002c0 4b a9 4c a9 53 07 9a 89 10 fb af d7 06 60 59 e3 K.L.S........`Y. 
00:20:33.894 000002d0 e0 3e a5 bd ba b2 af 0d e4 57 4b 54 33 0c da 24 .>.......WKT3..$ 00:20:33.894 000002e0 16 a7 6b 73 e6 22 09 09 43 46 87 53 8a 76 ca 34 ..ks."..CF.S.v.4 00:20:33.894 000002f0 8b 2a ad 23 80 cd 81 ca 14 d0 9b e2 33 0e d0 7f .*.#........3... 00:20:33.894 00000300 2b db d9 14 ab 71 3a cd 53 54 b0 24 f3 b1 b0 fb +....q:.ST.$.... 00:20:33.894 00000310 93 8a 12 26 b2 43 46 d5 c2 09 f3 04 90 e5 47 6a ...&.CF.......Gj 00:20:33.894 00000320 4e 3d c1 cb 68 aa 6e 3d dd 2f f7 67 4d f0 2d 68 N=..h.n=./.gM.-h 00:20:33.894 00000330 1b 0d 68 55 c2 a8 dd ff 02 c0 9d dd a3 8c 90 4c ..hU...........L 00:20:33.894 00000340 89 23 24 b6 94 5f 4e 19 3b de 3f ed de d9 58 ac .#$.._N.;.?...X. 00:20:33.894 00000350 b8 54 25 c9 da 36 16 93 98 08 a0 eb 8f 2f 23 fc .T%..6......./#. 00:20:33.894 00000360 ad f5 89 d9 fe f5 3c 29 80 23 cf 2d 24 11 38 7e ......<).#.-$.8~ 00:20:33.894 00000370 86 58 bc 11 de b6 da 56 8c a9 8b 32 94 4b 5f 83 .X.....V...2.K_. 00:20:33.894 00000380 02 7f 57 2e fa 70 fe c3 47 40 bc b4 e1 a5 e1 1d ..W..p..G@...... 00:20:33.894 00000390 29 58 3f d1 32 dc bb 2d ba 4f 6b 1e 95 e5 af f1 )X?.2..-.Ok..... 00:20:33.894 000003a0 c5 8e ce 2a 8b c7 7f 39 96 86 de 5d 03 d2 c9 30 ...*...9...]...0 00:20:33.894 000003b0 17 07 ed 99 39 ca dd 85 c6 5a 4d cc f4 89 2c a9 ....9....ZM...,. 00:20:33.894 000003c0 e1 37 04 13 39 eb 3e eb e5 7d cb 64 66 4b d5 6e .7..9.>..}.dfK.n 00:20:33.894 000003d0 d0 88 ea d3 dc ef b2 58 78 25 80 d4 7f e8 12 9d .......Xx%...... 00:20:33.894 000003e0 d4 9f c5 50 8a 8d 18 56 4d 9b 8f ab c8 c7 a7 6d ...P...VM......m 00:20:33.894 000003f0 3d 92 d4 f9 1d 8f b6 a0 3b 80 21 d5 8c 45 d9 cb =.......;.!..E.. 00:20:33.894 [2024-09-27 15:25:13.574570] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=1, dhgroup=5, seq=3428451742, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.894 [2024-09-27 15:25:13.574677] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.894 [2024-09-27 15:25:13.656984] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.894 [2024-09-27 15:25:13.657026] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.894 [2024-09-27 15:25:13.657036] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.894 [2024-09-27 15:25:13.657063] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.894 [2024-09-27 15:25:13.850768] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.894 [2024-09-27 15:25:13.850786] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:33.894 [2024-09-27 15:25:13.850794] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.894 [2024-09-27 15:25:13.850837] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.894 [2024-09-27 15:25:13.850860] nvme_auth.c: 
163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.894 ctrlr pubkey: 00:20:33.894 00000000 f5 f1 fc 86 70 58 4c b7 4e 22 2f 06 a6 ac d1 f1 ....pXL.N"/..... 00:20:33.894 00000010 5a b1 73 ec d3 d0 e6 c0 79 60 a0 78 0e 47 65 6d Z.s.....y`.x.Gem 00:20:33.894 00000020 2c 41 14 c6 ee 43 58 55 01 cc 36 ab 00 2f 7c a3 ,A...CXU..6../|. 00:20:33.894 00000030 4c 59 ad 22 af 94 4e bc 05 c9 65 39 ee f2 86 9a LY."..N...e9.... 00:20:33.894 00000040 28 e9 86 80 b2 39 8f 03 4d b4 3f 51 5f e2 c6 2e (....9..M.?Q_... 00:20:33.894 00000050 8c e7 74 fb 2e 46 41 3c c0 b4 16 94 c3 c5 59 f3 ..t..FA<......Y. 00:20:33.894 00000060 22 37 8a 45 51 23 2f 30 78 60 a0 81 c6 cf ea 9b "7.EQ#/0x`...... 00:20:33.894 00000070 56 28 55 e0 a4 25 16 0b a0 c3 b1 bb d3 a0 4b 5c V(U..%........K\ 00:20:33.894 00000080 6b 86 0e fd 43 cd c0 da 67 0f 49 6c d8 a6 fa bb k...C...g.Il.... 00:20:33.894 00000090 73 14 03 bf d3 34 36 dc b4 31 8d cc 18 24 4b d3 s....46..1...$K. 00:20:33.894 000000a0 96 e4 37 81 83 47 cc ba e0 40 80 79 f4 60 f3 b4 ..7..G...@.y.`.. 00:20:33.894 000000b0 27 5b 2e 54 f4 7f 96 d2 49 95 bd a5 01 83 7b a4 '[.T....I.....{. 00:20:33.894 000000c0 ca 64 47 4a b6 4f 19 96 a2 85 f8 1c 7c ec 1d 82 .dGJ.O......|... 00:20:33.894 000000d0 e0 76 d2 2f 08 b8 be 4d c9 c1 7c 43 e3 07 f8 2f .v./...M..|C.../ 00:20:33.894 000000e0 49 d0 33 45 f7 bd c5 d4 f0 96 8a a0 12 dc 78 d7 I.3E..........x. 00:20:33.894 000000f0 39 d8 ad 55 14 3f df 43 8b c8 b3 90 19 b1 15 49 9..U.?.C.......I 00:20:33.894 00000100 cc b3 1c 0e ee ca c2 52 93 dd 9d 51 f0 a0 45 e0 .......R...Q..E. 00:20:33.894 00000110 a3 f6 71 6a 75 a5 59 8b 72 fa a4 6d 4e 9d 14 38 ..qju.Y.r..mN..8 00:20:33.894 00000120 35 22 df 97 3b 1c 91 c1 c3 5e 64 7d 33 c1 99 19 5"..;....^d}3... 00:20:33.894 00000130 71 b2 13 4d 71 04 a5 e4 c1 19 b1 94 b2 64 ac f9 q..Mq........d.. 00:20:33.894 00000140 1f 8b a9 48 a4 6d 74 44 2e a2 08 18 aa 6d 11 92 ...H.mtD.....m.. 00:20:33.894 00000150 a3 59 bf 12 a7 f0 97 7d fd 6a a3 43 45 9a 9d 37 .Y.....}.j.CE..7 00:20:33.894 00000160 f4 37 58 b7 38 24 55 02 0a 71 7c 28 52 9a b9 67 .7X.8$U..q|(R..g 00:20:33.894 00000170 a0 35 48 b1 80 47 d3 40 be 6c 17 6d 87 b4 bf 1e .5H..G.@.l.m.... 00:20:33.894 00000180 57 8e 54 b9 32 81 99 d9 69 48 21 4b 53 8f b7 e0 W.T.2...iH!KS... 00:20:33.894 00000190 e3 d6 0b af 8a d6 d6 6b 55 c4 12 96 73 f7 80 67 .......kU...s..g 00:20:33.894 000001a0 4b 0c f3 dd 09 2e e5 e2 bc a4 ca 18 80 f0 6f 1b K.............o. 00:20:33.894 000001b0 36 d6 cb f1 b7 89 9e 47 69 5d 0c 18 82 e1 56 d3 6......Gi]....V. 00:20:33.894 000001c0 4d 34 d1 07 c0 79 80 89 6a 1c 91 b8 f8 29 05 88 M4...y..j....).. 00:20:33.894 000001d0 87 15 f0 00 cb a7 65 cc a3 05 a8 29 42 d1 67 87 ......e....)B.g. 00:20:33.894 000001e0 aa d7 3c 4c 0a 59 ab 67 a7 0c 1b 6a 8b 4f a0 bf ...7..TP4.<.G.7e. 00:20:33.894 00000200 aa 48 91 02 44 a3 dd d9 50 70 0e 01 d5 6b 70 90 .H..D...Pp...kp. 00:20:33.894 00000210 78 0f cc c3 1d a9 d6 90 2a 03 b7 42 ed 8e 6f 53 x.......*..B..oS 00:20:33.894 00000220 39 bd e3 b5 03 fc ad 13 d6 29 d4 e1 73 6e b9 1d 9........)..sn.. 00:20:33.894 00000230 94 77 b3 e3 fe 7a 71 39 62 25 61 0f 60 68 0e 60 .w...zq9b%a.`h.` 00:20:33.894 00000240 07 df 96 98 f1 4d bc ca 07 09 3b fe 17 f2 73 a0 .....M....;...s. 00:20:33.894 00000250 bd 8b 34 4a dd 66 e2 d0 2e 54 49 4e 70 ae 85 fe ..4J.f...TINp... 00:20:33.894 00000260 5f c6 1c 01 6d 1b a0 0d 0f 4f 13 b4 d7 59 ed a2 _...m....O...Y.. 00:20:33.894 00000270 43 2c 5e b4 bc 43 44 d2 6f dd 71 43 59 c2 42 14 C,^..CD.o.qCY.B. 
00:20:33.894 00000280 9f f1 e8 30 c2 20 b9 c5 c8 d4 da b9 4a a9 0a 90 ...0. ......J... 00:20:33.894 00000290 22 47 93 6c 77 47 71 0b 67 f4 b4 c7 d3 a7 b5 ed "G.lwGq.g....... 00:20:33.894 000002a0 d5 ab ec 41 2e 9b 59 8e 94 d8 e8 cb 8d c4 b8 80 ...A..Y......... 00:20:33.894 000002b0 78 25 d3 f7 45 cc 1f 17 35 d6 05 28 36 9e 54 c0 x%..E...5..(6.T. 00:20:33.894 000002c0 cf 41 e5 e5 34 fc 23 67 6d d1 93 06 b7 27 2b 09 .A..4.#gm....'+. 00:20:33.894 000002d0 c6 58 35 03 b1 c2 94 43 10 d4 7e 14 d3 36 a2 5c .X5....C..~..6.\ 00:20:33.894 000002e0 b4 dd 51 a5 56 81 f2 3e f1 e8 a8 4f 61 96 fd a8 ..Q.V..>...Oa... 00:20:33.894 000002f0 a0 db 2c 64 d4 86 75 61 89 14 b4 70 3c d2 bd af ..,d..ua...p<... 00:20:33.894 00000300 4c 3f c7 9d b7 39 39 83 ee 4a 03 0e 66 10 8b 64 L?...99..J..f..d 00:20:33.894 00000310 04 f8 1c fc c4 5d 47 93 59 18 c9 9b 4d 10 f9 cb .....]G.Y...M... 00:20:33.894 00000320 b0 fe 6e 0c 74 a2 3b d8 bf 55 71 fa 30 61 ba 7f ..n.t.;..Uq.0a.. 00:20:33.894 00000330 51 e5 36 47 43 d6 8f af a6 6a f6 a8 de 04 5e c9 Q.6GC....j....^. 00:20:33.894 00000340 0b cd 5f b3 a3 8b fa 5d b2 55 53 d2 eb 5a 78 6b .._....].US..Zxk 00:20:33.894 00000350 81 1d 8e ae bb 80 75 a3 28 26 a1 01 99 61 0e 66 ......u.(&...a.f 00:20:33.894 00000360 7b b7 d1 d0 b5 05 90 d7 03 58 12 81 86 d2 73 3e {........X....s> 00:20:33.894 00000370 a0 78 fe 87 bb 10 99 6f 3b e0 d7 7c 9b 8e 36 ea .x.....o;..|..6. 00:20:33.894 00000380 c9 dd f6 48 14 bb ba d0 76 8d c9 53 46 e1 e7 c2 ...H....v..SF... 00:20:33.894 00000390 1f db 40 43 99 8e 16 49 89 f2 62 d7 47 ca ad 2d ..@C...I..b.G..- 00:20:33.894 000003a0 ed 63 51 de fb dc fd ae 94 34 23 b2 33 64 9f 5d .cQ......4#.3d.] 00:20:33.894 000003b0 b4 ba 72 6b b1 be 7c c2 51 5f d1 71 ca 9b 48 95 ..rk..|.Q_.q..H. 00:20:33.894 000003c0 8b ef 08 88 d1 5a 30 53 17 1e bf 2a fe 9a 9b 12 .....Z0S...*.... 00:20:33.894 000003d0 30 35 0c 42 9c b0 48 2c 4a 14 87 42 11 79 b5 42 05.B..H,J..B.y.B 00:20:33.894 000003e0 a7 70 e8 e9 21 bc 1d 8f c5 88 4e ed fb 64 3c 0f .p..!.....N..d<. 00:20:33.894 000003f0 af 71 22 d4 e8 06 e1 57 e9 f2 31 b8 68 30 1b d7 .q"....W..1.h0.. 00:20:33.894 host pubkey: 00:20:33.894 00000000 68 45 8b 97 0b 44 8c 0e ba 11 36 cc 4a b2 40 b3 hE...D....6.J.@. 00:20:33.894 00000010 93 e7 ab fe 8d f8 4c ff 7f 03 d3 0c d2 0c cd 70 ......L........p 00:20:33.894 00000020 28 1f 9c a8 f2 69 9f 4b 22 0b 63 bf b7 ad ef a2 (....i.K".c..... 00:20:33.894 00000030 11 2d 59 52 38 c9 7b 18 f9 68 8f cf ef b6 37 64 .-YR8.{..h....7d 00:20:33.894 00000040 fb a7 ef 28 a0 09 5c db b6 8a 57 0b f2 c0 cb 8a ...(..\...W..... 00:20:33.894 00000050 11 d9 d2 b9 d1 a9 ca fd bc 2e 6a a8 c2 53 36 6a ..........j..S6j 00:20:33.894 00000060 41 92 90 b1 bd 56 15 10 11 ab 4b fe 17 5e 3e cc A....V....K..^>. 00:20:33.894 00000070 85 49 1d a7 72 60 dc 73 ed 24 8b d8 76 ed ad 3b .I..r`.s.$..v..; 00:20:33.894 00000080 0d eb fd fb f2 7f c8 b0 21 ba fe 3b ce c4 97 3f ........!..;...? 00:20:33.894 00000090 bd f7 fc 40 5a 65 5a d5 19 1a c5 61 de d9 dd fb ...@ZeZ....a.... 00:20:33.894 000000a0 26 a6 cd ba 28 78 84 15 dd 58 49 1e 08 9b d2 67 &...(x...XI....g 00:20:33.894 000000b0 db 8a 25 b7 dc 9f 9e 30 32 b9 18 a7 1a ff da 1d ..%....02....... 00:20:33.894 000000c0 c8 54 a0 22 11 a0 a7 17 eb ba 7e 70 3e 54 91 9f .T."......~p>T.. 00:20:33.894 000000d0 f5 81 24 52 f6 5e 4f 99 ce ae 2a 27 03 e8 d2 03 ..$R.^O...*'.... 00:20:33.894 000000e0 1e eb f3 26 25 28 b6 92 da 19 4f 08 bd c6 cd 4e ...&%(....O....N 00:20:33.894 000000f0 a7 7b fe 1f f3 ec 28 8c ef c5 17 69 5e be fb d2 .{....(....i^... 
00:20:33.894 00000100 be 29 1e 1a 7d 3e e5 a9 e4 b7 54 2f 05 3c d8 16 .)..}>....T/.<.. 00:20:33.894 00000110 69 c5 76 8b c6 e0 c5 f8 69 ab 9c a8 2c ed fe 22 i.v.....i...,.." 00:20:33.894 00000120 07 9b eb 7a 1a 73 07 2d a2 e0 93 5c 13 df ec 4f ...z.s.-...\...O 00:20:33.895 00000130 e3 57 cd 82 85 32 e0 f8 f6 46 37 fa 26 f6 3c 50 .W...2...F7.&.
#.&. 00:20:33.895 00000200 01 bd 9f fc 18 b2 74 af ef 9b 68 41 12 a8 06 27 ......t...hA...' 00:20:33.895 00000210 d4 46 3e 25 a5 c8 8f 32 87 fc 02 29 06 d7 60 9a .F>%...2...)..`. 00:20:33.895 00000220 3c 86 d7 41 5b 62 4e d5 99 97 78 7e 4c 03 c2 2c <..A[bN...x~L.., 00:20:33.895 00000230 2d 29 36 bb 2b 75 54 3e 2f 6c f1 19 4d ec c9 84 -)6.+uT>/l..M... 00:20:33.895 00000240 97 b7 a0 8d 9e d5 4b 11 e7 52 20 f0 eb c5 71 a4 ......K..R ...q. 00:20:33.895 00000250 e3 78 11 be bb ea 4c b1 ba 08 24 c9 7b 5c 9f a7 .x....L...$.{\.. 00:20:33.895 00000260 bc 03 d7 69 dd f1 63 fc 78 c9 94 4a 74 57 44 39 ...i..c.x..JtWD9 00:20:33.895 00000270 50 e1 3c b0 f2 38 7e b0 93 33 cd 13 40 8c 7f 72 P.<..8~..3..@..r 00:20:33.895 00000280 51 0f d1 1f df f4 20 e6 77 e0 12 a9 94 ed f8 11 Q..... .w....... 00:20:33.895 00000290 1f f6 41 0a bb ae a5 03 05 dd 8a 23 bc 74 45 38 ..A........#.tE8 00:20:33.895 000002a0 95 00 6d b8 94 36 8e 50 dd fc 01 84 06 fb f3 39 ..m..6.P.......9 00:20:33.895 000002b0 e2 24 5b 47 70 21 53 69 b3 bf e8 56 60 c2 72 10 .$[Gp!Si...V`.r. 00:20:33.895 000002c0 fe 5c fc cd 1d 88 3b 04 7a 37 b8 17 d2 93 07 02 .\....;.z7...... 00:20:33.895 000002d0 16 36 c5 c3 af 0c 89 37 fc f8 f4 64 67 15 5d 35 .6.....7...dg.]5 00:20:33.895 000002e0 7a 68 6d d5 83 8d 0f d7 74 ec 37 8e e7 67 09 d8 zhm.....t.7..g.. 00:20:33.895 000002f0 bb 1d d5 21 df fd 33 cd 74 cc 95 11 6b aa f6 53 ...!..3.t...k..S 00:20:33.895 00000300 76 26 4e 3f 2c 97 66 ec 57 a8 10 cf f9 10 97 da v&N?,.f.W....... 00:20:33.895 00000310 e0 b4 f9 1b b2 d1 fd ad 19 03 bc fd a9 7f d1 14 ................ 00:20:33.895 00000320 45 37 52 27 9f e4 a1 4a 71 a2 32 fb f0 09 d4 a6 E7R'...Jq.2..... 00:20:33.895 00000330 43 5d e1 4c 61 7c 27 13 67 bf c1 d8 16 a1 a9 88 C].La|'.g....... 00:20:33.895 00000340 01 20 f8 5d 79 1d 76 41 c6 86 55 31 43 e8 ab e2 . .]y.vA..U1C... 00:20:33.895 00000350 ae b2 4b 18 83 ab fe c9 19 80 e3 e9 95 5f 3d 5f ..K.........._=_ 00:20:33.895 00000360 e3 fe 9c c4 5a a2 c2 31 a9 a9 a9 e1 f1 6e 8d 4d ....Z..1.....n.M 00:20:33.895 00000370 85 28 53 3f e7 04 c9 fa 8e 3c df 6a 53 9a a5 2d .(S?.....<.jS..- 00:20:33.895 00000380 1e e5 a5 d8 2e f5 f2 dc 0d b5 ab 53 0d db 4d b0 ...........S..M. 00:20:33.895 00000390 d7 49 43 4b a7 97 d4 db ae e9 f7 51 4d 5b 6f d3 .ICK.......QM[o. 00:20:33.895 000003a0 8f 19 e9 dc e7 27 f6 2f 7c d3 45 af fc de 48 15 .....'./|.E...H. 00:20:33.895 000003b0 99 df 7b db 4e 74 c7 a8 81 b8 41 e1 8b 80 37 24 ..{.Nt....A...7$ 00:20:33.895 000003c0 74 19 78 4c cc 90 6d 56 81 1d 96 85 47 38 1a a5 t.xL..mV....G8.. 00:20:33.895 000003d0 19 b5 72 a7 5b 39 c5 d2 97 c7 a6 d0 eb 9d 86 81 ..r.[9.......... 00:20:33.895 000003e0 44 83 b6 46 e3 9c a5 19 a6 78 ce 53 46 c5 eb 64 D..F.....x.SF..d 00:20:33.895 000003f0 a1 64 98 aa 70 c8 3b fb 7f 49 7e cf 58 2c c5 e9 .d..p.;..I~.X,.. 00:20:33.895 dh secret: 00:20:33.895 00000000 26 66 9f c4 9a fb 2a 98 eb b5 a5 0e 4a e0 ff 81 &f....*.....J... 00:20:33.895 00000010 cf 82 17 1d 8c e1 1c 5a e3 e4 59 23 93 9b 37 ff .......Z..Y#..7. 00:20:33.895 00000020 d4 c0 f3 38 3f ae 05 aa 2f d1 79 85 e5 4a 83 76 ...8?.../.y..J.v 00:20:33.895 00000030 87 a6 ff 9c 2e b0 ba 96 62 89 58 63 d9 5b 0c 61 ........b.Xc.[.a 00:20:33.895 00000040 be ae 90 69 17 4c 07 9c d3 05 d1 13 32 0e e2 24 ...i.L......2..$ 00:20:33.895 00000050 73 48 52 73 25 f0 c6 0e 93 24 e8 cb e9 72 60 1b sHRs%....$...r`. 00:20:33.895 00000060 0c 93 f1 fc af 76 85 d2 36 4b 61 1f f2 1a b9 e3 .....v..6Ka..... 00:20:33.895 00000070 b9 f4 e6 8c 17 8f 18 b6 ac bf 74 07 00 78 fd b3 ..........t..x.. 
00:20:33.895 00000080 b7 61 f2 3b c6 85 40 31 18 22 c0 df 6a 24 d2 ec .a.;..@1."..j$.. 00:20:33.895 00000090 c6 9f b6 a0 ae ab 34 1f 56 14 42 30 ed fe 6a 3c ......4.V.B0..j< 00:20:33.895 000000a0 55 eb a4 af de 0f a1 1c 29 08 d2 9a 91 4b db 02 U.......)....K.. 00:20:33.895 000000b0 6b 30 31 65 ce 94 4a 76 57 42 0e f7 f9 da 48 dd k01e..JvWB....H. 00:20:33.895 000000c0 32 3c 4b f8 37 a3 b8 11 35 32 71 cc 6b 50 c5 45 2..t.U... 00:20:33.895 000000f0 8d fc c7 07 90 23 51 d4 42 db d7 cf c3 a2 ea 90 .....#Q.B....... 00:20:33.895 00000100 21 8a 37 13 62 9f a3 69 39 da 09 b4 a5 6c be 52 !.7.b..i9....l.R 00:20:33.895 00000110 8f 35 08 8c 73 e2 0f ef b5 02 24 33 9d 14 c2 c7 .5..s.....$3.... 00:20:33.895 00000120 a8 d2 26 88 ab eb 44 cf 1a 24 18 2a 75 c9 06 07 ..&...D..$.*u... 00:20:33.895 00000130 c4 ef b1 eb a8 1f 42 ec bb 03 fe e7 da 7d 26 fc ......B......}&. 00:20:33.895 00000140 1a 3c cd 71 f8 78 18 d7 da b2 c4 e9 1a 36 56 89 .<.q.x.......6V. 00:20:33.895 00000150 32 fe cd 93 42 b3 63 66 2d 32 77 a2 b4 2c 62 78 2...B.cf-2w..,bx 00:20:33.895 00000160 8b 3f 80 ae 3c 52 aa 9d f2 09 95 2c 33 f9 c1 30 .?.... 00:20:33.895 00000180 e3 0d 8a 65 02 ab 54 96 b5 ea ac 5c 56 60 99 85 ...e..T....\V`.. 00:20:33.895 00000190 c1 ac ec 1e 00 28 d1 c7 12 fa c9 bd e7 6b e4 2a .....(.......k.* 00:20:33.895 000001a0 99 51 21 06 0b a7 fc a6 d6 0f 4d 11 86 f4 e3 3a .Q!.......M....: 00:20:33.895 000001b0 ea 10 d3 1a 63 4f 97 6b f0 33 03 e5 0a da 09 b7 ....cO.k.3...... 00:20:33.895 000001c0 ab c1 6c 18 d8 f6 11 f9 3b ff 21 a5 76 83 f9 3f ..l.....;.!.v..? 00:20:33.895 000001d0 6e f1 7a d2 d5 27 60 93 88 9f dc 7d d6 4d 4e 74 n.z..'`....}.MNt 00:20:33.895 000001e0 9a 26 37 14 ae 48 72 3f 4d 37 b8 e3 e7 37 c3 99 .&7..Hr?M7...7.. 00:20:33.895 000001f0 44 f3 44 ec 9c 3d 89 a3 89 67 5d 63 41 6a b6 24 D.D..=...g]cAj.$ 00:20:33.895 00000200 67 e0 2b 78 e5 8a 1b 6f 56 be 4a 7e b4 89 99 35 g.+x...oV.J~...5 00:20:33.895 00000210 c9 10 6a 09 63 b8 cf 7b e9 df d9 61 ed 38 a0 7a ..j.c..{...a.8.z 00:20:33.897 00000220 be c1 cb ca 3c 19 da bd 78 93 69 a4 87 51 d8 fb ....<...x.i..Q.. 00:20:33.897 00000230 5e c4 b1 e1 07 1b 46 f6 34 8e bf 7b 06 8e d7 10 ^.....F.4..{.... 00:20:33.897 00000240 b7 74 f8 1c 4c 51 5e 05 43 80 39 71 9f a7 07 fb .t..LQ^.C.9q.... 00:20:33.897 00000250 2d 9d cb 23 73 69 2a 43 a4 76 01 dc 36 85 27 fe -..#si*C.v..6.'. 00:20:33.897 00000260 cf 8e dd b8 ec 33 c1 25 b4 eb fd 8d c6 de 2f d5 .....3.%....../. 00:20:33.897 00000270 4d b6 4d 92 fc 63 23 ac 9b 69 7a 05 0e 99 b4 c9 M.M..c#..iz..... 00:20:33.897 00000280 02 20 9d 21 e7 c7 09 13 51 f3 ed 57 a2 92 2f b1 . .!....Q..W../. 00:20:33.897 00000290 bc 65 f8 d8 44 46 38 1f 43 bf c0 a3 2e 3c 84 80 .e..DF8.C....<.. 00:20:33.897 000002a0 71 5b 4a db 1a 55 3e 54 44 d2 6d 30 bc 53 0f 18 q[J..U>TD.m0.S.. 00:20:33.897 000002b0 69 ca ce 12 78 b8 75 8c f7 04 67 04 fd f3 da 00 i...x.u...g..... 00:20:33.897 000002c0 39 ad 00 d8 7e 05 33 86 7e 63 72 a2 0c 7b 04 b3 9...~.3.~cr..{.. 00:20:33.897 000002d0 d8 79 29 ff 3b 4d b9 99 12 f5 00 48 17 3c bc f4 .y).;M.....H.<.. 00:20:33.897 000002e0 1c d0 14 78 23 43 ea 65 ad 40 a5 ce e0 21 7d 4a ...x#C.e.@...!}J 00:20:33.897 000002f0 14 06 87 d1 f2 ab 82 f8 9d 6f 85 8d c8 43 95 d1 .........o...C.. 00:20:33.897 00000300 79 87 ad ca 80 02 79 cf 80 80 fe e2 d8 7e 67 ae y.....y......~g. 00:20:33.897 00000310 72 95 ae 67 85 47 1b d5 13 ea d9 97 81 92 0f 3b r..g.G.........; 00:20:33.897 00000320 af f0 29 11 07 78 b7 3b d7 2a c1 de 74 3d 88 c2 ..)..x.;.*..t=.. 00:20:33.897 00000330 3b e2 51 d9 1f 5e b8 30 42 63 9d c6 42 da 44 b8 ;.Q..^.0Bc..B.D. 
00:20:33.897 00000340 76 83 1b f8 64 ca 0a ff 1a b4 9f 56 fc 77 af 5b v...d......V.w.[ 00:20:33.897 00000350 23 4c 89 5c c8 09 ca 1b 4f cc 87 c1 85 c6 9b b2 #L.\....O....... 00:20:33.897 00000360 d4 76 5e ca d0 06 d4 8c 9d d5 25 cd 0c 71 27 e3 .v^.......%..q'. 00:20:33.897 00000370 4e c1 2e 2e 14 f3 63 f9 a7 95 b9 c6 dd 81 22 a8 N.....c.......". 00:20:33.897 00000380 f2 3e 71 fc 89 f9 18 eb 21 01 52 f4 ba d7 e6 8e .>q.....!.R..... 00:20:33.897 00000390 13 a6 de 97 eb f8 ff 9d 6c 09 0b 94 c8 8d 39 bb ........l.....9. 00:20:33.897 000003a0 2b da c8 97 63 d0 62 64 30 86 17 30 72 24 19 5b +...c.bd0..0r$.[ 00:20:33.897 000003b0 89 87 48 11 7a bc 48 02 64 d7 95 26 5d 24 fa 85 ..H.z.H.d..&]$.. 00:20:33.897 000003c0 4b 9e 01 87 e2 1d 80 ce 14 28 78 d4 b9 74 cc ba K........(x..t.. 00:20:33.897 000003d0 ed 9e a1 f5 16 fc d2 10 99 6f 32 87 9d d6 0f 3a .........o2....: 00:20:33.897 000003e0 e7 44 c8 6d a7 38 ec 6a 17 2c 8f 9d 45 88 dd c5 .D.m.8.j.,..E... 00:20:33.897 000003f0 bc 87 78 12 ea d1 e5 07 3d 3f 3f d0 4f cf 67 21 ..x.....=??.O.g! 00:20:33.897 [2024-09-27 15:25:13.962315] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=1, dhgroup=5, seq=3428451743, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.897 [2024-09-27 15:25:14.020728] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.897 [2024-09-27 15:25:14.020760] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.898 [2024-09-27 15:25:14.020776] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.898 [2024-09-27 15:25:14.020782] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.898 [2024-09-27 15:25:14.127205] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.898 [2024-09-27 15:25:14.127222] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 1 (sha256) 00:20:33.898 [2024-09-27 15:25:14.127230] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.898 [2024-09-27 15:25:14.127239] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.898 [2024-09-27 15:25:14.127293] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.898 ctrlr pubkey: 00:20:33.898 00000000 f5 f1 fc 86 70 58 4c b7 4e 22 2f 06 a6 ac d1 f1 ....pXL.N"/..... 00:20:33.898 00000010 5a b1 73 ec d3 d0 e6 c0 79 60 a0 78 0e 47 65 6d Z.s.....y`.x.Gem 00:20:33.898 00000020 2c 41 14 c6 ee 43 58 55 01 cc 36 ab 00 2f 7c a3 ,A...CXU..6../|. 00:20:33.898 00000030 4c 59 ad 22 af 94 4e bc 05 c9 65 39 ee f2 86 9a LY."..N...e9.... 00:20:33.898 00000040 28 e9 86 80 b2 39 8f 03 4d b4 3f 51 5f e2 c6 2e (....9..M.?Q_... 00:20:33.898 00000050 8c e7 74 fb 2e 46 41 3c c0 b4 16 94 c3 c5 59 f3 ..t..FA<......Y. 00:20:33.898 00000060 22 37 8a 45 51 23 2f 30 78 60 a0 81 c6 cf ea 9b "7.EQ#/0x`...... 
00:20:33.898 00000070 56 28 55 e0 a4 25 16 0b a0 c3 b1 bb d3 a0 4b 5c V(U..%........K\ 00:20:33.898 00000080 6b 86 0e fd 43 cd c0 da 67 0f 49 6c d8 a6 fa bb k...C...g.Il.... 00:20:33.898 00000090 73 14 03 bf d3 34 36 dc b4 31 8d cc 18 24 4b d3 s....46..1...$K. 00:20:33.898 000000a0 96 e4 37 81 83 47 cc ba e0 40 80 79 f4 60 f3 b4 ..7..G...@.y.`.. 00:20:33.898 000000b0 27 5b 2e 54 f4 7f 96 d2 49 95 bd a5 01 83 7b a4 '[.T....I.....{. 00:20:33.898 000000c0 ca 64 47 4a b6 4f 19 96 a2 85 f8 1c 7c ec 1d 82 .dGJ.O......|... 00:20:33.898 000000d0 e0 76 d2 2f 08 b8 be 4d c9 c1 7c 43 e3 07 f8 2f .v./...M..|C.../ 00:20:33.898 000000e0 49 d0 33 45 f7 bd c5 d4 f0 96 8a a0 12 dc 78 d7 I.3E..........x. 00:20:33.898 000000f0 39 d8 ad 55 14 3f df 43 8b c8 b3 90 19 b1 15 49 9..U.?.C.......I 00:20:33.898 00000100 cc b3 1c 0e ee ca c2 52 93 dd 9d 51 f0 a0 45 e0 .......R...Q..E. 00:20:33.898 00000110 a3 f6 71 6a 75 a5 59 8b 72 fa a4 6d 4e 9d 14 38 ..qju.Y.r..mN..8 00:20:33.898 00000120 35 22 df 97 3b 1c 91 c1 c3 5e 64 7d 33 c1 99 19 5"..;....^d}3... 00:20:33.898 00000130 71 b2 13 4d 71 04 a5 e4 c1 19 b1 94 b2 64 ac f9 q..Mq........d.. 00:20:33.898 00000140 1f 8b a9 48 a4 6d 74 44 2e a2 08 18 aa 6d 11 92 ...H.mtD.....m.. 00:20:33.898 00000150 a3 59 bf 12 a7 f0 97 7d fd 6a a3 43 45 9a 9d 37 .Y.....}.j.CE..7 00:20:33.898 00000160 f4 37 58 b7 38 24 55 02 0a 71 7c 28 52 9a b9 67 .7X.8$U..q|(R..g 00:20:33.898 00000170 a0 35 48 b1 80 47 d3 40 be 6c 17 6d 87 b4 bf 1e .5H..G.@.l.m.... 00:20:33.898 00000180 57 8e 54 b9 32 81 99 d9 69 48 21 4b 53 8f b7 e0 W.T.2...iH!KS... 00:20:33.898 00000190 e3 d6 0b af 8a d6 d6 6b 55 c4 12 96 73 f7 80 67 .......kU...s..g 00:20:33.898 000001a0 4b 0c f3 dd 09 2e e5 e2 bc a4 ca 18 80 f0 6f 1b K.............o. 00:20:33.898 000001b0 36 d6 cb f1 b7 89 9e 47 69 5d 0c 18 82 e1 56 d3 6......Gi]....V. 00:20:33.898 000001c0 4d 34 d1 07 c0 79 80 89 6a 1c 91 b8 f8 29 05 88 M4...y..j....).. 00:20:33.898 000001d0 87 15 f0 00 cb a7 65 cc a3 05 a8 29 42 d1 67 87 ......e....)B.g. 00:20:33.898 000001e0 aa d7 3c 4c 0a 59 ab 67 a7 0c 1b 6a 8b 4f a0 bf ...7..TP4.<.G.7e. 00:20:33.898 00000200 aa 48 91 02 44 a3 dd d9 50 70 0e 01 d5 6b 70 90 .H..D...Pp...kp. 00:20:33.898 00000210 78 0f cc c3 1d a9 d6 90 2a 03 b7 42 ed 8e 6f 53 x.......*..B..oS 00:20:33.898 00000220 39 bd e3 b5 03 fc ad 13 d6 29 d4 e1 73 6e b9 1d 9........)..sn.. 00:20:33.898 00000230 94 77 b3 e3 fe 7a 71 39 62 25 61 0f 60 68 0e 60 .w...zq9b%a.`h.` 00:20:33.898 00000240 07 df 96 98 f1 4d bc ca 07 09 3b fe 17 f2 73 a0 .....M....;...s. 00:20:33.898 00000250 bd 8b 34 4a dd 66 e2 d0 2e 54 49 4e 70 ae 85 fe ..4J.f...TINp... 00:20:33.898 00000260 5f c6 1c 01 6d 1b a0 0d 0f 4f 13 b4 d7 59 ed a2 _...m....O...Y.. 00:20:33.898 00000270 43 2c 5e b4 bc 43 44 d2 6f dd 71 43 59 c2 42 14 C,^..CD.o.qCY.B. 00:20:33.898 00000280 9f f1 e8 30 c2 20 b9 c5 c8 d4 da b9 4a a9 0a 90 ...0. ......J... 00:20:33.898 00000290 22 47 93 6c 77 47 71 0b 67 f4 b4 c7 d3 a7 b5 ed "G.lwGq.g....... 00:20:33.898 000002a0 d5 ab ec 41 2e 9b 59 8e 94 d8 e8 cb 8d c4 b8 80 ...A..Y......... 00:20:33.898 000002b0 78 25 d3 f7 45 cc 1f 17 35 d6 05 28 36 9e 54 c0 x%..E...5..(6.T. 00:20:33.898 000002c0 cf 41 e5 e5 34 fc 23 67 6d d1 93 06 b7 27 2b 09 .A..4.#gm....'+. 00:20:33.898 000002d0 c6 58 35 03 b1 c2 94 43 10 d4 7e 14 d3 36 a2 5c .X5....C..~..6.\ 00:20:33.898 000002e0 b4 dd 51 a5 56 81 f2 3e f1 e8 a8 4f 61 96 fd a8 ..Q.V..>...Oa... 00:20:33.898 000002f0 a0 db 2c 64 d4 86 75 61 89 14 b4 70 3c d2 bd af ..,d..ua...p<... 
00:20:33.898 00000300 4c 3f c7 9d b7 39 39 83 ee 4a 03 0e 66 10 8b 64 L?...99..J..f..d 00:20:33.898 00000310 04 f8 1c fc c4 5d 47 93 59 18 c9 9b 4d 10 f9 cb .....]G.Y...M... 00:20:33.898 00000320 b0 fe 6e 0c 74 a2 3b d8 bf 55 71 fa 30 61 ba 7f ..n.t.;..Uq.0a.. 00:20:33.898 00000330 51 e5 36 47 43 d6 8f af a6 6a f6 a8 de 04 5e c9 Q.6GC....j....^. 00:20:33.898 00000340 0b cd 5f b3 a3 8b fa 5d b2 55 53 d2 eb 5a 78 6b .._....].US..Zxk 00:20:33.898 00000350 81 1d 8e ae bb 80 75 a3 28 26 a1 01 99 61 0e 66 ......u.(&...a.f 00:20:33.898 00000360 7b b7 d1 d0 b5 05 90 d7 03 58 12 81 86 d2 73 3e {........X....s> 00:20:33.898 00000370 a0 78 fe 87 bb 10 99 6f 3b e0 d7 7c 9b 8e 36 ea .x.....o;..|..6. 00:20:33.898 00000380 c9 dd f6 48 14 bb ba d0 76 8d c9 53 46 e1 e7 c2 ...H....v..SF... 00:20:33.898 00000390 1f db 40 43 99 8e 16 49 89 f2 62 d7 47 ca ad 2d ..@C...I..b.G..- 00:20:33.898 000003a0 ed 63 51 de fb dc fd ae 94 34 23 b2 33 64 9f 5d .cQ......4#.3d.] 00:20:33.898 000003b0 b4 ba 72 6b b1 be 7c c2 51 5f d1 71 ca 9b 48 95 ..rk..|.Q_.q..H. 00:20:33.898 000003c0 8b ef 08 88 d1 5a 30 53 17 1e bf 2a fe 9a 9b 12 .....Z0S...*.... 00:20:33.898 000003d0 30 35 0c 42 9c b0 48 2c 4a 14 87 42 11 79 b5 42 05.B..H,J..B.y.B 00:20:33.898 000003e0 a7 70 e8 e9 21 bc 1d 8f c5 88 4e ed fb 64 3c 0f .p..!.....N..d<. 00:20:33.898 000003f0 af 71 22 d4 e8 06 e1 57 e9 f2 31 b8 68 30 1b d7 .q"....W..1.h0.. 00:20:33.898 host pubkey: 00:20:33.898 00000000 05 45 6c 67 58 89 b5 c1 79 1a 9e 74 90 9d 2d 76 .ElgX...y..t..-v 00:20:33.898 00000010 dd e8 dd 2e 6c 13 b8 f0 aa ff b6 c0 95 e8 3b 51 ....l.........;Q 00:20:33.898 00000020 3f a4 bc 8c 56 75 ee 8d 60 ee 6e 8d 7e 3f 02 03 ?...Vu..`.n.~?.. 00:20:33.898 00000030 c9 a0 5d 2f 48 2b ec a7 31 1d 5a e3 94 e2 d2 71 ..]/H+..1.Z....q 00:20:33.898 00000040 89 70 df 1b 3d fd 78 fb c1 76 df 36 c4 d1 31 a1 .p..=.x..v.6..1. 00:20:33.898 00000050 98 be 61 28 63 39 fa 51 34 15 de bb b4 83 d4 37 ..a(c9.Q4......7 00:20:33.898 00000060 da 6e 8b 22 9d fa 69 61 f6 ef 2a a1 9c 18 09 e6 .n."..ia..*..... 00:20:33.898 00000070 42 32 d6 fe d5 81 95 68 12 4a 67 bc b2 d2 4f 0a B2.....h.Jg...O. 00:20:33.898 00000080 7d fe 47 b7 29 d1 f8 00 d7 8f 46 ca be af 7b e2 }.G.).....F...{. 00:20:33.898 00000090 4b d7 03 91 28 45 f3 5e 6a 77 93 ae 9d 81 59 3a K...(E.^jw....Y: 00:20:33.898 000000a0 ff fe bb c0 0e 50 d0 5b a6 0d c4 22 02 32 64 33 .....P.[...".2d3 00:20:33.898 000000b0 01 82 e4 8e c8 ec c7 66 da 0b 76 f9 87 dc 17 34 .......f..v....4 00:20:33.898 000000c0 55 9a 29 65 95 ae f8 fb ae de b3 2b 0d 1e 68 81 U.)e.......+..h. 00:20:33.898 000000d0 e0 26 e0 ae 90 41 20 47 96 f2 e2 0e 94 53 de 23 .&...A G.....S.# 00:20:33.898 000000e0 05 e9 3f 63 24 a5 d2 61 c9 dc 17 58 d3 2e d2 90 ..?c$..a...X.... 00:20:33.898 000000f0 48 24 d2 90 ff 07 40 be 72 e4 81 61 7b 90 da 50 H$....@.r..a{..P 00:20:33.898 00000100 96 a0 c1 2d de 47 57 37 bc f2 ec 49 64 f0 b9 df ...-.GW7...Id... 00:20:33.898 00000110 11 0a 2d 07 c0 25 2a 2e d7 7e bb 9d 7e da c3 75 ..-..%*..~..~..u 00:20:33.898 00000120 1f c6 7e a6 15 e4 7a e0 74 f8 4c 80 dc 33 4e 48 ..~...z.t.L..3NH 00:20:33.898 00000130 2f 04 97 77 2d 9b db da dc 79 22 6e 17 4d cf d6 /..w-....y"n.M.. 00:20:33.898 00000140 b6 fb 10 8f 9a 1a 45 8b ed 0c 56 19 e3 9e 91 55 ......E...V....U 00:20:33.898 00000150 95 2c ec 83 43 19 bd 29 9e 6a f2 47 27 10 5e c5 .,..C..).j.G'.^. 00:20:33.898 00000160 30 ef 75 a5 12 b3 ea e1 c7 e3 de c0 68 86 57 ce 0.u.........h.W. 
00:20:33.898 00000170 9f a3 f2 2b 0d 59 2d 47 77 71 47 13 8f db fb 71 ...+.Y-GwqG....q 00:20:33.898 00000180 73 9f 80 78 92 5c da cd 2e a8 0f 0f 34 d6 7d 15 s..x.\......4.}. 00:20:33.898 00000190 d3 af b0 63 56 ad 09 13 d5 26 69 2b 22 e1 28 64 ...cV....&i+".(d 00:20:33.898 000001a0 df 47 bb fd 45 76 8e ca 28 dd d4 08 ec d4 33 a0 .G..Ev..(.....3. 00:20:33.898 000001b0 47 bd d2 53 05 ea 49 37 0b 54 21 73 16 36 53 72 G..S..I7.T!s.6Sr 00:20:33.898 000001c0 9e 28 8e 9e 6e 31 17 ca e5 fc c4 ea 5a 23 2f 0f .(..n1......Z#/. 00:20:33.898 000001d0 4f 93 40 cf d5 fa 84 a6 cb f8 b9 bb 1a 8b d4 0c O.@............. 00:20:33.898 000001e0 a7 8a 28 e1 c4 e5 b5 48 d1 cc d0 46 ac 10 58 17 ..(....H...F..X. 00:20:33.898 000001f0 ea ea b0 d4 46 52 88 0d ea 6d 3f 7d 40 e5 f2 a4 ....FR...m?}@... 00:20:33.898 00000200 5c 8a 48 26 3c ac e0 9c d0 5e b6 5c 34 a1 c3 c5 \.H&<....^.\4... 00:20:33.898 00000210 2a ac e5 f5 5c ee 1d 8c ff d6 44 1a 28 21 99 58 *...\.....D.(!.X 00:20:33.898 00000220 cb 6f ee 28 8f 59 b6 a1 d3 be 7a 36 ed ae 5e 24 .o.(.Y....z6..^$ 00:20:33.898 00000230 c7 f4 19 09 fd c7 d0 ac 55 1d eb d9 4b ec 98 94 ........U...K... 00:20:33.898 00000240 3b 7c b5 84 23 40 86 87 b8 fa 30 a6 f1 b0 bc 98 ;|..#@....0..... 00:20:33.898 00000250 85 f5 63 7f f6 27 0b d5 d5 01 b7 a8 55 3a 06 97 ..c..'......U:.. 00:20:33.898 00000260 c2 6d e1 b8 13 9a a3 9c 4c 11 d4 8b d8 96 0f 62 .m......L......b 00:20:33.898 00000270 5a f3 f5 4d 4b 83 07 d6 c3 ea 0c b3 31 c6 19 70 Z..MK.......1..p 00:20:33.898 00000280 46 7a 4e 00 4b a8 d6 91 63 14 5a 52 dd 14 a1 0a FzN.K...c.ZR.... 00:20:33.898 00000290 c4 c2 c6 f2 50 39 91 bb 50 fd ff b0 e2 e3 94 b7 ....P9..P....... 00:20:33.898 000002a0 26 54 c8 ce 77 4b 96 97 f2 3b 6b 19 cf 10 68 cc &T..wK...;k...h. 00:20:33.898 000002b0 52 14 87 10 6b 31 c2 34 28 5f f9 27 7d 7c 94 a1 R...k1.4(_.'}|.. 00:20:33.898 000002c0 da 86 3b a6 ef e6 dc 1a c4 fa 94 4a 5d d7 f3 af ..;........J]... 00:20:33.898 000002d0 cd 25 f9 5c 93 47 c8 24 52 f6 7f 0a 7a 5b 3c c9 .%.\.G.$R...z[<. 00:20:33.898 000002e0 11 2a b2 03 8a 1f e7 3b 9e e0 7e f2 1b 02 76 d6 .*.....;..~...v. 00:20:33.898 000002f0 15 83 0e a9 a4 2a 92 41 2d 69 e8 64 83 be 3e c4 .....*.A-i.d..>. 00:20:33.898 00000300 d6 3b b9 bd 54 46 b1 e5 bd 57 9e ac 35 af 9c 14 .;..TF...W..5... 00:20:33.898 00000310 9e 4d 47 38 1f 18 54 9d 24 62 dc 54 83 16 ad 88 .MG8..T.$b.T.... 00:20:33.898 00000320 b0 c3 ac 65 64 57 6d 82 81 b8 3c 97 7e d7 4a 1b ...edWm...<.~.J. 00:20:33.898 00000330 d7 09 89 4f 93 19 7b 1b 18 6a 1a a7 ab 0e 55 a7 ...O..{..j....U. 00:20:33.898 00000340 da 08 3a 63 25 25 1a 35 59 83 2a 3a aa 97 0b b8 ..:c%%.5Y.*:.... 00:20:33.898 00000350 8d 28 ed 34 5a d8 e8 16 c8 4f 45 46 17 6f c0 99 .(.4Z....OEF.o.. 00:20:33.899 00000360 3f 3c a0 80 1a 38 04 b3 43 54 7c 2c 52 52 f6 24 ?<...8..CT|,RR.$ 00:20:33.899 00000370 50 c0 e5 25 f3 cd 5f 57 7d 1a a9 4f 93 19 a7 0d P..%.._W}..O.... 00:20:33.899 00000380 bb 5b 4c 38 f8 21 f1 e7 f6 d3 71 bf 58 49 66 2d .[L8.!....q.XIf- 00:20:33.899 00000390 91 97 93 e2 4a f1 fb 57 5e 0d e6 a9 d9 85 30 82 ....J..W^.....0. 00:20:33.899 000003a0 62 ae 08 ed ce 30 eb 03 7e 46 c1 98 be b6 17 b0 b....0..~F...... 00:20:33.899 000003b0 1f a7 a6 dc c0 ce 68 1f ce 7b aa 4e 27 8f f3 fa ......h..{.N'... 00:20:33.899 000003c0 0e 4e 30 53 c2 a6 5b 69 2b be e8 87 93 7d f6 03 .N0S..[i+....}.. 
00:20:33.899 000003d0 5e e7 0f a3 5f 45 a9 49 bc f4 c4 84 d1 88 9a 3c ^..._E.I.......< 00:20:33.899 000003e0 30 c4 aa a3 00 c4 b0 79 c1 63 58 3b 0b 81 77 45 0......y.cX;..wE 00:20:33.899 000003f0 7e 08 0e 79 bf 82 9e d4 0f 77 87 b2 c1 fb 40 04 ~..y.....w....@. 00:20:33.899 dh secret: 00:20:33.899 00000000 ab b0 c9 fa a4 a4 4a 2c f7 75 55 c3 88 9c af d0 ......J,.uU..... 00:20:33.899 00000010 c7 04 e5 43 96 46 c5 9d d7 75 a4 72 c2 cb 88 f2 ...C.F...u.r.... 00:20:33.899 00000020 53 42 cc e1 1a 1c b6 75 58 aa 45 e3 d7 07 c1 50 SB.....uX.E....P 00:20:33.899 00000030 e2 32 45 17 b4 57 14 29 20 06 09 ac f7 18 5b c5 .2E..W.) .....[. 00:20:33.899 00000040 3d 41 14 79 6f 50 0a 76 df 0e b7 c3 e9 75 f8 da =A.yoP.v.....u.. 00:20:33.899 00000050 08 96 20 80 64 17 6a 34 2c 8f 04 05 ba 5a 52 bd .. .d.j4,....ZR. 00:20:33.899 00000060 0f 27 80 b2 c4 b5 3f 35 9a 00 1f de 17 2b db e2 .'....?5.....+.. 00:20:33.899 00000070 f3 54 c3 1c 2d fb 8c 57 66 13 67 ba de 4e 11 f2 .T..-..Wf.g..N.. 00:20:33.899 00000080 46 1d c9 10 9b 53 ef 03 a6 fc 2a 18 7f eb 81 c8 F....S....*..... 00:20:33.899 00000090 b6 c7 9a 8d 1c 72 5b 0a 8c a8 a5 70 90 d2 61 93 .....r[....p..a. 00:20:33.899 000000a0 2d 87 ef 41 ed 4c 64 f7 da e6 b9 d9 50 14 05 13 -..A.Ld.....P... 00:20:33.899 000000b0 14 f1 fe d6 9a 1b 14 08 0c 5c ca 93 6b f9 e8 dd .........\..k... 00:20:33.899 000000c0 41 6c c8 7d 96 2d 89 78 a8 95 c8 c7 cf aa a2 73 Al.}.-.x.......s 00:20:33.899 000000d0 d6 a9 60 b6 05 33 a0 8d d2 50 3a 17 e0 8e cd d8 ..`..3...P:..... 00:20:33.899 000000e0 91 c2 25 8a f9 b8 ad dd c5 e7 7a e5 d8 9c fa 33 ..%.......z....3 00:20:33.899 000000f0 7f e7 3c 1b cb c7 b4 e5 a0 1e 7e 3b 8f 7b b7 eb ..<.......~;.{.. 00:20:33.899 00000100 62 24 32 06 e1 95 4c c0 d0 b5 fe 25 93 5d 62 f7 b$2...L....%.]b. 00:20:33.899 00000110 41 d8 9a 56 00 ab ac 91 6a 79 9b 50 e4 be 43 15 A..V....jy.P..C. 00:20:33.899 00000120 68 c9 7a d2 d6 5a 79 4d 97 74 07 46 cd c8 b5 e8 h.z..ZyM.t.F.... 00:20:33.899 00000130 87 fd 28 db 9f 4e 70 e5 95 e0 bb 4c 86 c8 a9 16 ..(..Np....L.... 00:20:33.899 00000140 07 92 2f 7f 3b 25 7d e0 86 f7 f4 a1 63 25 bc 2b ../.;%}.....c%.+ 00:20:33.899 00000150 15 20 39 ef d6 67 02 45 28 4d 7d 62 36 c6 2e 4a . 9..g.E(M}b6..J 00:20:33.899 00000160 12 12 53 71 fb 9c 1c d4 df d2 be fa 8d 88 89 bd ..Sq............ 00:20:33.899 00000170 c5 1a f7 af 60 81 5e c9 31 e9 d3 8b c6 4d 35 b8 ....`.^.1....M5. 00:20:33.899 00000180 a9 fa 70 aa 12 ae 71 a8 3e 0f 8a 8d 9b 9f e5 ab ..p...q.>....... 00:20:33.899 00000190 a8 f4 d3 40 3a c9 32 a7 3f 06 8b 5e a9 17 89 26 ...@:.2.?..^...& 00:20:33.899 000001a0 8f 6c 99 6f 96 2f 95 f9 63 4e c6 7c 37 20 4a 42 .l.o./..cN.|7 JB 00:20:33.899 000001b0 1b 44 68 28 da a0 99 3d a7 62 29 00 f7 ba 77 d8 .Dh(...=.b)...w. 00:20:33.899 000001c0 6d 78 f8 3e 15 07 1a 76 b2 f3 e7 af 1e 88 3d 2b mx.>...v......=+ 00:20:33.899 000001d0 e7 f8 8f fc a1 3f 93 30 0a 62 82 34 c7 82 31 e9 .....?.0.b.4..1. 00:20:33.899 000001e0 b9 f6 04 ae 59 79 d6 c9 6a ab 9a 9e 5a 6a 6f 0f ....Yy..j...Zjo. 00:20:33.899 000001f0 41 78 28 f6 5a b2 67 c3 b6 4e a2 9e da 85 e2 5c Ax(.Z.g..N.....\ 00:20:33.899 00000200 97 b6 f6 4f cc 4e b3 8d 9e 19 63 68 b7 d6 02 39 ...O.N....ch...9 00:20:33.899 00000210 94 03 43 93 0b 0c c3 2f 3a 02 9a 5c 38 e3 61 00 ..C..../:..\8.a. 00:20:33.899 00000220 02 30 ef 61 75 62 ac 95 e4 18 53 7b f2 99 cb 54 .0.aub....S{...T 00:20:33.899 00000230 a2 f0 19 6a ae 57 2d 0d 78 2c e1 e4 ca 46 2a a3 ...j.W-.x,...F*. 
00:20:33.899 00000240 70 9e 74 ae 54 73 16 97 a8 03 57 27 e7 2a 61 5b p.t.Ts....W'.*a[ 00:20:33.899 00000250 ee ce 6d 02 fd ce 87 aa 6a f4 29 4c fd bb 29 43 ..m.....j.)L..)C 00:20:33.899 00000260 66 5e 91 7f f6 9f 59 3b 95 39 15 e5 c2 4a 35 2a f^....Y;.9...J5* 00:20:33.899 00000270 b9 64 5b 35 f4 e6 8a 13 1f 85 37 c2 6d 6d a6 18 .d[5......7.mm.. 00:20:33.899 00000280 6e 4b 98 ff 26 de 45 f9 c9 f2 fc 5a 74 99 78 1b nK..&.E....Zt.x. 00:20:33.899 00000290 76 d0 1f 27 58 cc 10 de ac 5d c5 dd e7 45 94 19 v..'X....]...E.. 00:20:33.899 000002a0 04 af eb 58 c1 3c 10 7c 3f d4 31 91 cf 27 b0 0a ...X.<.|?.1..'.. 00:20:33.899 000002b0 9e 63 58 bc ab c6 89 35 92 06 1b f7 3c 09 a1 df .cX....5....<... 00:20:33.899 000002c0 90 d4 11 79 19 9a 1e 74 60 af f7 95 86 b7 1f ac ...y...t`....... 00:20:33.899 000002d0 b1 82 17 9f c8 89 f5 b9 c4 46 1d 2f 1a e4 2b e0 .........F./..+. 00:20:33.899 000002e0 b1 6a 13 13 10 33 ed 6f 3f 0b 2c cf a3 4b cf 4f .j...3.o?.,..K.O 00:20:33.899 000002f0 7f 33 e8 a9 bf 49 aa e2 2e 49 de 96 e5 09 09 1f .3...I...I...... 00:20:33.899 00000300 e4 e9 7b 6b 0a 12 d9 f1 89 69 d8 28 5c b0 23 34 ..{k.....i.(\.#4 00:20:33.899 00000310 46 d4 02 aa 6e f1 59 b9 ad 28 ca 10 4c 98 f0 5a F...n.Y..(..L..Z 00:20:33.899 00000320 e4 30 04 2e 58 fa 7a d4 9a 0b 08 e2 5b 4c 18 7f .0..X.z.....[L.. 00:20:33.899 00000330 46 18 f1 c9 44 c5 4e 1f b6 fe 41 df df 67 df 1b F...D.N...A..g.. 00:20:33.899 00000340 d5 2d a2 8b cb 75 3e 4c 88 23 e5 12 f3 e2 47 36 .-...u>L.#....G6 00:20:33.899 00000350 f0 44 41 95 67 d1 94 b1 6d c0 a3 63 f1 d9 a7 5b .DA.g...m..c...[ 00:20:33.899 00000360 e9 bb 1a 14 95 b2 d0 b2 30 87 26 5c 8b 87 c1 aa ........0.&\.... 00:20:33.899 00000370 b8 95 17 6f e4 12 22 39 f4 48 73 e1 6c 11 46 92 ...o.."9.Hs.l.F. 00:20:33.899 00000380 9c 98 74 7d 87 51 e8 66 bf 23 2f c1 79 8a 78 76 ..t}.Q.f.#/.y.xv 00:20:33.899 00000390 da 7c df bb 3d e2 23 2d 9b 80 b9 bb ef 76 cb 99 .|..=.#-.....v.. 00:20:33.899 000003a0 5d b7 c3 c0 62 7a a1 3d a8 49 a3 9c e0 7d c8 4b ]...bz.=.I...}.K 00:20:33.899 000003b0 84 d0 41 e0 dc 4b 00 47 73 72 a5 b3 82 d6 24 25 ..A..K.Gsr....$% 00:20:33.899 000003c0 2b b2 64 0c c3 d8 af db ae 42 35 a0 26 a9 06 e2 +.d......B5.&... 00:20:33.899 000003d0 b8 a5 35 92 a0 b1 5a f4 e2 af d8 9c d4 2a 55 48 ..5...Z......*UH 00:20:33.899 000003e0 e0 d9 22 d0 37 46 25 d0 d9 e3 49 9d 75 70 44 cb ..".7F%...I.upD. 
00:20:33.899 000003f0 94 43 fb c0 46 ff 6d d5 2b 39 6d 2a 02 86 17 5e .C..F.m.+9m*...^ 00:20:33.899 [2024-09-27 15:25:14.238945] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=1, dhgroup=5, seq=3428451744, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:33.899 [2024-09-27 15:25:14.239014] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.899 [2024-09-27 15:25:14.321023] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.899 [2024-09-27 15:25:14.321054] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.899 [2024-09-27 15:25:14.321061] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.899 [2024-09-27 15:25:14.472045] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.899 [2024-09-27 15:25:14.472066] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.899 [2024-09-27 15:25:14.472073] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.899 [2024-09-27 15:25:14.472117] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.899 [2024-09-27 15:25:14.472140] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.899 ctrlr pubkey: 00:20:33.899 00000000 7f 48 c1 59 a4 c9 c3 06 2e 02 a3 2d 85 43 ee 5a .H.Y.......-.C.Z 00:20:33.899 00000010 49 e2 a0 d8 63 5c 67 70 ac 71 86 4e 35 a6 c3 15 I...c\gp.q.N5... 00:20:33.899 00000020 42 cd 20 11 db 4a b8 fa c1 d5 be a5 f0 83 98 e3 B. ..J.......... 00:20:33.899 00000030 c6 9e 3f 66 e7 8f ad a8 12 00 f1 ad 74 ea da a9 ..?f........t... 00:20:33.899 00000040 80 b1 22 b1 b9 9f 50 a1 7c 81 37 d2 01 c4 9d 8f .."...P.|.7..... 00:20:33.899 00000050 78 b1 67 62 7e f3 e3 2c 8c 05 61 04 d4 ec d0 74 x.gb~..,..a....t 00:20:33.899 00000060 2c 2a e9 1a 19 36 92 52 cd 2e a1 96 c8 ff 44 ad ,*...6.R......D. 00:20:33.899 00000070 1b 97 0d c3 bf 5e 7e 0f 9b 06 15 78 61 9b 22 40 .....^~....xa."@ 00:20:33.899 00000080 7e 02 38 04 bb 81 11 11 4f 4b 6e ff 76 48 79 00 ~.8.....OKn.vHy. 00:20:33.899 00000090 10 f9 a1 39 ca 4b 13 06 da a3 af 06 ff 50 38 2a ...9.K.......P8* 00:20:33.899 000000a0 f9 98 df 20 1a e5 df 92 e0 21 fe 74 c2 cd 6f 0f ... .....!.t..o. 00:20:33.899 000000b0 31 70 c9 40 21 da 70 08 fb 1b ba 17 2f 80 54 fe 1p.@!.p...../.T. 00:20:33.899 000000c0 52 df b3 d5 aa 3c be 5c 7d c7 02 f2 a6 40 7a 1e R....<.\}....@z. 00:20:33.899 000000d0 fc 70 83 65 bf 4b 68 7c ba f6 56 08 56 65 4f b1 .p.e.Kh|..V.VeO. 00:20:33.899 000000e0 b2 86 e7 fc 88 2c 50 57 50 33 ce b0 8f c5 b0 63 .....,PWP3.....c 00:20:33.899 000000f0 87 79 97 8c 13 cc 59 58 39 5a 19 dd 78 0c df 79 .y....YX9Z..x..y 00:20:33.899 host pubkey: 00:20:33.899 00000000 33 de 1e 9e 94 da a7 bf b5 b0 97 60 ee 14 6b cb 3..........`..k. 00:20:33.899 00000010 98 df 1c 68 ab 91 3c 4d e8 22 10 75 8b ab 19 d8 ...h...?{p.. 
00:20:33.900 dh secret: 00:20:33.900 00000000 6f d5 03 97 03 f5 dd d9 03 bc 27 e4 fd 25 55 2a o.........'..%U* 00:20:33.900 00000010 c7 bc ba ab bb d5 3a 89 e8 db d8 55 57 af cc b3 ......:....UW... 00:20:33.900 00000020 93 a7 fc 19 a2 a9 da d2 8c 8a d0 0b ff b6 6b 02 ..............k. 00:20:33.900 00000030 e8 bd 84 62 1d 36 20 b4 4f d9 dd a9 04 92 80 f0 ...b.6 .O....... 00:20:33.900 00000040 de bb 1e 1c 52 c4 c7 1a bf 8d 1a f7 89 7c 49 e8 ....R........|I. 00:20:33.900 00000050 2d 9a bd 0e 0f de 36 af 73 1f 55 af 79 d6 8f 90 -.....6.s.U.y... 00:20:33.900 00000060 93 f2 96 53 21 e0 8c 25 46 09 14 2c f6 a2 e8 21 ...S!..%F..,...! 00:20:33.900 00000070 82 81 63 f0 d1 7c b6 e5 98 4c 71 d8 ef 96 6f da ..c..|...Lq...o. 00:20:33.900 00000080 e6 17 dd 62 8f f0 2f d1 5f c2 e7 80 aa 26 b2 34 ...b../._....&.4 00:20:33.900 00000090 a4 66 aa db d6 53 4b d3 ec 16 ab 96 f2 49 86 84 .f...SK......I.. 00:20:33.900 000000a0 ae 1c 57 40 7e 91 93 59 4c c1 a3 4f 6e 6e 17 24 ..W@~..YL..Onn.$ 00:20:33.900 000000b0 e7 91 b4 eb c4 14 46 b2 ed be 16 4d 2a 91 35 26 ......F....M*.5& 00:20:33.900 000000c0 be 7b f4 79 9d d1 06 b3 51 45 02 08 24 26 ac 97 .{.y....QE..$&.. 00:20:33.900 000000d0 17 a9 82 64 72 05 6b 86 ad 9f a6 96 d1 c8 89 57 ...dr.k........W 00:20:33.900 000000e0 52 8b 1c 9e d1 0a e7 c5 fa 02 7d 4a 98 ff ba 41 R.........}J...A 00:20:33.900 000000f0 57 b6 1f f6 12 04 16 55 35 84 66 e3 ff 66 9e a4 W......U5.f..f.. 00:20:33.900 [2024-09-27 15:25:14.474638] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=2, dhgroup=1, seq=3428451745, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.900 [2024-09-27 15:25:14.477123] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.900 [2024-09-27 15:25:14.477163] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.900 [2024-09-27 15:25:14.477179] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.900 [2024-09-27 15:25:14.477194] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.900 [2024-09-27 15:25:14.477212] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.900 [2024-09-27 15:25:14.583475] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.900 [2024-09-27 15:25:14.583493] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.900 [2024-09-27 15:25:14.583501] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.900 [2024-09-27 15:25:14.583511] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.900 [2024-09-27 15:25:14.583565] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.900 ctrlr pubkey: 00:20:33.900 00000000 7f 48 c1 59 a4 c9 c3 06 2e 02 a3 2d 85 43 ee 5a .H.Y.......-.C.Z 
00:20:33.900 00000010 49 e2 a0 d8 63 5c 67 70 ac 71 86 4e 35 a6 c3 15 I...c\gp.q.N5... 00:20:33.900 00000020 42 cd 20 11 db 4a b8 fa c1 d5 be a5 f0 83 98 e3 B. ..J.......... 00:20:33.900 00000030 c6 9e 3f 66 e7 8f ad a8 12 00 f1 ad 74 ea da a9 ..?f........t... 00:20:33.900 00000040 80 b1 22 b1 b9 9f 50 a1 7c 81 37 d2 01 c4 9d 8f .."...P.|.7..... 00:20:33.900 00000050 78 b1 67 62 7e f3 e3 2c 8c 05 61 04 d4 ec d0 74 x.gb~..,..a....t 00:20:33.900 00000060 2c 2a e9 1a 19 36 92 52 cd 2e a1 96 c8 ff 44 ad ,*...6.R......D. 00:20:33.900 00000070 1b 97 0d c3 bf 5e 7e 0f 9b 06 15 78 61 9b 22 40 .....^~....xa."@ 00:20:33.900 00000080 7e 02 38 04 bb 81 11 11 4f 4b 6e ff 76 48 79 00 ~.8.....OKn.vHy. 00:20:33.900 00000090 10 f9 a1 39 ca 4b 13 06 da a3 af 06 ff 50 38 2a ...9.K.......P8* 00:20:33.900 000000a0 f9 98 df 20 1a e5 df 92 e0 21 fe 74 c2 cd 6f 0f ... .....!.t..o. 00:20:33.900 000000b0 31 70 c9 40 21 da 70 08 fb 1b ba 17 2f 80 54 fe 1p.@!.p...../.T. 00:20:33.900 000000c0 52 df b3 d5 aa 3c be 5c 7d c7 02 f2 a6 40 7a 1e R....<.\}....@z. 00:20:33.900 000000d0 fc 70 83 65 bf 4b 68 7c ba f6 56 08 56 65 4f b1 .p.e.Kh|..V.VeO. 00:20:33.900 000000e0 b2 86 e7 fc 88 2c 50 57 50 33 ce b0 8f c5 b0 63 .....,PWP3.....c 00:20:33.900 000000f0 87 79 97 8c 13 cc 59 58 39 5a 19 dd 78 0c df 79 .y....YX9Z..x..y 00:20:33.900 host pubkey: 00:20:33.900 00000000 39 67 8a f8 05 5c 53 04 40 05 3e ba 8b d2 57 2f 9g...\S.@.>...W/ 00:20:33.900 00000010 47 16 aa 2f 4d ff 6c 60 0b 71 05 45 f2 6d fd 4f G../M.l`.q.E.m.O 00:20:33.900 00000020 36 49 15 63 18 71 90 4a d2 db 9e 4c 66 71 0c 9a 6I.c.q.J...Lfq.. 00:20:33.900 00000030 42 ae e5 e9 22 6c 2d 32 d9 d0 41 0d 18 ab 76 9d B..."l-2..A...v. 00:20:33.900 00000040 c2 c5 8d 90 0d 36 12 c4 f6 b3 70 de 5e 0c 49 8a .....6....p.^.I. 00:20:33.900 00000050 c9 79 cb 68 2b 30 15 3c 28 37 a4 94 03 81 2f c7 .y.h+0.<(7..../. 00:20:33.900 00000060 27 99 48 e3 17 05 82 14 47 b1 3d 84 bc d7 34 d8 '.H.....G.=...4. 00:20:33.900 00000070 bb 07 4e 27 e9 d3 fc e8 7c 27 cf ee 4a e3 7c 16 ..N'....|'..J.|. 00:20:33.900 00000080 c7 8f 63 7d ed e9 0b dd 1f 3b fc 53 f5 db c3 77 ..c}.....;.S...w 00:20:33.900 00000090 07 40 13 52 6b 92 dc 38 2a cb e6 44 ba c7 e4 f2 .@.Rk..8*..D.... 00:20:33.900 000000a0 76 7a 6b 86 86 ff 18 5e 2b 1c 8c cb b9 bd 9f 3a vzk....^+......: 00:20:33.900 000000b0 18 cf 96 22 09 65 70 a0 2f 1a b1 8a fc 04 e4 4a ...".ep./......J 00:20:33.900 000000c0 79 7f bb 4b ed 7d ab d6 20 31 85 5e c2 18 10 83 y..K.}.. 1.^.... 00:20:33.900 000000d0 11 50 2a 79 3a 80 e3 89 61 a2 89 81 76 5d f2 1b .P*y:...a...v].. 00:20:33.900 000000e0 b0 64 4b 44 ae 49 64 58 13 ba bb 34 11 9d 0d 90 .dKD.IdX...4.... 00:20:33.900 000000f0 38 eb 22 ae da a9 d0 e2 45 d4 30 87 55 11 d7 31 8.".....E.0.U..1 00:20:33.900 dh secret: 00:20:33.900 00000000 d5 b2 63 04 52 0c bc a8 be 49 fa 40 b0 b6 87 d2 ..c.R....I.@.... 00:20:33.900 00000010 d5 2a 5d 93 e1 85 36 15 e2 12 78 dd b4 a5 a7 c8 .*]...6...x..... 00:20:33.900 00000020 5e df c8 f1 19 79 3e 91 f0 1f 19 29 09 25 a5 c8 ^....y>....).%.. 00:20:33.900 00000030 ce 6d 50 28 92 3f 0e 96 3a de be 09 86 32 35 10 .mP(.?..:....25. 00:20:33.900 00000040 16 25 ea 5f 95 ec 0c 66 e9 b1 19 23 57 5f 83 53 .%._...f...#W_.S 00:20:33.900 00000050 7d 9a 01 2f 6b 4e f0 b9 77 3d f1 f0 0a 1a 32 95 }../kN..w=....2. 00:20:33.900 00000060 5d 6f 79 0e c8 48 3e 18 14 36 2a f4 10 af d6 63 ]oy..H>..6*....c 00:20:33.900 00000070 80 2c 10 0e b8 b4 e0 67 94 00 91 11 03 14 f8 0e .,.....g........ 
00:20:33.900 00000080 55 66 20 4e 97 cc ff ad 11 d2 c5 3f 99 0e 00 41 Uf N.......?...A 00:20:33.900 00000090 23 62 69 00 cd 07 9c 2c a9 04 37 ba f2 71 ee f6 #bi....,..7..q.. 00:20:33.900 000000a0 8d 08 f1 39 75 58 9f 98 5a 9a eb 71 1e b0 9b f1 ...9uX..Z..q.... 00:20:33.900 000000b0 74 36 c4 11 f4 8f 02 8d 0f 24 32 c7 4c 68 83 c3 t6.......$2.Lh.. 00:20:33.900 000000c0 ae 4f ad 53 05 a9 f9 95 c8 10 cb 89 89 1e 10 e2 .O.S............ 00:20:33.900 000000d0 9a ff bb 8c 51 fa c9 4a 8d c6 67 3b da 98 9f c8 ....Q..J..g;.... 00:20:33.900 000000e0 67 30 5a c8 7d e6 81 89 64 54 c2 3b 37 92 06 ba g0Z.}...dT.;7... 00:20:33.900 000000f0 18 ee 8b 1c 5e bd 6d 5c 4e ff e4 11 55 d8 d7 55 ....^.m\N...U..U 00:20:33.900 [2024-09-27 15:25:14.586193] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=2, dhgroup=1, seq=3428451746, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.900 [2024-09-27 15:25:14.586288] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.900 [2024-09-27 15:25:14.595057] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.900 [2024-09-27 15:25:14.595132] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.900 [2024-09-27 15:25:14.595142] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.900 [2024-09-27 15:25:14.595181] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.900 [2024-09-27 15:25:14.751569] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.900 [2024-09-27 15:25:14.751593] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.900 [2024-09-27 15:25:14.751601] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.900 [2024-09-27 15:25:14.751646] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.900 [2024-09-27 15:25:14.751669] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.900 ctrlr pubkey: 00:20:33.900 00000000 ab d4 97 80 a9 32 08 86 66 07 d3 36 c6 98 64 ae .....2..f..6..d. 00:20:33.900 00000010 8f a5 5d ae 46 b8 e0 81 5d a0 38 45 f6 17 a0 fd ..].F...].8E.... 00:20:33.900 00000020 b4 4e f3 43 9f c1 87 07 17 cf 83 b6 9a 08 c5 cb .N.C............ 00:20:33.900 00000030 65 a7 33 a6 c2 29 b9 b4 6a f3 e8 ee bf aa 3e 63 e.3..)..j.....>c 00:20:33.900 00000040 82 d5 62 ff 22 97 01 1f b0 70 d1 35 59 88 f9 f3 ..b."....p.5Y... 00:20:33.900 00000050 f8 25 25 0c 4b cc 9f e1 2a 3a 22 26 e1 c8 2f 95 .%%.K...*:"&../. 00:20:33.900 00000060 d7 62 9d 2a ac a3 42 a3 89 6b 69 93 05 39 5c 3b .b.*..B..ki..9\; 00:20:33.900 00000070 5d 6d 84 61 b9 41 f7 b2 b3 b2 2f 9c e2 6a ee 1e ]m.a.A..../..j.. 
00:20:33.900 00000080 fd ca 35 c7 cc 2e 21 9e 60 78 04 76 f7 18 2d 33 ..5...!.`x.v..-3 00:20:33.900 00000090 50 1d 16 24 d5 97 89 87 70 9d 6b 4e fd 8a 2d a8 P..$....p.kN..-. 00:20:33.901 000000a0 58 cf 5e 3f 2d 58 b4 10 7b ab 71 3e 68 8b 18 38 X.^?-X..{.q>h..8 00:20:33.901 000000b0 71 03 2a 50 81 77 75 ef af d1 e8 ea 8c 00 42 c2 q.*P.wu.......B. 00:20:33.901 000000c0 26 9e 76 f8 22 ab ab 75 f7 a4 92 9e f5 37 65 54 &.v."..u.....7eT 00:20:33.901 000000d0 82 fb 5a e2 f6 1d d3 a4 bc 06 9c 6a ab 6c b3 d8 ..Z........j.l.. 00:20:33.901 000000e0 39 e7 52 b5 c3 ad 52 80 de b4 f6 34 13 1d 86 5e 9.R...R....4...^ 00:20:33.901 000000f0 06 0c 42 9a 66 0f 66 a9 dc bd e7 f2 91 66 8a 9a ..B.f.f......f.. 00:20:33.901 host pubkey: 00:20:33.901 00000000 db 5a a6 af 2b 4b 3d 03 53 56 a2 50 e8 1d 00 b3 .Z..+K=.SV.P.... 00:20:33.901 00000010 a5 6e e5 bb ab 76 d4 96 6c f2 3b 4d ca 2a 35 0a .n...v..l.;M.*5. 00:20:33.901 00000020 f2 03 72 9e 95 6f 51 b9 85 47 95 79 a3 e1 fb cc ..r..oQ..G.y.... 00:20:33.901 00000030 bd 70 a2 aa 8b d1 a6 64 84 b4 d2 ec 98 17 ac fb .p.....d........ 00:20:33.901 00000040 6b 26 fc 12 d6 d7 c9 05 f6 2f 3a 17 bd 2f 93 d1 k&......./:../.. 00:20:33.901 00000050 67 f3 b3 c8 97 10 1c 24 53 e9 b7 75 9b 6a 4f f9 g......$S..u.jO. 00:20:33.901 00000060 52 f3 9f 48 cb 47 d0 ec 16 94 d7 9a e0 ec e7 80 R..H.G.......... 00:20:33.901 00000070 58 4b 92 ab 88 d0 8c 32 24 61 77 b3 c1 aa 56 55 XK.....2$aw...VU 00:20:33.901 00000080 c1 ee cc 54 09 ce 28 ae 43 27 aa 49 96 bd ea 8d ...T..(.C'.I.... 00:20:33.901 00000090 a5 50 ac 74 40 2a 64 b3 dd e5 91 2d 26 cb 9d 8c .P.t@*d....-&... 00:20:33.901 000000a0 df b2 d0 14 37 e1 63 56 17 1d 43 91 87 b6 38 d7 ....7.cV..C...8. 00:20:33.901 000000b0 36 b9 91 50 b0 09 3a 0d 2f cc 88 ae b4 b1 27 85 6..P..:./.....'. 00:20:33.901 000000c0 1f 91 b6 9a e1 d1 1d e9 4f 6e 5e d3 ea 7c 6f 4a ........On^..|oJ 00:20:33.901 000000d0 98 8f dc 2f 1b 59 ac f2 ef 7a aa 8e 6c 4c 7c 6c .../.Y...z..lL|l 00:20:33.901 000000e0 fc fc 4a eb 5d 99 57 42 6c 2d f4 8d fd 2a 79 cc ..J.].WBl-...*y. 00:20:33.901 000000f0 2b 39 48 54 97 f9 a6 df 8b 3b b0 cd 51 2e a2 5f +9HT.....;..Q.._ 00:20:33.901 dh secret: 00:20:33.901 00000000 b9 34 b8 5b ea 87 49 d3 9d 64 0b 48 f6 76 d5 94 .4.[..I..d.H.v.. 00:20:33.901 00000010 0c f1 4c 2d 5c 7f 26 cb 26 e8 30 6d ff a2 70 2e ..L-\.&.&.0m..p. 00:20:33.901 00000020 0a 0a b8 df 38 4f e6 c6 fc 17 e9 00 9f db 33 0b ....8O........3. 00:20:33.901 00000030 de 08 8f 2e 16 2f d7 0d 52 71 20 95 1b e6 87 38 ...../..Rq ....8 00:20:33.901 00000040 4e 3b 44 22 13 fe e3 dd de e3 48 5f 0c bf 95 5a N;D"......H_...Z 00:20:33.901 00000050 36 ba 35 ad 2e 8e e7 4d c7 8b 47 ae c6 00 75 80 6.5....M..G...u. 00:20:33.901 00000060 49 50 27 80 f0 3a ad f2 6b 42 d3 cb 43 dc 12 35 IP'..:..kB..C..5 00:20:33.901 00000070 3f 69 dc 0f 0f 2a 5d 83 f5 33 35 fd 06 48 da 94 ?i...*]..35..H.. 00:20:33.901 00000080 d2 e8 67 4e 3c e3 f6 4c 0c 11 82 5f 0a 73 d2 0b ..gN<..L..._.s.. 00:20:33.901 00000090 61 4d 8d 71 03 d1 10 a5 50 c0 15 38 99 a4 7c bb aM.q....P..8..|. 00:20:33.901 000000a0 6d d4 1e 09 5c e7 60 01 95 ac 32 d3 a4 ed 22 29 m...\.`...2...") 00:20:33.901 000000b0 23 e9 81 82 5f 7b 6b 50 f1 52 5c 64 20 2d 6e 63 #..._{kP.R\d -nc 00:20:33.901 000000c0 ce 6f 1b e7 72 80 a6 d4 ea ef b9 dd 10 2c 19 a1 .o..r........,.. 00:20:33.901 000000d0 29 04 2c 38 ea 3c 33 28 be be 52 3e a9 9a 28 0f ).,8.<3(..R>..(. 00:20:33.901 000000e0 3e f8 d7 78 31 0c 8e 5e bf 31 a7 7d 17 a8 33 9c >..x1..^.1.}..3. 00:20:33.901 000000f0 7b 41 9c 6a 9a 83 6e fe 7e b5 79 e4 c2 a5 2e f9 {A.j..n.~.y..... 
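Note on the dumps above: each "ctrlr pubkey" / "host pubkey" / "dh secret" triple in this trace is one finite-field Diffie-Hellman exchange. With dhgroup 1 (ffdhe2048, per the negotiate debug lines) every value spans rows 00000000-000000f0, i.e. 256 bytes; the later dhgroup 2 (ffdhe3072) exchanges span 0x180 = 384 bytes. The following is only a minimal illustrative sketch of that derivation, not SPDK's nvme_auth.c code; the function names are hypothetical and the RFC 7919 group prime p (generator g = 2) is deliberately not reproduced here.

    import secrets

    def dh_keypair(p: int, g: int = 2):
        # private exponent x and public value y = g^x mod p
        x = secrets.randbelow(p - 2) + 1
        return x, pow(g, x, p)

    def dh_shared_secret(peer_pub: int, x: int, p: int) -> bytes:
        # shared secret z = peer_pub^x mod p, serialized big-endian at the
        # group size, which is what the "dh secret" hex dumps above show
        z = pow(peer_pub, x, p)
        return z.to_bytes((p.bit_length() + 7) // 8, "big")

For ffdhe2048 the serialized public values and secret are 256 bytes each, matching the 0x100-byte dumps in this log.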
00:20:33.901 [2024-09-27 15:25:14.754302] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=2, dhgroup=1, seq=3428451747, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.901 [2024-09-27 15:25:14.756982] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.901 [2024-09-27 15:25:14.757026] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.901 [2024-09-27 15:25:14.757039] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.901 [2024-09-27 15:25:14.757063] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.901 [2024-09-27 15:25:14.757074] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.901 [2024-09-27 15:25:14.863517] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.901 [2024-09-27 15:25:14.863565] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.901 [2024-09-27 15:25:14.863588] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.901 [2024-09-27 15:25:14.863624] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.901 [2024-09-27 15:25:14.863679] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.901 ctrlr pubkey: 00:20:33.901 00000000 ab d4 97 80 a9 32 08 86 66 07 d3 36 c6 98 64 ae .....2..f..6..d. 00:20:33.901 00000010 8f a5 5d ae 46 b8 e0 81 5d a0 38 45 f6 17 a0 fd ..].F...].8E.... 00:20:33.901 00000020 b4 4e f3 43 9f c1 87 07 17 cf 83 b6 9a 08 c5 cb .N.C............ 00:20:33.901 00000030 65 a7 33 a6 c2 29 b9 b4 6a f3 e8 ee bf aa 3e 63 e.3..)..j.....>c 00:20:33.901 00000040 82 d5 62 ff 22 97 01 1f b0 70 d1 35 59 88 f9 f3 ..b."....p.5Y... 00:20:33.901 00000050 f8 25 25 0c 4b cc 9f e1 2a 3a 22 26 e1 c8 2f 95 .%%.K...*:"&../. 00:20:33.901 00000060 d7 62 9d 2a ac a3 42 a3 89 6b 69 93 05 39 5c 3b .b.*..B..ki..9\; 00:20:33.901 00000070 5d 6d 84 61 b9 41 f7 b2 b3 b2 2f 9c e2 6a ee 1e ]m.a.A..../..j.. 00:20:33.901 00000080 fd ca 35 c7 cc 2e 21 9e 60 78 04 76 f7 18 2d 33 ..5...!.`x.v..-3 00:20:33.901 00000090 50 1d 16 24 d5 97 89 87 70 9d 6b 4e fd 8a 2d a8 P..$....p.kN..-. 00:20:33.901 000000a0 58 cf 5e 3f 2d 58 b4 10 7b ab 71 3e 68 8b 18 38 X.^?-X..{.q>h..8 00:20:33.901 000000b0 71 03 2a 50 81 77 75 ef af d1 e8 ea 8c 00 42 c2 q.*P.wu.......B. 00:20:33.901 000000c0 26 9e 76 f8 22 ab ab 75 f7 a4 92 9e f5 37 65 54 &.v."..u.....7eT 00:20:33.901 000000d0 82 fb 5a e2 f6 1d d3 a4 bc 06 9c 6a ab 6c b3 d8 ..Z........j.l.. 00:20:33.901 000000e0 39 e7 52 b5 c3 ad 52 80 de b4 f6 34 13 1d 86 5e 9.R...R....4...^ 00:20:33.901 000000f0 06 0c 42 9a 66 0f 66 a9 dc bd e7 f2 91 66 8a 9a ..B.f.f......f.. 00:20:33.901 host pubkey: 00:20:33.901 00000000 db c0 fe 06 67 3d 0f 95 ee 23 f3 a9 9a eb b8 8d ....g=...#...... 
00:20:33.901 00000010 34 85 5a d2 cd 40 ba 3a ea 7e d2 e7 a7 40 24 86 4.Z..@.:.~...@$. 00:20:33.901 00000020 49 1e 6e 7b 85 ab ad 87 95 d3 0f 38 41 95 0a 85 I.n{.......8A... 00:20:33.901 00000030 ac 51 e9 1c 07 0a 5c b2 b4 1a 6d 38 be 00 7b 00 .Q....\...m8..{. 00:20:33.901 00000040 72 72 c1 de f4 f7 fe 99 ab 57 0f ae 52 3c ff af rr.......W..R<.. 00:20:33.901 00000050 1d b2 2d 55 37 1c 5c 7e ae 37 56 f5 9e 47 c6 22 ..-U7.\~.7V..G." 00:20:33.901 00000060 6a ef a0 57 40 33 05 02 48 2c ea 0f 71 1f 69 71 j..W@3..H,..q.iq 00:20:33.901 00000070 db a9 f5 69 92 8a 57 8d 02 50 d0 7c 17 13 c7 29 ...i..W..P.|...) 00:20:33.901 00000080 23 da a6 6a 21 c1 33 94 7f ba bb d8 7f 82 6e 77 #..j!.3.......nw 00:20:33.901 00000090 13 d2 cf 8d 91 67 a5 dc cb ff 06 6e 75 27 83 99 .....g.....nu'.. 00:20:33.901 000000a0 f2 26 61 de 26 f3 b0 e4 33 8e 07 fd 4d a8 74 17 .&a.&...3...M.t. 00:20:33.901 000000b0 ec 50 07 c4 f2 28 0d 91 9d 6c 11 ae a6 3d 94 40 .P...(...l...=.@ 00:20:33.901 000000c0 76 22 82 6e 39 f1 1a 07 ab b8 b6 f9 b4 da d3 7e v".n9..........~ 00:20:33.901 000000d0 fc bf 35 8f 24 2e 59 d6 34 79 cd e5 aa cb 15 bd ..5.$.Y.4y...... 00:20:33.901 000000e0 d1 c6 d3 39 2f 5d 55 c9 c6 0f e7 e9 14 b2 98 d4 ...9/]U......... 00:20:33.901 000000f0 6d 57 63 e0 78 e1 12 ee 16 bb fc f4 b1 c9 e7 1d mWc.x........... 00:20:33.901 dh secret: 00:20:33.901 00000000 4f 14 fa 16 39 06 59 70 45 48 fe 12 d7 f4 db 96 O...9.YpEH...... 00:20:33.901 00000010 53 93 ed 7d e2 79 cc 68 57 65 76 90 0a 27 81 0c S..}.y.hWev..'.. 00:20:33.901 00000020 2c c9 7b 7b 69 e9 a5 46 41 e4 c2 26 32 c9 48 fc ,.{{i..FA..&2.H. 00:20:33.901 00000030 65 78 4a 23 9e d8 13 cd 94 5a f9 5e a7 22 27 3c exJ#.....Z.^."'< 00:20:33.901 00000040 e5 d7 e4 69 b9 61 b0 b3 89 47 7a 96 b5 dd 8b 12 ...i.a...Gz..... 00:20:33.901 00000050 87 58 4f 61 67 fb a4 6f 69 37 80 cd b4 4d bd 59 .XOag..oi7...M.Y 00:20:33.901 00000060 f6 de c2 34 76 a9 4b a3 24 10 59 14 70 2b c6 a8 ...4v.K.$.Y.p+.. 00:20:33.901 00000070 21 60 55 db 5f 74 d4 48 3a f1 4f 4e 49 03 be e0 !`U._t.H:.ONI... 00:20:33.901 00000080 10 e5 9e 4e 7d 43 07 a7 a5 fc b3 48 41 7d 5d 93 ...N}C.....HA}]. 00:20:33.901 00000090 67 a4 79 ba 87 bd 0f 98 ad ea 44 7d 65 14 d1 14 g.y.......D}e... 00:20:33.901 000000a0 45 39 2e 0c 05 96 c3 eb 37 15 33 db 28 fd 0c 91 E9......7.3.(... 00:20:33.901 000000b0 75 8d 40 8e f1 54 1e a9 94 97 1f 46 7c 3b 80 45 u.@..T.....F|;.E 00:20:33.901 000000c0 e7 0d 38 b6 2f cd 9c 01 1c b5 54 3a 1c 88 61 d1 ..8./.....T:..a. 00:20:33.901 000000d0 f2 54 ea 4a 73 38 00 e3 40 93 ad de 7d eb e8 c5 .T.Js8..@...}... 00:20:33.901 000000e0 83 50 78 ac 64 63 a7 ab 1f db e9 d6 f2 5d 04 14 .Px.dc.......].. 
00:20:33.901 000000f0 2a bf e1 76 a6 88 d2 3b de f3 ff 8f 1f c7 bd 2f *..v...;......./ 00:20:33.901 [2024-09-27 15:25:14.866233] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=2, dhgroup=1, seq=3428451748, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.901 [2024-09-27 15:25:14.866328] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.901 [2024-09-27 15:25:14.875329] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.901 [2024-09-27 15:25:14.875415] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.901 [2024-09-27 15:25:14.875426] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.901 [2024-09-27 15:25:14.875471] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.901 [2024-09-27 15:25:15.032746] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.901 [2024-09-27 15:25:15.032765] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.901 [2024-09-27 15:25:15.032772] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.901 [2024-09-27 15:25:15.032815] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.901 [2024-09-27 15:25:15.032837] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.901 ctrlr pubkey: 00:20:33.901 00000000 ee ea 9c f4 50 4a f7 ec fb 6a a9 6e 15 6f 5b 60 ....PJ...j.n.o[` 00:20:33.901 00000010 08 ce f7 b0 b6 6d 14 3c 91 ed 07 41 d8 3d bb b7 .....m.<...A.=.. 00:20:33.902 00000020 51 24 ce 22 1f c1 6f 37 96 7d 70 0c 17 c2 18 26 Q$."..o7.}p....& 00:20:33.902 00000030 d3 83 a6 e2 e4 57 11 b1 13 00 b6 22 45 f4 61 41 .....W....."E.aA 00:20:33.902 00000040 ef a7 7b 28 c6 2d 35 8b 4f 93 c2 8e 18 47 94 86 ..{(.-5.O....G.. 00:20:33.902 00000050 94 1d 1c 3a bc ff 3d 6a 16 2a 56 46 25 b2 34 40 ...:..=j.*VF%.4@ 00:20:33.902 00000060 b0 8e ac 9f b1 d1 33 b6 a6 a6 63 fd 14 68 3d 02 ......3...c..h=. 00:20:33.902 00000070 84 ad 02 54 15 19 d9 33 d7 66 87 63 f9 d2 fb 88 ...T...3.f.c.... 00:20:33.902 00000080 99 69 70 a3 54 f3 cc e9 f4 3b 4c 02 1f 87 e4 e6 .ip.T....;L..... 00:20:33.902 00000090 a3 4c 91 2b 33 42 56 27 ef 00 54 bc cb d2 b9 16 .L.+3BV'..T..... 00:20:33.902 000000a0 78 2f 92 cf 69 8e 29 02 1c 65 a5 e4 3c 21 05 cb x/..i.)..e.......O..;.!V.. 00:20:33.902 000000e0 ca ab 81 3d 2f fe 52 77 e4 2a 2e 2b 81 69 ca 1d ...=/.Rw.*.+.i.. 00:20:33.902 000000f0 31 e7 8f c6 bc 90 21 66 f1 7c 04 45 32 29 65 27 1.....!f.|.E2)e' 00:20:33.902 host pubkey: 00:20:33.902 00000000 3e e1 b5 e6 3a f1 b3 d0 6b 55 23 bc af c5 ed ad >...:...kU#..... 00:20:33.902 00000010 0e 36 ba 4e 23 6c b7 43 55 1f 1d b6 d1 7d dc 0a .6.N#l.CU....}.. 
00:20:33.902 00000020 e0 57 fa cc 3c 3d 77 7c f1 5b 63 d5 25 5e cb 76 .W..<=w|.[c.%^.v 00:20:33.902 00000030 7a 1a e9 53 42 38 e6 9b 7f 3c 79 23 88 51 1e 7c z..SB8........ 00:20:33.902 00000080 bf 74 f9 83 dc 67 99 d8 8c 99 f3 7c c5 40 70 59 .t...g.....|.@pY 00:20:33.902 00000090 55 fb 54 1c 63 69 f2 22 96 11 d2 c9 99 e9 31 75 U.T.ci."......1u 00:20:33.902 000000a0 01 45 e2 a4 31 55 a9 49 6f b3 07 bb b4 1b ef 78 .E..1U.Io......x 00:20:33.902 000000b0 bd 06 c9 79 14 b1 85 0a ab b4 e2 0b 38 c0 1a 20 ...y........8.. 00:20:33.902 000000c0 7f 4c 35 a4 01 96 90 90 04 07 32 95 e5 1b f1 ee .L5.......2..... 00:20:33.902 000000d0 2f 3a 62 b0 fb 8b 75 2f df 23 a8 58 68 f7 53 f5 /:b...u/.#.Xh.S. 00:20:33.902 000000e0 2e 51 37 71 da ff ff e0 18 c8 67 a1 fe 03 6d 60 .Q7q......g...m` 00:20:33.902 000000f0 35 1d 05 b1 71 03 9f 32 95 97 30 3a bb b5 c7 10 5...q..2..0:.... 00:20:33.902 dh secret: 00:20:33.902 00000000 11 38 64 42 39 5f 0b f9 8c be 53 0f 98 79 27 8d .8dB9_....S..y'. 00:20:33.902 00000010 e3 7f e5 65 30 67 34 25 03 c3 3b ec 6a 1f 57 c1 ...e0g4%..;.j.W. 00:20:33.902 00000020 99 56 af 68 99 e6 82 f4 d3 d0 e0 0e 2f c4 e9 9f .V.h......../... 00:20:33.902 00000030 96 69 39 19 95 cb 3c f0 ca 33 f0 25 c9 dc 44 be .i9...<..3.%..D. 00:20:33.902 00000040 fe 69 70 42 dd 4a 59 bd 7e db 3c 7f af ee 9a cc .ipB.JY.~.<..... 00:20:33.902 00000050 88 f5 b0 7c 8a e6 87 03 bf 3a 2d da 49 60 17 f1 ...|.....:-.I`.. 00:20:33.902 00000060 c7 d5 ae 17 56 d3 9e 0d 55 2f 3b fb 4a f6 48 00 ....V...U/;.J.H. 00:20:33.902 00000070 75 ca f2 5a c3 32 df 18 b9 63 a5 ed f6 2a f6 7f u..Z.2...c...*.. 00:20:33.902 00000080 6b ac 60 9f fa 30 2b c8 38 f1 e0 f2 b0 c1 5c e5 k.`..0+.8.....\. 00:20:33.902 00000090 4c 19 a2 fb d6 e2 87 1d 25 63 2d e4 55 92 bf 43 L.......%c-.U..C 00:20:33.902 000000a0 cb a4 f0 27 9c c5 e7 95 28 a9 eb 8b d3 cd c6 21 ...'....(......! 00:20:33.902 000000b0 de 32 4b 07 b5 81 3d c4 da 87 83 c6 59 a5 f7 d8 .2K...=.....Y... 00:20:33.902 000000c0 67 b7 ca 5a a5 01 b2 87 4d e7 e3 ac fe ae d6 0e g..Z....M....... 00:20:33.902 000000d0 7f c9 a6 68 56 f6 8b 41 6f 22 2a 45 2e c6 1b bd ...hV..Ao"*E.... 00:20:33.902 000000e0 87 e0 46 16 5a b9 c5 5c 8c 5f 3f 5b 43 e1 f8 a1 ..F.Z..\._?[C... 00:20:33.902 000000f0 93 44 2b 3e 3e 8c 8f 3c 5a b4 4b 99 e8 57 90 d4 .D+>>.......O..;.!V.. 00:20:33.902 000000e0 ca ab 81 3d 2f fe 52 77 e4 2a 2e 2b 81 69 ca 1d ...=/.Rw.*.+.i.. 00:20:33.902 000000f0 31 e7 8f c6 bc 90 21 66 f1 7c 04 45 32 29 65 27 1.....!f.|.E2)e' 00:20:33.902 host pubkey: 00:20:33.902 00000000 e5 74 51 24 e0 a5 e3 36 c1 3d 84 d9 e5 31 83 77 .tQ$...6.=...1.w 00:20:33.902 00000010 75 88 48 83 a6 de 48 9b af 7a 16 75 18 9f 24 a8 u.H...H..z.u..$. 00:20:33.902 00000020 89 9e a7 04 0d 51 ce 48 32 40 e0 74 0a e1 65 93 .....Q.H2@.t..e. 00:20:33.902 00000030 3f 2c f7 c5 b3 54 2c 7b 28 48 fd e8 13 b6 1e 25 ?,...T,{(H.....% 00:20:33.902 00000040 ed 94 b4 63 1b 49 6d c2 55 59 c1 9d 0a 8a 67 39 ...c.Im.UY....g9 00:20:33.902 00000050 58 36 89 41 a7 0e f1 23 fa 9d 04 85 9f 9f db 4b X6.A...#.......K 00:20:33.902 00000060 47 0f 80 c9 d2 c1 25 8d 22 b9 bc ff eb c7 b1 00 G.....%."....... 00:20:33.902 00000070 43 db 4f 1b c7 10 d9 79 a6 74 75 a0 09 e4 6d d4 C.O....y.tu...m. 00:20:33.902 00000080 c2 c0 cd 76 2f 99 6f 63 31 62 c3 ae 79 11 38 d1 ...v/.oc1b..y.8. 00:20:33.902 00000090 cc 8d 3e a0 3f ca 26 cc ba b7 00 be c5 ed 9a 79 ..>.?.&........y 00:20:33.902 000000a0 98 79 36 c7 e3 fb f3 1f 5f f7 e0 52 00 f1 b0 90 .y6....._..R.... 
00:20:33.902 000000b0 ef 86 1c a4 1a 04 40 1a 36 c1 b1 32 91 88 ee 65 ......@.6..2...e 00:20:33.902 000000c0 0c b0 cc 0f 09 ed 2a cc d5 42 8b 43 08 77 5f c2 ......*..B.C.w_. 00:20:33.902 000000d0 34 a0 dd cc 58 f1 77 51 dc 1e 86 a5 f3 70 27 b3 4...X.wQ.....p'. 00:20:33.902 000000e0 79 b4 cc d7 fe f4 c5 ed 33 77 74 d6 25 4f 47 6b y.......3wt.%OGk 00:20:33.902 000000f0 c8 be f6 24 20 c5 4a 24 55 36 fe bc b1 3d ac ae ...$ .J$U6...=.. 00:20:33.902 dh secret: 00:20:33.902 00000000 8c aa 91 c7 41 3a 61 90 24 d7 5a 5e c8 4d 03 ab ....A:a.$.Z^.M.. 00:20:33.902 00000010 fd f1 2e 79 fa 5b 84 d3 6f e4 ab 62 32 b4 0d 7c ...y.[..o..b2..| 00:20:33.902 00000020 dd dc 0f 66 d4 a3 9c 15 b7 48 15 fe 4d a7 74 f6 ...f.....H..M.t. 00:20:33.902 00000030 dd 60 0a dc 54 69 98 e1 d5 07 8e 86 b0 b1 27 36 .`..Ti........'6 00:20:33.902 00000040 50 c1 1f e0 8c 3c 4a 0b 84 46 9e 8c 7c ae fd ed P....a ".w..Y%. 00:20:33.903 00000040 b3 9f b2 2c e6 c2 4f 24 69 45 08 d7 9a ca a8 31 ...,..O$iE.....1 00:20:33.903 00000050 ef 08 e2 7e d6 db da f6 03 67 e4 dd b8 e3 d8 08 ...~.....g...... 00:20:33.903 00000060 92 4f 4c b1 34 b6 06 15 92 1f 50 e8 6c e7 be c8 .OL.4.....P.l... 00:20:33.903 00000070 11 34 61 bb c4 ce ad 1d 36 b7 2f f3 03 1a 4f 00 .4a.....6./...O. 00:20:33.903 00000080 fc 02 7f 5f bf 98 47 7d 0a e8 ac 9d 13 4b c3 9d ..._..G}.....K.. 00:20:33.903 00000090 02 c4 97 c0 f5 38 15 80 25 0d 54 0e 63 91 6e 71 .....8..%.T.c.nq 00:20:33.903 000000a0 7f 3a db b8 9e 47 bf e6 76 70 80 eb b8 31 26 06 .:...G..vp...1&. 00:20:33.903 000000b0 11 90 d2 7e 79 d5 d7 c8 dc 0e 89 21 98 53 9a 17 ...~y......!.S.. 00:20:33.903 000000c0 b3 6e 13 33 ef 1e b1 a3 7f 50 43 0a 96 07 84 55 .n.3.....PC....U 00:20:33.903 000000d0 eb b8 f4 13 cf b6 64 2f 00 cd bc 7c b1 f2 2c 59 ......d/...|..,Y 00:20:33.903 000000e0 94 0e 62 78 14 a1 18 d3 6b 0d d1 a4 8e 37 9f cf ..bx....k....7.. 00:20:33.903 000000f0 c3 dc a4 1e 33 14 f8 72 71 db 1b 88 3b 92 fb 55 ....3..rq...;..U 00:20:33.903 host pubkey: 00:20:33.903 00000000 f3 4c c1 d9 26 87 2e 49 2a 23 1b 51 5e 04 da 5e .L..&..I*#.Q^..^ 00:20:33.903 00000010 83 d7 1b 18 04 ac 7d b6 2f c6 fa 09 19 2d 08 8c ......}./....-.. 00:20:33.903 00000020 97 71 88 de 62 0e 6e 80 90 66 de bf 63 ba ce 5a .q..b.n..f..c..Z 00:20:33.903 00000030 c6 df d3 8e e1 70 45 d3 3f dd 2c f1 49 ac cd 4a .....pE.?.,.I..J 00:20:33.903 00000040 58 43 32 d5 d4 e8 51 b4 2d 9e 4e 63 4f 4c 72 04 XC2...Q.-.NcOLr. 00:20:33.903 00000050 39 ca 7d 0f 1f a4 ae 56 ae c9 5e 0e 38 2a 52 51 9.}....V..^.8*RQ 00:20:33.903 00000060 cb f2 93 23 19 a9 61 36 60 f7 fc d6 50 68 6b 38 ...#..a6`...Phk8 00:20:33.903 00000070 fe c5 02 80 b3 51 ee e0 86 77 eb fd 41 0e 85 91 .....Q...w..A... 00:20:33.903 00000080 17 e3 42 17 3e 3d 7b 30 29 cb 3d 60 c7 de 8a ee ..B.>={0).=`.... 00:20:33.903 00000090 e2 d3 2a 51 66 d2 8c 9b 0b 61 ca 73 5d 09 4c 8f ..*Qf....a.s].L. 00:20:33.903 000000a0 5c 08 9b 74 77 58 3b 71 09 dc f4 c1 ed a8 1b fc \..twX;q........ 00:20:33.903 000000b0 88 7c da 41 63 71 0c 16 60 b8 e9 de f7 61 83 c5 .|.Acq..`....a.. 00:20:33.903 000000c0 75 9e 57 c0 05 5f 16 18 d1 03 1b 8c b5 55 92 2e u.W.._.......U.. 00:20:33.903 000000d0 06 75 97 26 ee 98 d8 09 f7 32 ee ea d2 d6 bd 2a .u.&.....2.....* 00:20:33.903 000000e0 9d 99 e3 6a f4 56 9f 72 b9 dd 4b 2d 18 2e 6d f7 ...j.V.r..K-..m. 00:20:33.903 000000f0 a3 a9 c8 1c a7 03 3c 7b 59 7d 91 e1 8d 55 1c 82 ......<{Y}...U.. 00:20:33.903 dh secret: 00:20:33.903 00000000 34 91 a5 da 02 86 1c e9 a0 43 eb ad f4 6c 6d 7e 4........C...lm~ 00:20:33.903 00000010 d1 45 2a 72 53 a5 69 37 c3 7b dc 1a 2b 53 36 ac .E*rS.i7.{..+S6. 
00:20:33.903 00000020 23 5c 63 51 40 f0 a6 28 0f b3 4a 31 03 f8 49 8f #\cQ@..(..J1..I. 00:20:33.903 00000030 94 18 1d 94 99 2a 94 41 7e 43 c9 8c cd fc ac 1f .....*.A~C...... 00:20:33.903 00000040 4d a7 98 f6 00 bc 25 52 c2 b8 35 53 2d f2 98 81 M.....%R..5S-... 00:20:33.903 00000050 f0 de 1a 22 3e d3 17 97 70 0f 07 8c 23 82 5c 14 ...">...p...#.\. 00:20:33.903 00000060 5c 2c 25 b4 78 0f a8 47 73 e8 cd c3 68 9f 31 c8 \,%.x..Gs...h.1. 00:20:33.903 00000070 a2 5e da d5 7d 1e 79 2f 61 51 db 71 8e 81 44 4b .^..}.y/aQ.q..DK 00:20:33.903 00000080 10 2a e3 6c 39 59 5e 74 0f af c5 24 64 3c ae 9f .*.l9Y^t...$d<.. 00:20:33.903 00000090 69 d2 65 1f 4d 81 ee 68 7a 18 5b 9b 37 6f 61 fa i.e.M..hz.[.7oa. 00:20:33.903 000000a0 34 20 f2 a2 ff 69 a5 fb ef 1e 51 4b 17 26 d4 1d 4 ...i....QK.&.. 00:20:33.903 000000b0 70 d6 8a f9 1c 5e 08 72 bf 19 3b 1f 5a 67 a4 a3 p....^.r..;.Zg.. 00:20:33.903 000000c0 ed 8e 86 40 a7 74 99 95 be ff 28 50 d0 d4 e7 1e ...@.t....(P.... 00:20:33.903 000000d0 5e 3a a7 81 9f 59 f5 f6 76 1e f1 50 9f c4 dc 65 ^:...Y..v..P...e 00:20:33.903 000000e0 d5 bd 59 50 d2 3d 78 20 a6 d5 ad 2c f4 0f ae b3 ..YP.=x ...,.... 00:20:33.903 000000f0 12 8f 48 e8 55 44 23 a9 9d 25 d7 66 db 2b c2 a2 ..H.UD#..%.f.+.. 00:20:33.903 [2024-09-27 15:25:15.308988] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=2, dhgroup=1, seq=3428451751, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.903 [2024-09-27 15:25:15.311670] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.903 [2024-09-27 15:25:15.311707] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.903 [2024-09-27 15:25:15.311724] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.903 [2024-09-27 15:25:15.311748] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.903 [2024-09-27 15:25:15.311759] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.903 [2024-09-27 15:25:15.420412] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.903 [2024-09-27 15:25:15.420431] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.903 [2024-09-27 15:25:15.420439] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.903 [2024-09-27 15:25:15.420449] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.903 [2024-09-27 15:25:15.420506] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.903 ctrlr pubkey: 00:20:33.903 00000000 0e 21 8a 30 79 61 90 ac fe 87 f1 6c d2 b8 4f ef .!.0ya.....l..O. 00:20:33.903 00000010 11 07 ee 37 a8 d3 80 9e fd 36 b4 a9 32 f0 a4 03 ...7.....6..2... 00:20:33.903 00000020 0e 75 22 27 a6 a4 5b ad 12 28 c7 5f 5e 8b 26 c9 .u"'..[..(._^.&. 
00:20:33.903 00000030 d9 2c 4c 75 4e 3e 61 20 22 db 77 1f 96 59 25 91 .,LuN>a ".w..Y%. 00:20:33.903 00000040 b3 9f b2 2c e6 c2 4f 24 69 45 08 d7 9a ca a8 31 ...,..O$iE.....1 00:20:33.903 00000050 ef 08 e2 7e d6 db da f6 03 67 e4 dd b8 e3 d8 08 ...~.....g...... 00:20:33.903 00000060 92 4f 4c b1 34 b6 06 15 92 1f 50 e8 6c e7 be c8 .OL.4.....P.l... 00:20:33.903 00000070 11 34 61 bb c4 ce ad 1d 36 b7 2f f3 03 1a 4f 00 .4a.....6./...O. 00:20:33.903 00000080 fc 02 7f 5f bf 98 47 7d 0a e8 ac 9d 13 4b c3 9d ..._..G}.....K.. 00:20:33.903 00000090 02 c4 97 c0 f5 38 15 80 25 0d 54 0e 63 91 6e 71 .....8..%.T.c.nq 00:20:33.903 000000a0 7f 3a db b8 9e 47 bf e6 76 70 80 eb b8 31 26 06 .:...G..vp...1&. 00:20:33.903 000000b0 11 90 d2 7e 79 d5 d7 c8 dc 0e 89 21 98 53 9a 17 ...~y......!.S.. 00:20:33.903 000000c0 b3 6e 13 33 ef 1e b1 a3 7f 50 43 0a 96 07 84 55 .n.3.....PC....U 00:20:33.903 000000d0 eb b8 f4 13 cf b6 64 2f 00 cd bc 7c b1 f2 2c 59 ......d/...|..,Y 00:20:33.903 000000e0 94 0e 62 78 14 a1 18 d3 6b 0d d1 a4 8e 37 9f cf ..bx....k....7.. 00:20:33.903 000000f0 c3 dc a4 1e 33 14 f8 72 71 db 1b 88 3b 92 fb 55 ....3..rq...;..U 00:20:33.903 host pubkey: 00:20:33.903 00000000 49 3e b6 21 6f 8c 5f 56 9a b8 14 f0 24 16 1f ac I>.!o._V....$... 00:20:33.903 00000010 52 ab fe 52 25 fb 13 35 f3 e3 f9 12 a9 bc 1a 4e R..R%..5.......N 00:20:33.903 00000020 41 7c 8a 58 07 59 68 a1 7a 19 fc 18 66 fc 0c 5c A|.X.Yh.z...f..\ 00:20:33.903 00000030 7c 2a c6 7a 67 d0 f6 9d b0 a4 34 b9 e0 f1 be 7c |*.zg.....4....| 00:20:33.903 00000040 e7 5e 94 4d 1e f5 c0 83 1d 83 55 cf e8 d0 14 c4 .^.M......U..... 00:20:33.903 00000050 c3 92 bd fd 2b 7d ec 2e a2 6f a9 8f 0e ad 9a a9 ....+}...o...... 00:20:33.903 00000060 e1 c3 93 79 85 78 19 bd a4 3d b7 e2 8e 38 18 8c ...y.x...=...8.. 00:20:33.903 00000070 8d 3b ea bc 93 13 54 42 aa 39 b2 d5 ca 2d 5a 81 .;....TB.9...-Z. 00:20:33.903 00000080 e8 eb 01 32 6f 0a 7e 1e 30 51 65 7e 18 9c 61 16 ...2o.~.0Qe~..a. 00:20:33.903 00000090 8f d4 57 c5 38 ff f4 51 ae df 64 e2 8a 83 a3 87 ..W.8..Q..d..... 00:20:33.903 000000a0 f5 36 ce 2a 84 c4 49 38 63 bb 43 b5 b3 92 7b 7c .6.*..I8c.C...{| 00:20:33.903 000000b0 13 36 04 2d df 5e 39 41 6a 45 13 a7 0c 9b 8e 08 .6.-.^9AjE...... 00:20:33.903 000000c0 f6 54 b8 05 86 66 4f 14 fb 00 4d 5b 6b 27 53 90 .T...fO...M[k'S. 00:20:33.903 000000d0 6b b4 bf 4a 71 52 fa de ba a5 46 95 de 47 3a a7 k..JqR....F..G:. 00:20:33.903 000000e0 30 3f 54 b3 29 d0 12 f3 8f 4e 10 89 9c 6d fb 20 0?T.)....N...m. 00:20:33.903 000000f0 2a 65 2f 95 7f 81 d7 25 eb c3 3a be a8 4c 01 1d *e/....%..:..L.. 00:20:33.903 dh secret: 00:20:33.903 00000000 4a b7 3a 4c 4d a9 1e af 35 ee 98 69 c0 86 13 13 J.:LM...5..i.... 00:20:33.903 00000010 aa 41 38 51 4e 69 a2 a7 e2 bb 88 e9 86 c4 19 4c .A8QNi.........L 00:20:33.903 00000020 ab 16 fb d0 7d fb c4 34 66 e5 cc 8d 20 0a 38 01 ....}..4f... .8. 00:20:33.903 00000030 4a d1 a2 0d f3 7f 13 9c 64 0e 98 cc 94 3c d4 40 J.......d....<.@ 00:20:33.903 00000040 67 46 97 e7 5d 97 ee 27 6f 35 69 a8 50 d2 c4 8a gF..]..'o5i.P... 00:20:33.903 00000050 d6 b7 22 04 b5 61 52 14 45 e3 bb f7 80 74 e7 a9 .."..aR.E....t.. 00:20:33.903 00000060 36 cf 1c f0 bf a5 ab 5c ac 31 17 da 47 ba f4 f8 6......\.1..G... 00:20:33.903 00000070 64 df 41 92 9e 17 35 72 10 f1 e4 22 9f cd a6 20 d.A...5r..."... 00:20:33.903 00000080 b9 03 22 dc cb 2c b9 9c 31 f7 3f 55 a3 36 0f d3 .."..,..1.?U.6.. 00:20:33.903 00000090 27 54 66 86 ce 04 3f 4a 2d 7b 5b 6f ce 8c e4 87 'Tf...?J-{[o.... 00:20:33.903 000000a0 3c df a2 6e a1 b4 7b 7a d3 b0 91 fd 0f 35 fd a0 <..n..{z.....5.. 
00:20:33.903 000000b0 09 72 2c 15 da 01 f2 d4 bc dc ba 1f 60 5a a5 4c .r,.........`Z.L 00:20:33.903 000000c0 e9 c6 ea e5 42 ca f7 8c cb 8a 89 6a d9 06 9f 9b ....B......j.... 00:20:33.903 000000d0 fa d5 68 ef f8 7c 3d 0e 89 bc e8 1f b9 46 e6 d3 ..h..|=......F.. 00:20:33.903 000000e0 40 a5 7a 92 7f d1 3d 5f bc 39 70 d6 0c 70 d7 b1 @.z...=_.9p..p.. 00:20:33.903 000000f0 0e c8 d8 85 cd 10 51 40 d2 d7 55 c8 75 22 90 4b ......Q@..U.u".K 00:20:33.903 [2024-09-27 15:25:15.423258] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=2, dhgroup=1, seq=3428451752, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.903 [2024-09-27 15:25:15.423362] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.903 [2024-09-27 15:25:15.432369] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.904 [2024-09-27 15:25:15.432436] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.904 [2024-09-27 15:25:15.432447] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.904 [2024-09-27 15:25:15.432484] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.904 [2024-09-27 15:25:15.588931] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.904 [2024-09-27 15:25:15.588951] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.904 [2024-09-27 15:25:15.588958] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.904 [2024-09-27 15:25:15.589005] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.904 [2024-09-27 15:25:15.589029] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.904 ctrlr pubkey: 00:20:33.904 00000000 02 82 0a 30 2a 36 d7 d2 64 d8 b8 96 e2 4d e2 80 ...0*6..d....M.. 00:20:33.904 00000010 2e d8 28 48 3a 90 b7 b4 ce 68 17 4a 7e 7b 55 15 ..(H:....h.J~{U. 00:20:33.904 00000020 91 d4 01 f7 61 e3 a6 42 9b 4e 61 34 86 1f 83 09 ....a..B.Na4.... 00:20:33.904 00000030 ac 8a b0 a3 e4 e4 1e c4 44 fa da 62 ef 78 1d 7c ........D..b.x.| 00:20:33.904 00000040 a2 20 9d 77 65 b4 98 84 57 b3 86 bd 6d 71 36 2a . .we...W...mq6* 00:20:33.904 00000050 ee 05 13 3a 22 8e 85 a4 25 99 49 13 bd 89 24 3b ...:"...%.I...$; 00:20:33.904 00000060 33 2e 84 91 0a d5 28 a9 2f 21 ba 07 cb 91 d6 21 3.....(./!.....! 00:20:33.904 00000070 57 87 29 2b 20 63 4d d9 34 07 a4 2f 5f ed 80 a3 W.)+ cM.4../_... 00:20:33.904 00000080 be 98 db ac 5b 73 09 bc b1 ba bf 4d b2 9c 17 7f ....[s.....M.... 00:20:33.904 00000090 ed ce 6b 7d 99 80 7b 04 d7 0e f3 1e f2 3e b3 d8 ..k}..{......>.. 00:20:33.904 000000a0 5f 08 d1 1b e3 a4 70 9a e4 4c 64 67 08 01 af be _.....p..Ldg.... 00:20:33.904 000000b0 89 60 15 92 e2 13 5f 21 84 70 eb 74 e9 f9 da e5 .`...._!.p.t.... 
00:20:33.904 000000c0 7b e6 43 a6 58 b0 78 c9 9e 88 0a f4 dd 54 91 ff {.C.X.x......T.. 00:20:33.904 000000d0 2b 40 03 04 47 6a e7 38 1a e3 da 98 ba 6f 3e 9f +@..Gj.8.....o>. 00:20:33.904 000000e0 5c 69 22 df 1d 94 22 ee b7 0e a6 16 0e 2d 50 78 \i"..."......-Px 00:20:33.904 000000f0 37 9b e5 39 7c fb fa 4d 4c a4 12 7c a8 33 89 28 7..9|..ML..|.3.( 00:20:33.904 host pubkey: 00:20:33.904 00000000 ef 21 f0 53 77 3f 50 4c c2 e7 94 8d 35 bd e4 bb .!.Sw?PL....5... 00:20:33.904 00000010 42 b9 17 5a ce eb b8 a6 99 b5 7b 8b f8 03 40 d5 B..Z......{...@. 00:20:33.904 00000020 ee d8 10 07 92 59 4c b2 b8 2b c1 cf c5 fc 02 85 .....YL..+...... 00:20:33.904 00000030 e8 27 99 c9 11 d9 bc e1 d7 b8 1a fe bf 20 72 23 .'........... r# 00:20:33.904 00000040 47 e1 3c 26 4e 2f ef ec b9 86 0c 59 d2 7d e1 ea G.<&N/.....Y.}.. 00:20:33.904 00000050 79 9c 1f 91 e3 40 73 7e fc 1e 2f c2 29 bf 01 72 y....@s~../.)..r 00:20:33.904 00000060 23 4e d9 78 33 db dc 53 4f 84 30 71 4b 7d 7d 7e #N.x3..SO.0qK}}~ 00:20:33.904 00000070 19 53 d3 b6 47 92 bc 0e d9 af b2 98 2e c0 01 43 .S..G..........C 00:20:33.904 00000080 19 de 79 d9 ad 70 55 fc d2 40 24 28 da 5a 40 57 ..y..pU..@$(.Z@W 00:20:33.904 00000090 05 bd bf 17 84 0d bb e4 c1 72 79 b6 0b 2b b3 11 .........ry..+.. 00:20:33.904 000000a0 3a 22 f4 8d 59 d3 33 7e ff eb 33 f6 bf 0d eb 33 :"..Y.3~..3....3 00:20:33.904 000000b0 9c e0 05 d7 7a b0 42 b0 05 e9 dd 35 5b 6a 82 b7 ....z.B....5[j.. 00:20:33.904 000000c0 75 ef 05 31 1e 70 53 6e 4c 79 da 17 2e e4 ba 09 u..1.pSnLy...... 00:20:33.904 000000d0 c2 13 e8 50 d1 09 bb be bd 06 6c 0b 87 0b 43 f8 ...P......l...C. 00:20:33.904 000000e0 a5 05 81 db 20 f9 e5 45 49 a8 07 a5 fc 59 07 ca .... ..EI....Y.. 00:20:33.904 000000f0 f6 49 b6 06 51 f7 66 99 16 57 32 2b 03 47 8f 8b .I..Q.f..W2+.G.. 00:20:33.904 dh secret: 00:20:33.904 00000000 0c c6 2e 67 3a 5b 5f b8 95 c0 af 28 1f 20 97 6a ...g:[_....(. .j 00:20:33.904 00000010 eb 0b a5 08 94 79 28 6c 3a bd d7 5b 19 7c f2 fb .....y(l:..[.|.. 00:20:33.904 00000020 7b 06 d6 e1 cf 30 5a 9f 09 8e 6b 53 69 f9 97 6d {....0Z...kSi..m 00:20:33.904 00000030 df 60 39 76 27 7d d0 3c 4e 59 69 f1 a3 67 b8 ed .`9v'}. 00:20:33.904 000000e0 84 09 d7 fb f6 06 75 7a 03 b5 d4 5e cd 14 95 7c ......uz...^...| 00:20:33.904 000000f0 ec 7b 29 f6 09 ce 12 51 96 85 47 7a 3a b9 18 13 .{)....Q..Gz:... 
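Each authentication attempt in this log walks the same nvme_auth_set_state sequence: negotiate, await-negotiate, await-challenge, await-reply, await-success1, then either await-success2 or directly done. The sketch below is only an illustrative model of that progression (the class, method names and the bidirectional flag are assumptions, not SPDK's API); in these traces the await-success2 step appears when the controller is verified in the reverse direction as well, and is skipped otherwise.

    from enum import Enum, auto

    class AuthState(Enum):
        NEGOTIATE = auto()
        AWAIT_NEGOTIATE = auto()
        AWAIT_CHALLENGE = auto()
        AWAIT_REPLY = auto()
        AWAIT_SUCCESS1 = auto()
        AWAIT_SUCCESS2 = auto()
        DONE = auto()

    def next_state(state: AuthState, bidirectional: bool) -> AuthState:
        # advance through the fixed order seen in the debug trace above
        if state is AuthState.DONE:
            return state
        order = [AuthState.NEGOTIATE, AuthState.AWAIT_NEGOTIATE,
                 AuthState.AWAIT_CHALLENGE, AuthState.AWAIT_REPLY,
                 AuthState.AWAIT_SUCCESS1, AuthState.AWAIT_SUCCESS2,
                 AuthState.DONE]
        nxt = order[order.index(state) + 1]
        # without mutual (bidirectional) authentication there is no success2 step
        if nxt is AuthState.AWAIT_SUCCESS2 and not bidirectional:
            nxt = AuthState.DONE
        return nxt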
00:20:33.904 [2024-09-27 15:25:15.591731] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=2, dhgroup=1, seq=3428451753, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.904 [2024-09-27 15:25:15.594580] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.904 [2024-09-27 15:25:15.594607] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.904 [2024-09-27 15:25:15.594624] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.904 [2024-09-27 15:25:15.594630] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.904 [2024-09-27 15:25:15.700970] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.904 [2024-09-27 15:25:15.700988] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.904 [2024-09-27 15:25:15.700996] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.904 [2024-09-27 15:25:15.701005] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.904 [2024-09-27 15:25:15.701060] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.904 ctrlr pubkey: 00:20:33.904 00000000 02 82 0a 30 2a 36 d7 d2 64 d8 b8 96 e2 4d e2 80 ...0*6..d....M.. 00:20:33.904 00000010 2e d8 28 48 3a 90 b7 b4 ce 68 17 4a 7e 7b 55 15 ..(H:....h.J~{U. 00:20:33.904 00000020 91 d4 01 f7 61 e3 a6 42 9b 4e 61 34 86 1f 83 09 ....a..B.Na4.... 00:20:33.904 00000030 ac 8a b0 a3 e4 e4 1e c4 44 fa da 62 ef 78 1d 7c ........D..b.x.| 00:20:33.904 00000040 a2 20 9d 77 65 b4 98 84 57 b3 86 bd 6d 71 36 2a . .we...W...mq6* 00:20:33.904 00000050 ee 05 13 3a 22 8e 85 a4 25 99 49 13 bd 89 24 3b ...:"...%.I...$; 00:20:33.904 00000060 33 2e 84 91 0a d5 28 a9 2f 21 ba 07 cb 91 d6 21 3.....(./!.....! 00:20:33.904 00000070 57 87 29 2b 20 63 4d d9 34 07 a4 2f 5f ed 80 a3 W.)+ cM.4../_... 00:20:33.904 00000080 be 98 db ac 5b 73 09 bc b1 ba bf 4d b2 9c 17 7f ....[s.....M.... 00:20:33.904 00000090 ed ce 6b 7d 99 80 7b 04 d7 0e f3 1e f2 3e b3 d8 ..k}..{......>.. 00:20:33.904 000000a0 5f 08 d1 1b e3 a4 70 9a e4 4c 64 67 08 01 af be _.....p..Ldg.... 00:20:33.904 000000b0 89 60 15 92 e2 13 5f 21 84 70 eb 74 e9 f9 da e5 .`...._!.p.t.... 00:20:33.904 000000c0 7b e6 43 a6 58 b0 78 c9 9e 88 0a f4 dd 54 91 ff {.C.X.x......T.. 00:20:33.904 000000d0 2b 40 03 04 47 6a e7 38 1a e3 da 98 ba 6f 3e 9f +@..Gj.8.....o>. 00:20:33.904 000000e0 5c 69 22 df 1d 94 22 ee b7 0e a6 16 0e 2d 50 78 \i"..."......-Px 00:20:33.904 000000f0 37 9b e5 39 7c fb fa 4d 4c a4 12 7c a8 33 89 28 7..9|..ML..|.3.( 00:20:33.904 host pubkey: 00:20:33.904 00000000 e5 54 ce a1 55 25 10 37 74 4d 8f d2 e7 a0 70 c7 .T..U%.7tM....p. 00:20:33.904 00000010 e1 6f df 10 4b de ea 8b b2 0e c6 1b f9 b4 55 34 .o..K.........U4 00:20:33.904 00000020 db 80 bd 9e e7 8f 6c 35 7e f3 1c 68 19 f2 e5 22 ......l5~..h..." 
00:20:33.904 00000030 6c 9f 2a 1d ea 32 ec 8f 58 55 0a 86 5f df 2e 42 l.*..2..XU.._..B 00:20:33.904 00000040 9b 60 5a ba c3 02 66 f8 2f 72 5e a8 a7 28 4b 7d .`Z...f./r^..(K} 00:20:33.904 00000050 e8 d0 5f 97 46 43 12 14 df 17 a9 b1 09 46 d6 c2 .._.FC.......F.. 00:20:33.904 00000060 38 ea e5 48 2d 4c d8 3f 21 42 80 2f 73 e0 8f fe 8..H-L.?!B./s... 00:20:33.904 00000070 9f 53 25 22 4d 5f b6 12 0f 81 b6 2b 05 d7 c5 18 .S%"M_.....+.... 00:20:33.904 00000080 5c 2b a5 74 eb ac be ca ba ff d2 b6 81 2a 2a 00 \+.t.........**. 00:20:33.904 00000090 81 c5 71 e5 29 04 a9 20 07 e2 3d 17 54 6a 9b d1 ..q.).. ..=.Tj.. 00:20:33.904 000000a0 da 80 16 bd db 69 b7 3e f5 11 58 38 43 e9 02 a1 .....i.>..X8C... 00:20:33.904 000000b0 d5 0a 45 40 49 89 f0 aa cc d3 2b 63 f9 9d 16 10 ..E@I.....+c.... 00:20:33.904 000000c0 52 d6 7f 40 29 48 2e e5 a1 09 81 e2 6b f9 5d a2 R..@)H......k.]. 00:20:33.904 000000d0 65 3a 7a 84 bd a0 51 f5 84 57 75 d7 65 e8 86 e1 e:z...Q..Wu.e... 00:20:33.905 000000e0 94 bc 9f 0d f8 96 68 54 bc 49 05 3a 65 7a a2 cd ......hT.I.:ez.. 00:20:33.905 000000f0 10 d3 e8 1a 7a 5f 9a 6d 1f 0a 36 cc db 89 a6 45 ....z_.m..6....E 00:20:33.905 dh secret: 00:20:33.905 00000000 12 83 07 c0 d8 a3 d0 01 a8 bb e6 fa e2 cd 6f 53 ..............oS 00:20:33.905 00000010 e7 60 3b cf fd 29 b8 ab 85 1e c5 f5 f5 49 e2 33 .`;..).......I.3 00:20:33.905 00000020 b3 44 ce 4c 0c fc 33 39 47 77 1b 07 e4 79 b4 04 .D.L..39Gw...y.. 00:20:33.905 00000030 b7 c7 08 ad 96 88 b8 f7 d3 c1 d6 7e c6 00 ef ee ...........~.... 00:20:33.905 00000040 a1 a0 f2 f8 d2 7d 85 6a 52 86 06 22 83 0a 0c 04 .....}.jR..".... 00:20:33.905 00000050 7c c1 f8 8a 4e 33 7f 97 80 05 bd b1 56 ef 8d 02 |...N3......V... 00:20:33.905 00000060 be 0d 73 34 4f f4 5a d5 03 01 46 6b 86 09 43 5e ..s4O.Z...Fk..C^ 00:20:33.905 00000070 58 bf d4 a0 d9 7d b2 d9 11 2f 6e ab 0c 24 20 eb X....}.../n..$ . 00:20:33.905 00000080 98 1e 94 61 d1 36 c4 aa b2 2e f8 3a 39 d4 56 4c ...a.6.....:9.VL 00:20:33.905 00000090 91 f1 6d 84 60 21 90 5a cc c7 bd 16 e2 e7 e5 e1 ..m.`!.Z........ 00:20:33.905 000000a0 6c c3 2b fd 30 fb 9b 64 f9 b1 c4 8b 96 a4 b5 d0 l.+.0..d........ 00:20:33.905 000000b0 92 9f f6 a7 fe 5c 7f 4f d6 ff 27 ff 61 90 39 9a .....\.O..'.a.9. 00:20:33.905 000000c0 c4 d2 5f 7b 39 3f 2b 28 13 31 34 91 cc be e6 05 .._{9?+(.14..... 00:20:33.905 000000d0 30 1f 3a cb a5 10 32 62 a1 a2 7e cd 04 bc a3 c9 0.:...2b..~..... 00:20:33.905 000000e0 7e f2 bc 12 56 c8 4c 6b a7 a3 e6 95 4a 49 6b eb ~...V.Lk....JIk. 
00:20:33.905 000000f0 f7 60 e9 d3 43 f5 7f cf 5d 28 b1 ad f0 37 d3 64 .`..C...](...7.d 00:20:33.905 [2024-09-27 15:25:15.703686] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=2, dhgroup=1, seq=3428451754, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.905 [2024-09-27 15:25:15.703746] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.905 [2024-09-27 15:25:15.712999] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.905 [2024-09-27 15:25:15.713041] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.905 [2024-09-27 15:25:15.713048] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.905 [2024-09-27 15:25:15.870139] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.905 [2024-09-27 15:25:15.870159] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.905 [2024-09-27 15:25:15.870166] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.905 [2024-09-27 15:25:15.870210] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.905 [2024-09-27 15:25:15.870237] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.905 ctrlr pubkey: 00:20:33.905 00000000 6c 33 d8 8b 6a 93 aa 89 55 a1 69 b3 d3 de 20 c6 l3..j...U.i... . 00:20:33.905 00000010 b0 89 c6 2c 1d b0 9e a1 b4 89 5e 1c f6 f6 b6 28 ...,......^....( 00:20:33.905 00000020 40 95 2a 3f 3d 56 0a 96 f0 c0 28 48 73 8f c7 50 @.*?=V....(Hs..P 00:20:33.905 00000030 d3 b4 5f 9f fe 7f 55 2c 57 50 f4 86 44 ec 44 a6 .._...U,WP..D.D. 00:20:33.905 00000040 40 c4 1f b3 39 02 28 9e 0d c0 bc ce fc 17 b2 1d @...9.(......... 00:20:33.905 00000050 e6 65 8c 57 b0 81 f0 11 4f 49 b0 79 e1 2d 93 b7 .e.W....OI.y.-.. 00:20:33.905 00000060 57 88 7f 0c 2c 10 39 0c 03 e9 d5 cc df ad 22 27 W...,.9......."' 00:20:33.905 00000070 ae 29 2e ae a5 91 38 1b 42 d6 bc 56 60 33 5f a4 .)....8.B..V`3_. 00:20:33.905 00000080 05 73 ec a4 15 c7 3b 27 de 95 0a fe 7e 7e be fe .s....;'....~~.. 00:20:33.905 00000090 fc 4b fd 3a 17 d2 cd 53 75 14 83 24 f6 ce bc c8 .K.:...Su..$.... 00:20:33.905 000000a0 d9 8b 53 db fc fe 69 34 69 08 c2 f3 1f d3 f2 92 ..S...i4i....... 00:20:33.905 000000b0 1a 0a e0 1b 83 e1 e4 4e 1d cf cf a3 0f 2d 3b 5b .......N.....-;[ 00:20:33.905 000000c0 f5 4e 5f d8 cf d6 71 7c f2 d1 9f 05 dc 10 91 23 .N_...q|.......# 00:20:33.905 000000d0 37 61 67 6f af 0e a4 e0 35 7c 76 f5 11 8c e7 63 7ago....5|v....c 00:20:33.905 000000e0 bf 55 08 2d f2 0e 65 bd 0d 09 d5 63 f5 a6 2a 9f .U.-..e....c..*. 00:20:33.905 000000f0 d0 df ba 1c fe 64 1f 88 86 1f fe de e6 e0 40 e4 .....d........@. 00:20:33.905 00000100 b8 4f 24 9c e3 1d 7a b5 58 e1 96 44 a8 56 8a 7f .O$...z.X..D.V.. 00:20:33.905 00000110 ed 19 e7 ef 21 63 b5 4f 32 7e 8a 7c 04 13 99 af ....!c.O2~.|.... 
00:20:33.905 00000120 33 0d 42 e1 5e 78 a0 fb de 41 64 40 7c 2a 9b f1 3.B.^x...Ad@|*.. 00:20:33.905 00000130 0f 8b aa 6a 2b dc 3a 45 9f 71 e4 64 0a 5d 6a 75 ...j+.:E.q.d.]ju 00:20:33.905 00000140 e8 33 ac fd 59 07 32 6a 6b d4 69 78 d0 86 d4 3e .3..Y.2jk.ix...> 00:20:33.905 00000150 92 ef 52 da 23 68 1f 97 71 3b c7 d4 69 0b 0d 4d ..R.#h..q;..i..M 00:20:33.905 00000160 f6 0e 9c 68 3b 76 2b 45 69 fd 0d f1 f2 8c 0f 6c ...h;v+Ei......l 00:20:33.905 00000170 08 85 88 ae e5 97 84 2f fa 08 9c 1d a2 59 2e 2b ......./.....Y.+ 00:20:33.905 host pubkey: 00:20:33.905 00000000 1a 72 b2 54 ee ee 94 7a e9 c7 f5 d0 89 65 da b6 .r.T...z.....e.. 00:20:33.905 00000010 56 8f ba b6 d5 f7 aa d6 41 0a 10 45 40 c7 e5 53 V.......A..E@..S 00:20:33.905 00000020 15 8e ab 94 e8 7a ab 80 83 46 12 c5 5f 7e 03 c2 .....z...F.._~.. 00:20:33.905 00000030 9a d8 38 07 a8 eb cb fc 96 ea a3 b5 c9 ff f0 70 ..8............p 00:20:33.905 00000040 2a d0 5f bc 0a e3 4c 11 15 62 64 50 fe ac a1 ae *._...L..bdP.... 00:20:33.905 00000050 fe 47 36 4a 15 86 07 a7 6a d3 a3 b7 c4 49 15 07 .G6J....j....I.. 00:20:33.905 00000060 ff 28 4b 1e 6e 89 18 2d 5b e3 74 d7 e2 64 92 ed .(K.n..-[.t..d.. 00:20:33.905 00000070 72 71 c0 fb 18 04 3b b5 00 c9 11 1a 0b e8 f2 51 rq....;........Q 00:20:33.905 00000080 f0 a9 c9 5d 37 c5 f0 63 57 02 ac 89 96 10 42 81 ...]7..cW.....B. 00:20:33.905 00000090 0d df c2 10 9a e5 be ab 7d 1c 7d 1a 78 26 c9 13 ........}.}.x&.. 00:20:33.905 000000a0 af a2 aa 67 e2 ab 67 ec 3d b2 c2 8a 2f f9 89 be ...g..g.=.../... 00:20:33.905 000000b0 f5 b7 4d 7a 61 a4 9e 2f bc b0 a4 1e 23 c4 08 84 ..Mza../....#... 00:20:33.905 000000c0 2a 5e 59 24 c1 e0 f0 b7 2c 2f 8b 8a 44 5b ca 98 *^Y$....,/..D[.. 00:20:33.905 000000d0 e1 b4 b1 47 36 2a 9c 0f 4d 2f 48 9c 8b 4d dc ad ...G6*..M/H..M.. 00:20:33.905 000000e0 11 d9 2c 47 a3 da f2 5a 84 9a b8 77 22 c7 6d d6 ..,G...Z...w".m. 00:20:33.905 000000f0 01 ac 2d 68 ff 26 1c 22 a8 00 27 35 2c a7 53 f1 ..-h.&."..'5,.S. 00:20:33.905 00000100 70 97 5c 69 4b 25 7f 5f 94 18 02 b6 c5 0a f4 98 p.\iK%._........ 00:20:33.905 00000110 2f f4 ba cb 66 0a 72 e1 da b5 88 b3 62 43 55 ee /...f.r.....bCU. 00:20:33.905 00000120 e2 2d a1 cd e9 6d ba 05 41 1e 92 6c 02 37 c0 d8 .-...m..A..l.7.. 00:20:33.905 00000130 37 1a 19 31 37 f5 10 d7 75 ee eb 60 de 0f 1b fd 7..17...u..`.... 00:20:33.905 00000140 c2 a8 17 1c 52 20 ef ad 04 1d a5 5e 41 25 6b ed ....R .....^A%k. 00:20:33.905 00000150 9e 70 c9 15 3e a0 42 a3 f5 44 c7 09 13 ea 75 11 .p..>.B..D....u. 00:20:33.905 00000160 12 31 d3 12 81 50 67 99 3b 5b 49 81 8a a8 ae e3 .1...Pg.;[I..... 00:20:33.905 00000170 04 47 04 fa d0 d7 85 75 fd b0 91 f8 ae 5e 33 4d .G.....u.....^3M 00:20:33.905 dh secret: 00:20:33.905 00000000 b2 27 72 12 92 73 cc f4 36 c8 26 af 1e 95 69 68 .'r..s..6.&...ih 00:20:33.905 00000010 f5 b8 57 69 fc 25 f9 bd 36 3e bd f9 70 de b6 c1 ..Wi.%..6>..p... 00:20:33.905 00000020 12 43 43 be cc 98 ac a7 16 b1 f9 02 ad c6 5f 54 .CC..........._T 00:20:33.905 00000030 0b 4e 70 0f 30 82 b5 c2 73 b8 a2 f4 2f 9e 44 37 .Np.0...s.../.D7 00:20:33.905 00000040 4f d8 44 ab 32 c8 90 c3 4c 93 64 ea 99 a6 16 93 O.D.2...L.d..... 00:20:33.905 00000050 a3 9e f6 24 2a 69 77 26 8b fa 0e 8e c6 42 8c 32 ...$*iw&.....B.2 00:20:33.905 00000060 25 31 11 f4 1d cd 55 49 b0 fd 74 b0 4e 8a 5f 12 %1....UI..t.N._. 00:20:33.905 00000070 16 f5 d2 d6 52 67 00 fd 5b c7 89 75 4a fe 69 38 ....Rg..[..uJ.i8 00:20:33.905 00000080 ef 54 1e da c3 83 c5 17 98 40 fe 1e 87 b7 17 44 .T.......@.....D 00:20:33.905 00000090 f8 85 48 88 97 a5 d7 0d b5 44 8f 10 3b 48 a0 0d ..H......D..;H.. 
00:20:33.905 000000a0 5b 35 d8 c3 48 aa 60 2c 5f 67 df 73 9a 50 df 0e [5..H.`,_g.s.P.. 00:20:33.905 000000b0 48 4e ed 50 54 26 dd 35 0e 21 4d b9 89 b7 56 91 HN.PT&.5.!M...V. 00:20:33.905 000000c0 a1 9d 03 af b4 b6 6b 81 e9 37 7f da 76 c8 d0 66 ......k..7..v..f 00:20:33.905 000000d0 d5 c3 38 db 77 30 c3 fa 95 dc 2b 8a 74 c4 db 69 ..8.w0....+.t..i 00:20:33.905 000000e0 76 a0 c8 4d 63 75 a9 78 0d bf 06 d8 2d 7e eb f4 v..Mcu.x....-~.. 00:20:33.905 000000f0 13 88 e6 1f 6e 9a 44 68 c6 92 2d f4 91 3b 16 66 ....n.Dh..-..;.f 00:20:33.905 00000100 72 b2 b0 d7 79 57 23 4d 58 2b 61 a5 60 18 cf 23 r...yW#MX+a.`..# 00:20:33.905 00000110 7d 65 ba 5f 70 f7 54 4f 1b 6c 16 fb c1 80 51 a0 }e._p.TO.l....Q. 00:20:33.905 00000120 73 2c fc 25 7a 6b 42 7d 09 72 a9 07 9c 76 58 b2 s,.%zkB}.r...vX. 00:20:33.905 00000130 96 be 41 2a 68 22 3f fe 7a e1 f1 1c 25 35 d0 80 ..A*h"?.z...%5.. 00:20:33.905 00000140 e0 a1 2a 12 79 21 27 b4 00 21 3d e1 d4 6c ee 8c ..*.y!'..!=..l.. 00:20:33.905 00000150 7d 44 f4 82 58 80 06 74 36 7b 09 25 47 48 d8 96 }D..X..t6{.%GH.. 00:20:33.905 00000160 c4 ea ce 7c f9 06 80 00 7c df 60 ac fa 18 c9 8c ...|....|.`..... 00:20:33.905 00000170 fb dd 53 bd 4b e2 45 26 29 b8 f2 be d0 63 44 ac ..S.K.E&)....cD. 00:20:33.905 [2024-09-27 15:25:15.877606] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=2, dhgroup=2, seq=3428451755, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.905 [2024-09-27 15:25:15.883198] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.905 [2024-09-27 15:25:15.883239] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.905 [2024-09-27 15:25:15.883256] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.905 [2024-09-27 15:25:15.883277] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.905 [2024-09-27 15:25:15.883292] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.905 [2024-09-27 15:25:15.993639] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.905 [2024-09-27 15:25:15.993687] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.905 [2024-09-27 15:25:15.993710] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.905 [2024-09-27 15:25:15.993743] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.905 [2024-09-27 15:25:15.993815] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.905 ctrlr pubkey: 00:20:33.906 00000000 6c 33 d8 8b 6a 93 aa 89 55 a1 69 b3 d3 de 20 c6 l3..j...U.i... . 
00:20:33.906 00000010 b0 89 c6 2c 1d b0 9e a1 b4 89 5e 1c f6 f6 b6 28 ...,......^....( 00:20:33.906 00000020 40 95 2a 3f 3d 56 0a 96 f0 c0 28 48 73 8f c7 50 @.*?=V....(Hs..P 00:20:33.906 00000030 d3 b4 5f 9f fe 7f 55 2c 57 50 f4 86 44 ec 44 a6 .._...U,WP..D.D. 00:20:33.906 00000040 40 c4 1f b3 39 02 28 9e 0d c0 bc ce fc 17 b2 1d @...9.(......... 00:20:33.906 00000050 e6 65 8c 57 b0 81 f0 11 4f 49 b0 79 e1 2d 93 b7 .e.W....OI.y.-.. 00:20:33.906 00000060 57 88 7f 0c 2c 10 39 0c 03 e9 d5 cc df ad 22 27 W...,.9......."' 00:20:33.906 00000070 ae 29 2e ae a5 91 38 1b 42 d6 bc 56 60 33 5f a4 .)....8.B..V`3_. 00:20:33.906 00000080 05 73 ec a4 15 c7 3b 27 de 95 0a fe 7e 7e be fe .s....;'....~~.. 00:20:33.906 00000090 fc 4b fd 3a 17 d2 cd 53 75 14 83 24 f6 ce bc c8 .K.:...Su..$.... 00:20:33.906 000000a0 d9 8b 53 db fc fe 69 34 69 08 c2 f3 1f d3 f2 92 ..S...i4i....... 00:20:33.906 000000b0 1a 0a e0 1b 83 e1 e4 4e 1d cf cf a3 0f 2d 3b 5b .......N.....-;[ 00:20:33.906 000000c0 f5 4e 5f d8 cf d6 71 7c f2 d1 9f 05 dc 10 91 23 .N_...q|.......# 00:20:33.906 000000d0 37 61 67 6f af 0e a4 e0 35 7c 76 f5 11 8c e7 63 7ago....5|v....c 00:20:33.906 000000e0 bf 55 08 2d f2 0e 65 bd 0d 09 d5 63 f5 a6 2a 9f .U.-..e....c..*. 00:20:33.906 000000f0 d0 df ba 1c fe 64 1f 88 86 1f fe de e6 e0 40 e4 .....d........@. 00:20:33.906 00000100 b8 4f 24 9c e3 1d 7a b5 58 e1 96 44 a8 56 8a 7f .O$...z.X..D.V.. 00:20:33.906 00000110 ed 19 e7 ef 21 63 b5 4f 32 7e 8a 7c 04 13 99 af ....!c.O2~.|.... 00:20:33.906 00000120 33 0d 42 e1 5e 78 a0 fb de 41 64 40 7c 2a 9b f1 3.B.^x...Ad@|*.. 00:20:33.906 00000130 0f 8b aa 6a 2b dc 3a 45 9f 71 e4 64 0a 5d 6a 75 ...j+.:E.q.d.]ju 00:20:33.906 00000140 e8 33 ac fd 59 07 32 6a 6b d4 69 78 d0 86 d4 3e .3..Y.2jk.ix...> 00:20:33.906 00000150 92 ef 52 da 23 68 1f 97 71 3b c7 d4 69 0b 0d 4d ..R.#h..q;..i..M 00:20:33.906 00000160 f6 0e 9c 68 3b 76 2b 45 69 fd 0d f1 f2 8c 0f 6c ...h;v+Ei......l 00:20:33.906 00000170 08 85 88 ae e5 97 84 2f fa 08 9c 1d a2 59 2e 2b ......./.....Y.+ 00:20:33.906 host pubkey: 00:20:33.906 00000000 e5 1b d9 16 de 8a d0 54 86 04 78 7c 93 43 bc 9e .......T..x|.C.. 00:20:33.906 00000010 9b e7 b8 fb 38 e4 98 9b 31 c6 cb dc a2 ba 90 56 ....8...1......V 00:20:33.906 00000020 ed 31 55 a6 db d6 87 ff 23 28 3b ce 4e ec 37 3d .1U.....#(;.N.7= 00:20:33.906 00000030 21 b7 db b5 f1 d1 e1 d6 62 9d 6f 7e c8 ed 10 bc !.......b.o~.... 00:20:33.906 00000040 7c a9 8f 43 09 42 ff e9 81 4c cf d1 a6 ce 73 6c |..C.B...L....sl 00:20:33.906 00000050 63 61 6d 7c 20 d2 e8 a1 ac 96 e8 7f 24 59 da bb cam| .......$Y.. 00:20:33.906 00000060 dc 07 b7 e5 5e 51 46 80 ed 88 57 b7 5f 1c 37 92 ....^QF...W._.7. 00:20:33.906 00000070 39 80 f1 21 ed fd c3 2f 33 ee 79 e3 e6 b4 fa 06 9..!.../3.y..... 00:20:33.906 00000080 47 45 e7 bd 3b 9c a3 54 16 1f c7 42 5c 99 69 cb GE..;..T...B\.i. 00:20:33.906 00000090 0f 62 f1 f5 90 8b ea 8f c7 9b 55 eb 0d 44 b4 c3 .b........U..D.. 00:20:33.906 000000a0 28 5e d2 cc 65 f9 bf 41 b5 bb 6f fc 56 91 dc 43 (^..e..A..o.V..C 00:20:33.906 000000b0 71 9d 87 74 7e d2 df 54 45 67 3e 46 c1 a6 df d1 q..t~..TEg>F.... 00:20:33.906 000000c0 d3 7a e7 04 a4 1d 50 d8 09 15 d7 5a 9a 6a 5d cd .z....P....Z.j]. 00:20:33.906 000000d0 76 c5 79 06 7b c7 50 2d 37 a6 db 92 cb 20 e8 b7 v.y.{.P-7.... .. 00:20:33.906 000000e0 a1 9d 98 8b 0e 8b 28 7d 52 c7 d8 c9 04 27 ef 77 ......(}R....'.w 00:20:33.906 000000f0 36 1c 27 d0 b6 89 84 0b cb e8 f0 87 f2 61 8f c2 6.'..........a.. 00:20:33.906 00000100 66 0b 4f 4c bd 7f 22 2a 14 ae 92 d4 fb 90 aa b0 f.OL.."*........ 
00:20:33.906 00000110 0b b9 9b 2f d3 80 9c 32 76 d9 c4 81 1f bb ef 95 .../...2v....... 00:20:33.906 00000120 e5 5d 2d 98 ae 8f fc 32 7c 30 7b 07 84 37 1c 3c .]-....2|0{..7.< 00:20:33.906 00000130 8f 75 7c d7 fd 59 75 68 14 c3 11 b8 e9 8f 2c 2f .u|..Yuh......,/ 00:20:33.906 00000140 32 31 5d 2e 90 40 13 0c 6d bc d8 7c 14 d3 2b a0 21]..@..m..|..+. 00:20:33.906 00000150 76 24 a9 92 93 75 a2 40 0e 30 29 25 c0 19 08 c6 v$...u.@.0)%.... 00:20:33.906 00000160 0f 0a e6 bd 84 de 16 74 05 cd 23 1b 68 69 9f a7 .......t..#.hi.. 00:20:33.906 00000170 9b 72 f0 4b 1f c9 90 14 6c dd dd 2c 8a 1a 42 cd .r.K....l..,..B. 00:20:33.906 dh secret: 00:20:33.906 00000000 61 41 68 d1 39 83 b1 65 7c 0d f6 52 e9 25 01 fb aAh.9..e|..R.%.. 00:20:33.906 00000010 0e 8c bd 2e 89 fb 46 c3 b2 90 10 f7 dd 03 30 10 ......F.......0. 00:20:33.906 00000020 d2 a5 92 66 09 d6 2f 22 9e 11 66 94 cb 56 0d 11 ...f../"..f..V.. 00:20:33.906 00000030 75 21 f2 f7 60 50 fc 97 e0 ce 75 5f c6 55 30 b4 u!..`P....u_.U0. 00:20:33.906 00000040 e6 33 2b 99 64 84 dd 87 c5 2d 3f db 12 56 d9 62 .3+.d....-?..V.b 00:20:33.906 00000050 b9 e0 3a 2e 4b b0 01 b4 fe 08 65 99 00 8f ed 6c ..:.K.....e....l 00:20:33.906 00000060 6a 02 c9 0f 62 2a 3b 3c 4d 34 f9 a8 c7 9a 7d dc j...b*;.].R0... 00:20:33.906 00000080 44 61 eb 43 c7 4c 23 7f 4b 21 9e 98 b2 07 38 ac Da.C.L#.K!....8. 00:20:33.906 00000090 72 32 2e bb b7 f5 60 35 37 75 71 fc 4c af 06 0b r2....`57uq.L... 00:20:33.906 000000a0 19 e2 76 57 bb f9 02 e9 95 e6 0e e1 3c 5c b7 3a ..vW........<\.: 00:20:33.906 000000b0 af c3 c1 3d 2f cc 6a bb 55 19 2c 01 31 9f 6d 6c ...=/.j.U.,.1.ml 00:20:33.906 000000c0 f2 0b 08 17 36 4c f5 4a d4 cd f9 ea 90 df 35 63 ....6L.J......5c 00:20:33.906 000000d0 a2 16 cd ea 0e 1f 80 18 57 92 6c 6f 37 ba 13 a3 ........W.lo7... 00:20:33.906 000000e0 64 22 ac 18 14 77 aa b5 f3 55 8e b4 fc 79 93 cd d"...w...U...y.. 00:20:33.906 000000f0 71 f3 ac c7 1c 4d 4a e2 fd 50 65 74 6d af 4f 9a q....MJ..Petm.O. 00:20:33.906 00000100 b1 6a f2 94 f5 43 c7 6d 11 53 18 ff bc f6 61 b6 .j...C.m.S....a. 00:20:33.906 00000110 6f 4b 24 cd 34 75 5d 3e 55 c7 3e ed cd 3f b2 e2 oK$.4u]>U.>..?.. 00:20:33.906 00000120 90 59 30 f8 88 16 42 40 76 3d 24 38 2f 3d be e8 .Y0...B@v=$8/=.. 00:20:33.906 00000130 e5 4f 5a 6b 5a d0 78 92 e6 25 8d 39 5e 5f 73 fe .OZkZ.x..%.9^_s. 00:20:33.906 00000140 7f 11 80 a2 b9 0e 94 90 15 18 41 d4 3a 31 27 ab ..........A.:1'. 00:20:33.906 00000150 fe 16 91 ad ed a6 76 f5 aa 8a 3f e5 27 80 db ad ......v...?.'... 00:20:33.906 00000160 93 9b f2 8e f4 a1 f1 a0 e3 9b 3a 15 af e8 e7 29 ..........:....) 
00:20:33.906 00000170 da dc bc 6c 3b f0 02 56 4c e9 e2 55 4e b4 b5 40 ...l;..VL..UN..@ 00:20:33.906 [2024-09-27 15:25:16.001173] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=2, dhgroup=2, seq=3428451756, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.906 [2024-09-27 15:25:16.001276] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.906 [2024-09-27 15:25:16.018470] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.906 [2024-09-27 15:25:16.018537] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.906 [2024-09-27 15:25:16.018547] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.906 [2024-09-27 15:25:16.018580] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.906 [2024-09-27 15:25:16.171692] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.906 [2024-09-27 15:25:16.171711] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.906 [2024-09-27 15:25:16.171718] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.906 [2024-09-27 15:25:16.171761] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.906 [2024-09-27 15:25:16.171783] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.906 ctrlr pubkey: 00:20:33.906 00000000 b0 61 97 57 07 5f 1e 5d 47 08 52 d0 44 85 40 24 .a.W._.]G.R.D.@$ 00:20:33.906 00000010 bf 0d 5c 20 f2 0d df a2 57 ce 2a 94 77 d0 f8 26 ..\ ....W.*.w..& 00:20:33.906 00000020 5b 69 7c 21 e2 4d 7f bf ca 93 a8 aa 44 aa ea f2 [i|!.M......D... 00:20:33.906 00000030 c3 94 41 ed 58 e7 87 55 77 c1 4e 99 c9 3c 2e ab ..A.X..Uw.N..<.. 00:20:33.906 00000040 cb bf cc 4c 4c 6a b3 1d 94 62 82 5a 28 fa c3 16 ...LLj...b.Z(... 00:20:33.906 00000050 0d 5b 8a b6 63 a9 f5 c0 38 07 43 bf 77 22 b8 2f .[..c...8.C.w"./ 00:20:33.906 00000060 cd d2 96 eb da 00 59 ae 6e 14 fb a7 e6 6a 70 d2 ......Y.n....jp. 00:20:33.906 00000070 2a 70 f0 06 71 5d 40 74 32 9c 54 8f e9 62 05 8c *p..q]@t2.T..b.. 00:20:33.906 00000080 e7 46 80 23 cd 49 7d 7e 40 93 3d 76 a5 39 67 a7 .F.#.I}~@.=v.9g. 00:20:33.906 00000090 a9 d9 eb a8 78 7f d1 13 0e 70 77 a0 46 46 5a af ....x....pw.FFZ. 00:20:33.906 000000a0 e3 a1 6b 7d bb 0e 68 ad b0 68 11 e7 48 68 c7 4f ..k}..h..h..Hh.O 00:20:33.906 000000b0 a1 e3 0e 29 72 9a f9 f7 e1 ea 08 e3 50 39 0c d9 ...)r.......P9.. 00:20:33.906 000000c0 f7 eb fb 57 df 21 e9 3d 9e 5c 49 65 28 5f da 47 ...W.!.=.\Ie(_.G 00:20:33.906 000000d0 83 53 8f 1e f1 d4 7a 7a 87 03 86 eb c7 52 48 f3 .S....zz.....RH. 00:20:33.906 000000e0 c3 f3 d6 b7 e6 47 22 08 ab 0a fb b0 04 3b 4f 66 .....G"......;Of 00:20:33.906 000000f0 bd 4a 67 f5 65 90 05 a6 83 15 3c 16 13 20 88 87 .Jg.e.....<.. .. 
00:20:33.906 00000100 a8 82 34 95 1b 49 04 66 bc 0e 14 e7 b5 50 f0 7a ..4..I.f.....P.z 00:20:33.906 00000110 5b e5 38 c7 5d ec bc 79 19 fc 5d ad cc b6 35 5e [.8.]..y..]...5^ 00:20:33.906 00000120 da a5 ba 87 09 d2 1f 30 a8 57 ba d6 4d 55 d1 92 .......0.W..MU.. 00:20:33.906 00000130 97 ae d5 82 56 63 25 db 41 00 9c a2 45 ef 7f 14 ....Vc%.A...E... 00:20:33.906 00000140 09 29 00 a3 11 7e ab 49 f7 d4 f7 c8 45 bc d6 00 .)...~.I....E... 00:20:33.906 00000150 b0 dc 74 72 94 c7 63 38 0b 7c 3c 5a cd eb fe 5c ..tr..c8.|t.......D# 00:20:33.906 00000170 e1 15 89 00 73 61 d1 ee 26 29 36 5f 75 46 c9 b3 ....sa..&)6_uF.. 00:20:33.906 host pubkey: 00:20:33.906 00000000 2f 2c 0f 77 5d a5 9d 93 2c 54 78 69 ec ee f8 ef /,.w]...,Txi.... 00:20:33.906 00000010 75 a2 99 7c 83 d5 3e 53 4f 2b 4a 8c 8d 68 3d 4f u..|..>SO+J..h=O 00:20:33.906 00000020 67 9b 09 d0 93 ba 75 db d0 67 68 c7 80 e0 2b 2d g.....u..gh...+- 00:20:33.906 00000030 8d ba fd ed b0 6c 7c 1b b9 8e 8f 91 be c0 23 be .....l|.......#. 00:20:33.906 00000040 69 a4 b1 29 1c 6b 76 aa d8 ea 04 8c 80 bb 98 04 i..).kv......... 00:20:33.906 00000050 5c 6f 49 de cb d2 5b 6b fc 35 4c a2 54 ac 9f 08 \oI...[k.5L.T... 00:20:33.906 00000060 84 f7 79 43 7f d6 f7 6f 70 02 07 c0 95 62 47 0c ..yC...op....bG. 00:20:33.906 00000070 4e ae 2f 9a 7d f7 b6 20 70 61 17 42 9f e8 2e b1 N./.}.. pa.B.... 00:20:33.906 00000080 46 5a 1a 68 c5 7d 49 1a 37 b0 5c 82 67 ce 94 32 FZ.h.}I.7.\.g..2 00:20:33.906 00000090 74 74 73 66 e5 b3 6a fb 0a 1e 7d 35 13 b7 33 5d ttsf..j...}5..3] 00:20:33.906 000000a0 5d e7 c2 55 96 ef b3 de 81 15 44 c0 df a6 20 9d ]..U......D... . 00:20:33.906 000000b0 7f 0e d2 5f fd e5 20 7f 44 f3 ba d3 3d b1 62 0c ..._.. .D...=.b. 00:20:33.906 000000c0 e6 39 6b f7 bf ed 5d 86 8d b1 54 f9 bf 99 dd 06 .9k...]...T..... 00:20:33.907 000000d0 22 24 b4 05 8e 24 8d a7 93 15 a2 a0 52 be 51 23 "$...$......R.Q# 00:20:33.907 000000e0 f5 45 22 91 56 81 06 c1 28 65 b2 2e 0a c0 99 5a .E".V...(e.....Z 00:20:33.907 000000f0 52 aa 27 e0 11 df 66 24 54 0c 1c 31 c9 40 fc e1 R.'...f$T..1.@.. 00:20:33.907 00000100 7b b0 7f a4 8e 9e b0 18 bd 31 af c1 05 6b 2f 41 {........1...k/A 00:20:33.907 00000110 cd 05 f2 e2 e7 2c b4 ad 7d 65 ca 68 c3 a2 bc d9 .....,..}e.h.... 00:20:33.907 00000120 3f c4 0f c0 88 12 fd 25 d9 d6 5a a6 3a 12 e8 26 ?......%..Z.:..& 00:20:33.907 00000130 02 e2 95 42 be 03 c1 10 b6 6f 1b 15 81 7b a8 b0 ...B.....o...{.. 00:20:33.907 00000140 5e 38 3a e4 e0 3d 67 ed 85 f1 d8 fa ab 11 45 78 ^8:..=g.......Ex 00:20:33.907 00000150 14 d9 4f 17 7c 79 e6 70 ff 5a 8f 00 ee 9f 2d c0 ..O.|y.p.Z....-. 00:20:33.907 00000160 80 8f 86 13 61 0a dc 8e 04 e7 7b 77 cc 27 56 e3 ....a.....{w.'V. 00:20:33.907 00000170 6a b1 aa ca 89 4e 2d cd 3b a8 3a b4 c5 25 ef 81 j....N-.;.:..%.. 00:20:33.907 dh secret: 00:20:33.907 00000000 f3 4b c9 8a ca fe 85 a9 b0 7e e8 ad df 10 5b 42 .K.......~....[B 00:20:33.907 00000010 0a c2 02 b8 65 f6 4c 56 45 54 06 d4 f6 f4 e5 60 ....e.LVET.....` 00:20:33.907 00000020 7a df 47 7b fe 66 e7 00 04 0e 0e 0d 50 34 ef c2 z.G{.f......P4.. 00:20:33.907 00000030 3a 34 2d 66 79 db b7 8e a9 c3 c1 57 cd 2b a0 b7 :4-fy......W.+.. 00:20:33.907 00000040 77 f3 58 92 d8 7f 56 58 ad 88 ce 37 5f d7 f5 8f w.X...VX...7_... 00:20:33.907 00000050 05 e3 7a 0e 67 b2 2e 47 fe 45 b2 ba 41 da 80 5c ..z.g..G.E..A..\ 00:20:33.907 00000060 a8 7c 66 3b 55 35 9d f9 0e c7 bb 1d 9c 49 20 72 .|f;U5.......I r 00:20:33.907 00000070 41 49 96 90 9f 37 1e 24 1f c1 04 2f 43 5f bc 83 AI...7.$.../C_.. 00:20:33.907 00000080 c3 5f c6 56 de 3a aa e9 45 3c 8f 79 fa 72 fc f2 ._.V.:..E<.y.r.. 
00:20:33.907 00000090 90 10 25 65 57 bb a2 d0 00 db 39 ef fd f4 2c 9a ..%eW.....9...,. 00:20:33.907 000000a0 8c 7a a4 32 e6 cc 7d ce b1 54 11 8c aa 8c a8 24 .z.2..}..T.....$ 00:20:33.907 000000b0 7b 0f 0a b8 4b d1 49 72 9b 38 af c6 31 03 89 0a {...K.Ir.8..1... 00:20:33.907 000000c0 4c b2 18 43 5f ae 1b 6d 27 67 e6 6b 48 d4 2a ec L..C_..m'g.kH.*. 00:20:33.907 000000d0 36 20 cd 37 26 06 43 b3 2c 27 c8 30 5f 3e 45 fd 6 .7&.C.,'.0_>E. 00:20:33.907 000000e0 01 d2 a8 ff e6 ae 8a 55 79 5d 42 de fc ec b1 41 .......Uy]B....A 00:20:33.907 000000f0 79 e5 74 89 a6 27 4e e5 5d 55 5f 66 20 21 aa 64 y.t..'N.]U_f !.d 00:20:33.907 00000100 26 af fc 6b 7d 3b d6 67 45 e4 32 d3 30 d0 8c 4b &..k};.gE.2.0..K 00:20:33.907 00000110 29 db 72 f1 f1 13 1a 62 c9 80 0b c6 e9 64 50 af ).r....b.....dP. 00:20:33.907 00000120 f2 a8 5e 3d f2 97 04 93 7e c9 e8 08 11 d0 73 1f ..^=....~.....s. 00:20:33.907 00000130 a3 25 08 5a c9 72 3e d5 31 17 65 d8 5e a0 5e 8b .%.Z.r>.1.e.^.^. 00:20:33.907 00000140 91 bb df 65 85 67 11 29 d2 e7 70 90 f5 3f 8b df ...e.g.)..p..?.. 00:20:33.907 00000150 9c a3 9a 74 ce 6d 80 8b 30 84 b8 d3 16 42 45 5e ...t.m..0....BE^ 00:20:33.907 00000160 65 a0 46 24 9a ae 41 43 2a 2f ca 07 54 1d eb cc e.F$..AC*/..T... 00:20:33.907 00000170 aa 7d 5e b7 85 0c 58 8b c9 f9 93 9f 5e af 59 5d .}^...X.....^.Y] 00:20:33.907 [2024-09-27 15:25:16.179115] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=2, dhgroup=2, seq=3428451757, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.907 [2024-09-27 15:25:16.184181] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.907 [2024-09-27 15:25:16.184220] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.907 [2024-09-27 15:25:16.184235] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.907 [2024-09-27 15:25:16.184255] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.907 [2024-09-27 15:25:16.184269] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.907 [2024-09-27 15:25:16.291128] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.907 [2024-09-27 15:25:16.291177] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.907 [2024-09-27 15:25:16.291201] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.907 [2024-09-27 15:25:16.291233] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.907 [2024-09-27 15:25:16.291336] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.907 ctrlr pubkey: 00:20:33.907 00000000 b0 61 97 57 07 5f 1e 5d 47 08 52 d0 44 85 40 24 .a.W._.]G.R.D.@$ 00:20:33.907 00000010 bf 0d 5c 20 f2 0d df a2 57 ce 2a 94 77 d0 f8 26 ..\ ....W.*.w..& 00:20:33.907 00000020 5b 69 7c 21 
e2 4d 7f bf ca 93 a8 aa 44 aa ea f2 [i|!.M......D... 00:20:33.907 00000030 c3 94 41 ed 58 e7 87 55 77 c1 4e 99 c9 3c 2e ab ..A.X..Uw.N..<.. 00:20:33.907 00000040 cb bf cc 4c 4c 6a b3 1d 94 62 82 5a 28 fa c3 16 ...LLj...b.Z(... 00:20:33.907 00000050 0d 5b 8a b6 63 a9 f5 c0 38 07 43 bf 77 22 b8 2f .[..c...8.C.w"./ 00:20:33.907 00000060 cd d2 96 eb da 00 59 ae 6e 14 fb a7 e6 6a 70 d2 ......Y.n....jp. 00:20:33.907 00000070 2a 70 f0 06 71 5d 40 74 32 9c 54 8f e9 62 05 8c *p..q]@t2.T..b.. 00:20:33.907 00000080 e7 46 80 23 cd 49 7d 7e 40 93 3d 76 a5 39 67 a7 .F.#.I}~@.=v.9g. 00:20:33.907 00000090 a9 d9 eb a8 78 7f d1 13 0e 70 77 a0 46 46 5a af ....x....pw.FFZ. 00:20:33.907 000000a0 e3 a1 6b 7d bb 0e 68 ad b0 68 11 e7 48 68 c7 4f ..k}..h..h..Hh.O 00:20:33.907 000000b0 a1 e3 0e 29 72 9a f9 f7 e1 ea 08 e3 50 39 0c d9 ...)r.......P9.. 00:20:33.907 000000c0 f7 eb fb 57 df 21 e9 3d 9e 5c 49 65 28 5f da 47 ...W.!.=.\Ie(_.G 00:20:33.907 000000d0 83 53 8f 1e f1 d4 7a 7a 87 03 86 eb c7 52 48 f3 .S....zz.....RH. 00:20:33.907 000000e0 c3 f3 d6 b7 e6 47 22 08 ab 0a fb b0 04 3b 4f 66 .....G"......;Of 00:20:33.907 000000f0 bd 4a 67 f5 65 90 05 a6 83 15 3c 16 13 20 88 87 .Jg.e.....<.. .. 00:20:33.907 00000100 a8 82 34 95 1b 49 04 66 bc 0e 14 e7 b5 50 f0 7a ..4..I.f.....P.z 00:20:33.907 00000110 5b e5 38 c7 5d ec bc 79 19 fc 5d ad cc b6 35 5e [.8.]..y..]...5^ 00:20:33.907 00000120 da a5 ba 87 09 d2 1f 30 a8 57 ba d6 4d 55 d1 92 .......0.W..MU.. 00:20:33.907 00000130 97 ae d5 82 56 63 25 db 41 00 9c a2 45 ef 7f 14 ....Vc%.A...E... 00:20:33.907 00000140 09 29 00 a3 11 7e ab 49 f7 d4 f7 c8 45 bc d6 00 .)...~.I....E... 00:20:33.907 00000150 b0 dc 74 72 94 c7 63 38 0b 7c 3c 5a cd eb fe 5c ..tr..c8.|t.......D# 00:20:33.907 00000170 e1 15 89 00 73 61 d1 ee 26 29 36 5f 75 46 c9 b3 ....sa..&)6_uF.. 00:20:33.907 host pubkey: 00:20:33.907 00000000 7e 2f 4e c9 3e a1 86 b2 54 d1 40 d4 60 ef 26 ee ~/N.>...T.@.`.&. 00:20:33.907 00000010 e0 d8 26 92 03 c5 f8 d6 88 7b 5d cb ce 66 e5 da ..&......{]..f.. 00:20:33.907 00000020 e0 16 c2 44 29 39 10 06 e7 79 1f 8f a4 55 31 61 ...D)9...y...U1a 00:20:33.907 00000030 56 cc 71 d0 6b ca 9c cf 74 c2 54 06 73 bf cf e0 V.q.k...t.T.s... 00:20:33.907 00000040 9d 49 1d 1d c7 90 b2 20 9f d6 fd 63 a1 91 dd f7 .I..... ...c.... 00:20:33.907 00000050 b8 62 2e eb 99 5c 44 b3 2e 5c 3e d0 26 74 e8 bd .b...\D..\>.&t.. 00:20:33.907 00000060 8a 49 e1 d4 fa 6a ae 1e e8 f7 ab e7 48 4d c7 78 .I...j......HM.x 00:20:33.907 00000070 5a 07 45 14 56 e4 a8 f3 32 5e e5 20 b6 b9 68 ce Z.E.V...2^. ..h. 00:20:33.907 00000080 ac 42 b4 48 ac fe 15 35 25 f1 71 a3 40 ca 26 8b .B.H...5%.q.@.&. 00:20:33.907 00000090 b9 0b f7 d6 d2 70 93 9a b1 b4 fa 37 4d a2 52 7d .....p.....7M.R} 00:20:33.907 000000a0 08 2d 4a 9b bb 73 4d 5f 45 c4 67 c7 69 12 02 9e .-J..sM_E.g.i... 00:20:33.907 000000b0 42 f9 b5 96 be 51 65 db 01 61 99 4c a7 be a2 c4 B....Qe..a.L.... 00:20:33.907 000000c0 55 23 1e 01 6e f5 80 f8 d3 e8 aa fe 52 19 f6 28 U#..n.......R..( 00:20:33.907 000000d0 be 01 d7 5d 65 6c 1f d9 12 9d 48 d3 7a d2 88 4e ...]el....H.z..N 00:20:33.907 000000e0 b8 3d 81 25 2d 1e f0 6f e9 ba b8 ce 18 0a d5 9d .=.%-..o........ 00:20:33.907 000000f0 50 79 07 37 66 f6 2f 32 69 5b 92 30 1d 8c 4a 61 Py.7f./2i[.0..Ja 00:20:33.907 00000100 00 71 98 9f 42 19 f7 bb 17 01 35 e4 31 4e 14 e0 .q..B.....5.1N.. 00:20:33.907 00000110 8d 06 df be bd 7c 4d 1b ad 91 e3 1c d8 95 89 7e .....|M........~ 00:20:33.907 00000120 ef 1c bd e0 e2 f8 21 5f a7 f7 88 75 8d 01 ad ad ......!_...u.... 
00:20:33.907 00000130 e0 4e 02 ff 23 3d e3 b1 a3 90 26 f9 24 9f 36 d4 .N..#=....&.$.6. 00:20:33.907 00000140 f7 08 03 a6 21 7d 8d 43 75 69 48 8b a4 e2 2a d0 ....!}.CuiH...*. 00:20:33.907 00000150 9f 79 e0 24 5f 02 90 fa 36 5f c0 96 3c de bb 47 .y.$_...6_..<..G 00:20:33.907 00000160 cb 94 43 b7 2d 4d 5c 92 ad e0 fa 29 4c 2a d2 4b ..C.-M\....)L*.K 00:20:33.907 00000170 d4 45 3d 48 45 da 51 d2 ae 45 bb 28 86 e2 19 17 .E=HE.Q..E.(.... 00:20:33.907 dh secret: 00:20:33.907 00000000 ee a8 21 46 83 39 b1 20 7a 39 de 05 57 0d db 2f ..!F.9. z9..W../ 00:20:33.907 00000010 28 13 ae a4 82 c9 ec 4a 56 12 84 08 af 49 a1 df (......JV....I.. 00:20:33.907 00000020 12 82 28 a8 95 6e dc bc 30 d2 ed ce 05 66 ca bd ..(..n..0....f.. 00:20:33.907 00000030 75 e2 0e 58 57 3f 24 27 e5 aa 2a 8a 7b 4a 90 0a u..XW?$'..*.{J.. 00:20:33.907 00000040 bc fb 7e b9 af 26 78 48 82 7c bd 98 ee 5e 13 ed ..~..&xH.|...^.. 00:20:33.907 00000050 8f 73 f0 e0 6e d7 2f 19 76 e7 f2 6f d6 58 e2 44 .s..n./.v..o.X.D 00:20:33.907 00000060 6c 52 83 14 50 07 aa 4c b3 fb 9e 29 eb 9d b6 c7 lR..P..L...).... 00:20:33.907 00000070 d1 34 28 38 f2 f4 22 91 aa f7 91 48 16 4a d7 e4 .4(8.."....H.J.. 00:20:33.907 00000080 ec fd e8 e1 7f 41 03 17 38 76 03 84 1c 15 59 2e .....A..8v....Y. 00:20:33.907 00000090 cf eb 1a cf eb 5c 95 45 7f 79 d3 6e 33 21 0c cb .....\.E.y.n3!.. 00:20:33.907 000000a0 bd 9b 68 74 0e b2 3b 10 bc 82 fe c9 34 c2 72 bb ..ht..;.....4.r. 00:20:33.907 000000b0 5b c7 68 c1 79 89 5f a9 c0 f5 13 dd 4e dd 96 a4 [.h.y._.....N... 00:20:33.907 000000c0 c8 cc c6 64 01 e7 5d 5a fb 95 7c 74 2c 34 64 18 ...d..]Z..|t,4d. 00:20:33.907 000000d0 cd 88 c5 aa 7b 69 68 83 a7 c5 ea 9c 41 62 af 6e ....{ih.....Ab.n 00:20:33.907 000000e0 08 ed bb 9a 1f b1 86 c2 3c 4e 91 fc e4 60 8c 67 ........... 00:20:33.907 00000170 df 30 20 dc 51 60 64 68 fb 98 b2 99 d5 73 72 a9 .0 .Q`dh.....sr. 
00:20:33.907 [2024-09-27 15:25:16.298582] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=2, dhgroup=2, seq=3428451758, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.908 [2024-09-27 15:25:16.298678] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.908 [2024-09-27 15:25:16.315633] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.908 [2024-09-27 15:25:16.315698] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.908 [2024-09-27 15:25:16.315708] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.908 [2024-09-27 15:25:16.315744] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.908 [2024-09-27 15:25:16.468490] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.908 [2024-09-27 15:25:16.468509] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.908 [2024-09-27 15:25:16.468516] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.908 [2024-09-27 15:25:16.468558] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.908 [2024-09-27 15:25:16.468580] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.908 ctrlr pubkey: 00:20:33.908 00000000 ad 8a 4b 2b 86 75 f6 a0 fc 80 69 52 a2 aa ca 17 ..K+.u....iR.... 00:20:33.908 00000010 2b c6 48 0b 82 02 c4 d2 78 87 19 93 d0 eb f9 64 +.H.....x......d 00:20:33.908 00000020 cc b7 41 f5 db 4c c5 f5 a8 0e fe 21 49 9f 00 e3 ..A..L.....!I... 00:20:33.908 00000030 06 db be e6 13 9e 91 a9 83 b3 74 d2 69 cd 30 12 ..........t.i.0. 00:20:33.908 00000040 01 cc 2b 0d ae 3e f4 2d df 0c 8d 15 38 ab 22 1b ..+..>.-....8.". 00:20:33.908 00000050 59 da a8 6f e1 1e 71 e4 13 a3 99 0d d4 a8 92 13 Y..o..q......... 00:20:33.908 00000060 3f b5 c4 0d 6c 82 f4 83 5b 28 27 46 40 be b4 c8 ?...l...[('F@... 00:20:33.908 00000070 54 54 96 62 9e fb 52 c4 32 e6 91 4b b0 2d c7 bb TT.b..R.2..K.-.. 00:20:33.908 00000080 fa 96 fc 51 02 d5 34 5c fe ad f5 be bb 23 5c 20 ...Q..4\.....#\ 00:20:33.908 00000090 1e 6f 26 0f 11 36 52 fd 45 b0 fa 78 d7 2d 84 ba .o&..6R.E..x.-.. 00:20:33.908 000000a0 62 e8 01 4f 7c 35 ac 1b 96 39 38 3e ec 58 2a 54 b..O|5...98>.X*T 00:20:33.908 000000b0 11 64 3b ee 45 18 f6 3f e4 af 58 d6 db b8 74 36 .d;.E..?..X...t6 00:20:33.908 000000c0 ec 39 30 66 fe 15 b5 44 98 09 c7 cc a9 da a4 17 .90f...D........ 00:20:33.908 000000d0 44 20 50 a6 12 94 eb 5b 85 b8 aa 54 57 a6 21 35 D P....[...TW.!5 00:20:33.908 000000e0 21 10 84 72 72 a8 83 50 85 a2 f8 ee 8d 42 e4 dd !..rr..P.....B.. 00:20:33.908 000000f0 22 d9 2d 72 85 38 dd ad fa a7 07 49 a1 f9 29 70 ".-r.8.....I..)p 00:20:33.908 00000100 e6 2c e3 f1 36 7d d4 bf 06 a1 7f 6a 10 7a 04 ae .,..6}.....j.z.. 
00:20:33.908 00000110 a7 d5 00 0f c5 61 ef d3 8b d5 04 86 8a b0 4b 68 .....a........Kh 00:20:33.908 00000120 91 db c7 4d c6 00 16 d3 f1 d2 8f 55 5b c6 22 1d ...M.......U[.". 00:20:33.908 00000130 d4 56 37 3b 1e 55 ae 7d 3d b7 e2 0a c9 1a 06 dd .V7;.U.}=....... 00:20:33.908 00000140 aa 44 96 57 c1 44 21 1a d2 2a a6 4c ed 31 86 ef .D.W.D!..*.L.1.. 00:20:33.908 00000150 68 be 24 a6 d5 ab dc 4b 60 11 c3 c2 be 5d 7e d2 h.$....K`....]~. 00:20:33.908 00000160 18 cc e0 de 30 2c dc 0b 32 6d 90 6c 04 01 38 78 ....0,..2m.l..8x 00:20:33.908 00000170 e2 f7 05 45 35 21 16 76 16 ae b8 c4 19 06 98 a8 ...E5!.v........ 00:20:33.908 host pubkey: 00:20:33.908 00000000 2f 1e 38 52 fb f7 c8 22 7e 46 60 4b 4c f9 24 ce /.8R..."~F`KL.$. 00:20:33.908 00000010 ea b5 93 32 cb 91 c9 3b a3 b8 21 4a 25 67 84 c2 ...2...;..!J%g.. 00:20:33.908 00000020 c9 20 0b 55 bd 6a ca 31 91 4b 0c dd b8 e4 95 ad . .U.j.1.K...... 00:20:33.908 00000030 cb 13 c9 7b 76 e3 f9 ab f6 33 02 08 4b a6 63 48 ...{v....3..K.cH 00:20:33.908 00000040 f4 a9 96 18 e6 b3 29 36 8b 9a 21 00 22 0c cb da ......)6..!."... 00:20:33.908 00000050 b8 20 f8 f1 6d 6a 0f 7f 46 d4 6e 53 a5 b0 94 81 . ..mj..F.nS.... 00:20:33.908 00000060 c6 0e a3 46 94 77 18 9c 1f c3 24 6a 9b 84 b0 83 ...F.w....$j.... 00:20:33.908 00000070 21 aa 1c 07 18 99 f9 93 1b 76 a0 12 76 52 70 b1 !........v..vRp. 00:20:33.908 00000080 c3 00 2a b4 4e 00 b3 1d 05 e9 fe 7e fb f9 e8 4f ..*.N......~...O 00:20:33.908 00000090 1b 38 9b ac 9e ee c7 eb 0f 9c c9 49 e1 45 41 a5 .8.........I.EA. 00:20:33.908 000000a0 58 5b 64 03 4d bd d6 90 07 74 e8 b6 ee 52 e4 8e X[d.M....t...R.. 00:20:33.908 000000b0 f1 a1 81 ad ef 9c 94 e9 35 35 dd 3f 94 43 e0 72 ........55.?.C.r 00:20:33.908 000000c0 b2 89 61 cc 42 71 2a 8c bb 96 0b 7d c3 6b a8 8d ..a.Bq*....}.k.. 00:20:33.908 000000d0 13 63 42 d1 5d 6b dd 52 46 ec 36 a7 21 1a 17 97 .cB.]k.RF.6.!... 00:20:33.908 000000e0 d0 ac b1 e5 63 2f e1 a3 e7 f5 85 04 b0 3e a3 cb ....c/.......>.. 00:20:33.908 000000f0 36 9b c0 a4 83 e7 65 b3 81 fc e8 e3 46 4d 83 da 6.....e.....FM.. 00:20:33.908 00000100 aa 62 11 27 d6 6e 54 da cb 05 20 8d d7 6b 0a ac .b.'.nT... ..k.. 00:20:33.908 00000110 bc 18 3d 13 95 b1 8d 27 9b e7 43 29 33 02 ef d8 ..=....'..C)3... 00:20:33.908 00000120 98 57 52 3b 01 84 94 9c cf 3f df 06 99 fa 12 7b .WR;.....?.....{ 00:20:33.908 00000130 58 44 a6 8f 4b 24 5e 38 49 5c c1 e2 3f 7a 7a ef XD..K$^8I\..?zz. 00:20:33.908 00000140 a6 cf 57 82 a9 9b bb 36 84 ca b4 64 eb 5c 9e 72 ..W....6...d.\.r 00:20:33.908 00000150 ff 44 61 5f ec 13 04 4c b1 e0 2e f2 bb 72 5f 53 .Da_...L.....r_S 00:20:33.908 00000160 d6 7d 61 3d 6d c1 0a 83 2c a5 bb f7 6b 41 18 96 .}a=m...,...kA.. 00:20:33.908 00000170 e4 e0 ad bc 0f 79 c7 55 9e 7e 11 20 04 41 bc 21 .....y.U.~. .A.! 00:20:33.908 dh secret: 00:20:33.908 00000000 95 f8 b8 15 85 e9 09 af 96 33 d7 e7 a0 81 26 03 .........3....&. 00:20:33.908 00000010 0b 29 a3 f3 2e 2d ad 7c a9 9e 84 01 1c 2f 7e e8 .)...-.|...../~. 00:20:33.908 00000020 38 d5 e8 20 9b c5 64 e3 e2 e8 d6 48 20 c3 01 09 8.. ..d....H ... 00:20:33.908 00000030 6b c0 39 6e 47 9d 01 66 d7 6a 57 40 12 f8 d0 f2 k.9nG..f.jW@.... 00:20:33.908 00000040 30 99 24 63 72 15 12 05 55 19 96 67 1b 98 e1 a8 0.$cr...U..g.... 00:20:33.908 00000050 9b d0 aa af 2b 91 21 ef ec 64 54 ad 4e 0d 91 c3 ....+.!..dT.N... 00:20:33.908 00000060 20 ad ab 84 21 dc 82 54 7f e4 13 61 81 67 42 cf ...!..T...a.gB. 00:20:33.908 00000070 17 ef ea 80 7b d5 31 59 0e 83 31 bc 59 c3 63 eb ....{.1Y..1.Y.c. 
00:20:33.908 00000080 4e 0b 7b a9 90 21 88 e1 ca 98 3d 36 34 60 64 46 N.{..!....=64`dF 00:20:33.908 00000090 0b 90 59 a8 f6 25 bd 0e c0 79 5b 03 cf e8 9f 94 ..Y..%...y[..... 00:20:33.908 000000a0 ba 13 46 b5 b1 6f 88 61 9f 48 3f 48 f0 91 f5 b5 ..F..o.a.H?H.... 00:20:33.908 000000b0 62 5a 46 eb a7 90 5c 19 45 22 67 ed b7 d7 bc 1d bZF...\.E"g..... 00:20:33.908 000000c0 c9 fc 09 6f a0 d4 1c 62 ca 16 b4 12 29 be c3 02 ...o...b....)... 00:20:33.908 000000d0 0b d4 29 62 05 61 f8 78 53 3c ae fa 35 6d dc 30 ..)b.a.xS<..5m.0 00:20:33.908 000000e0 cf d7 34 e7 09 f9 64 de 9c 39 cb 42 0f 31 63 8d ..4...d..9.B.1c. 00:20:33.908 000000f0 5c 07 40 c4 94 0d 93 fc 9d f5 9c 85 38 25 43 fa \.@.........8%C. 00:20:33.908 00000100 c4 1e 4d c3 ed 04 6c f1 f9 ec 20 07 c6 e6 f6 8c ..M...l... ..... 00:20:33.908 00000110 e3 52 06 4e 53 3d 92 44 e9 05 09 0c 1b ed fa 67 .R.NS=.D.......g 00:20:33.908 00000120 fa a4 7e a1 7a 19 54 44 af bd 60 23 41 b3 96 2c ..~.z.TD..`#A.., 00:20:33.908 00000130 ca fa 5d f1 3a b7 3c 4c eb 8d a3 9e 82 3c c6 de ..].:...+.}.. 00:20:33.908 00000170 e8 06 83 ea e0 a0 fc 69 de 71 0e 26 a9 98 00 69 .......i.q.&...i 00:20:33.908 [2024-09-27 15:25:16.475893] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=2, dhgroup=2, seq=3428451759, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.908 [2024-09-27 15:25:16.481082] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.908 [2024-09-27 15:25:16.481122] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.908 [2024-09-27 15:25:16.481139] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.908 [2024-09-27 15:25:16.481163] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.908 [2024-09-27 15:25:16.481174] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.908 [2024-09-27 15:25:16.587330] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.908 [2024-09-27 15:25:16.587351] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.908 [2024-09-27 15:25:16.587362] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.908 [2024-09-27 15:25:16.587372] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.908 [2024-09-27 15:25:16.587426] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.908 ctrlr pubkey: 00:20:33.908 00000000 ad 8a 4b 2b 86 75 f6 a0 fc 80 69 52 a2 aa ca 17 ..K+.u....iR.... 00:20:33.908 00000010 2b c6 48 0b 82 02 c4 d2 78 87 19 93 d0 eb f9 64 +.H.....x......d 00:20:33.908 00000020 cc b7 41 f5 db 4c c5 f5 a8 0e fe 21 49 9f 00 e3 ..A..L.....!I... 00:20:33.908 00000030 06 db be e6 13 9e 91 a9 83 b3 74 d2 69 cd 30 12 ..........t.i.0. 
00:20:33.908 00000040 01 cc 2b 0d ae 3e f4 2d df 0c 8d 15 38 ab 22 1b ..+..>.-....8.". 00:20:33.908 00000050 59 da a8 6f e1 1e 71 e4 13 a3 99 0d d4 a8 92 13 Y..o..q......... 00:20:33.908 00000060 3f b5 c4 0d 6c 82 f4 83 5b 28 27 46 40 be b4 c8 ?...l...[('F@... 00:20:33.908 00000070 54 54 96 62 9e fb 52 c4 32 e6 91 4b b0 2d c7 bb TT.b..R.2..K.-.. 00:20:33.908 00000080 fa 96 fc 51 02 d5 34 5c fe ad f5 be bb 23 5c 20 ...Q..4\.....#\ 00:20:33.908 00000090 1e 6f 26 0f 11 36 52 fd 45 b0 fa 78 d7 2d 84 ba .o&..6R.E..x.-.. 00:20:33.908 000000a0 62 e8 01 4f 7c 35 ac 1b 96 39 38 3e ec 58 2a 54 b..O|5...98>.X*T 00:20:33.908 000000b0 11 64 3b ee 45 18 f6 3f e4 af 58 d6 db b8 74 36 .d;.E..?..X...t6 00:20:33.908 000000c0 ec 39 30 66 fe 15 b5 44 98 09 c7 cc a9 da a4 17 .90f...D........ 00:20:33.908 000000d0 44 20 50 a6 12 94 eb 5b 85 b8 aa 54 57 a6 21 35 D P....[...TW.!5 00:20:33.908 000000e0 21 10 84 72 72 a8 83 50 85 a2 f8 ee 8d 42 e4 dd !..rr..P.....B.. 00:20:33.908 000000f0 22 d9 2d 72 85 38 dd ad fa a7 07 49 a1 f9 29 70 ".-r.8.....I..)p 00:20:33.908 00000100 e6 2c e3 f1 36 7d d4 bf 06 a1 7f 6a 10 7a 04 ae .,..6}.....j.z.. 00:20:33.908 00000110 a7 d5 00 0f c5 61 ef d3 8b d5 04 86 8a b0 4b 68 .....a........Kh 00:20:33.909 00000120 91 db c7 4d c6 00 16 d3 f1 d2 8f 55 5b c6 22 1d ...M.......U[.". 00:20:33.909 00000130 d4 56 37 3b 1e 55 ae 7d 3d b7 e2 0a c9 1a 06 dd .V7;.U.}=....... 00:20:33.909 00000140 aa 44 96 57 c1 44 21 1a d2 2a a6 4c ed 31 86 ef .D.W.D!..*.L.1.. 00:20:33.909 00000150 68 be 24 a6 d5 ab dc 4b 60 11 c3 c2 be 5d 7e d2 h.$....K`....]~. 00:20:33.909 00000160 18 cc e0 de 30 2c dc 0b 32 6d 90 6c 04 01 38 78 ....0,..2m.l..8x 00:20:33.909 00000170 e2 f7 05 45 35 21 16 76 16 ae b8 c4 19 06 98 a8 ...E5!.v........ 00:20:33.909 host pubkey: 00:20:33.909 00000000 57 9d 57 7e 33 d2 f3 0d d0 d5 a8 c5 b5 30 2e 08 W.W~3........0.. 00:20:33.909 00000010 9f 4a 8c b6 1e 75 b9 48 9e 9c 9d aa c4 6f 87 f0 .J...u.H.....o.. 00:20:33.909 00000020 f8 01 46 85 13 42 2e c4 97 4e 13 e8 7e e1 bc 21 ..F..B...N..~..! 00:20:33.909 00000030 6e 4a 43 1d fd 3c 46 57 fe 00 9b d0 11 e5 e1 49 nJC... 00:20:33.909 000000b0 71 dc 75 be 01 6b 31 6c 11 6f c4 bc 00 5b b1 40 q.u..k1l.o...[.@ 00:20:33.909 000000c0 65 c9 e4 98 ee 80 04 1c 09 db 71 2f 67 49 87 9a e.........q/gI.. 00:20:33.909 000000d0 ce df 0f 4b af ac 21 c4 36 a2 eb e0 52 8c 46 df ...K..!.6...R.F. 00:20:33.909 000000e0 77 ac 09 5e bf c3 91 db d1 cd 04 81 a7 17 ba 49 w..^...........I 00:20:33.909 000000f0 82 50 98 a4 2c 8d ac dd ac a7 9e 98 6e 50 66 98 .P..,.......nPf. 00:20:33.909 00000100 ed 6f 24 59 c5 76 a9 b4 4d 81 cc ae 03 f3 16 39 .o$Y.v..M......9 00:20:33.909 00000110 0b f2 72 57 f7 fa ff 5b d7 3d a9 f9 f4 61 c8 27 ..rW...[.=...a.' 00:20:33.909 00000120 95 e0 41 7f 47 2e d9 ae 4a 63 d1 47 2b bf 96 9b ..A.G...Jc.G+... 00:20:33.909 00000130 93 5e 0d 3e 2b 57 14 63 48 8d 32 29 16 66 be a1 .^.>+W.cH.2).f.. 00:20:33.909 00000140 77 13 71 7f cc 73 2a 81 d7 18 b4 26 7a 45 1e cd w.q..s*....&zE.. 00:20:33.909 00000150 a9 a1 f4 1e 58 21 84 66 71 69 db b8 71 01 c4 0c ....X!.fqi..q... 00:20:33.909 00000160 e3 7f 26 ae d3 91 ac a2 db 02 50 b1 04 fc 55 c4 ..&.......P...U. 00:20:33.909 00000170 db 3d 7c 45 13 ba 2d f6 01 4c d9 10 ec 12 6b ba .=|E..-..L....k. 00:20:33.909 dh secret: 00:20:33.909 00000000 be e2 8b 1e 2f 2f 96 62 ef d4 d1 c8 f8 9c 64 67 ....//.b......dg 00:20:33.909 00000010 bc 3b 69 98 af 5e 63 44 ce 76 cb 8e 38 ee 40 df .;i..^cD.v..8.@. 00:20:33.909 00000020 eb bc b0 bd 62 3a 69 e7 c1 ff 58 b9 99 a3 dd 22 ....b:i...X...." 
00:20:33.909 00000030 2b 27 c8 70 ba 92 0b cb 6a 72 fa a3 31 cc 12 46 +'.p....jr..1..F 00:20:33.909 00000040 32 0f b5 ef 13 24 59 ec 04 77 14 52 d6 b1 ab 1b 2....$Y..w.R.... 00:20:33.909 00000050 39 af a5 4b de fa bf ef 7f fe 54 7c ea f3 8e 7d 9..K......T|...} 00:20:33.909 00000060 05 d3 df ec ba c5 88 69 53 9b 19 3d b7 27 a3 f1 .......iS..=.'.. 00:20:33.909 00000070 df b5 6c b6 a3 c2 14 a1 8c 4d a2 5a d2 d4 70 04 ..l......M.Z..p. 00:20:33.909 00000080 1f 4d 62 d8 99 ce 8c d3 40 6e 06 bc 13 13 61 80 .Mb.....@n....a. 00:20:33.909 00000090 71 6c 49 88 7f b9 45 bf 20 0d 9f f1 f8 43 f0 43 qlI...E. ....C.C 00:20:33.909 000000a0 76 ce dd ee 81 d4 83 d2 81 f9 a9 e0 8c 8b fc ff v............... 00:20:33.909 000000b0 71 a2 91 02 68 a0 d5 6b bc e6 21 08 ff c5 db 48 q...h..k..!....H 00:20:33.909 000000c0 44 ea 54 87 2f ee f3 51 c5 3b d8 08 d8 85 d5 3e D.T./..Q.;.....> 00:20:33.909 000000d0 9d d7 30 b0 e1 fd 55 e7 15 89 8d fc a8 85 65 ad ..0...U.......e. 00:20:33.909 000000e0 8c e7 f6 9f 60 74 cf 1b a9 0e 06 6b 82 23 a8 b9 ....`t.....k.#.. 00:20:33.909 000000f0 40 5d 1d 03 11 55 64 2a 98 3d fc 37 26 75 8f a0 @]...Ud*.=.7&u.. 00:20:33.909 00000100 c2 43 8c af e4 3d 14 91 6a fb 54 72 4f 5f cc 44 .C...=..j.TrO_.D 00:20:33.909 00000110 e1 c7 2a 2c d2 e7 5d 6e 21 12 6e 3e 61 80 54 25 ..*,..]n!.n>a.T% 00:20:33.909 00000120 9e 13 c6 c0 f6 fa e2 5d 50 bd 0a 2d ba 23 d2 24 .......]P..-.#.$ 00:20:33.909 00000130 fd 9b 1e e4 fd 8f 19 ae 9b 6a 7d 37 b5 ae ed a7 .........j}7.... 00:20:33.909 00000140 20 36 87 37 70 56 c4 c2 f8 c1 e9 d4 d9 09 6e 92 6.7pV........n. 00:20:33.909 00000150 60 74 24 9a 3f a6 bf 99 7c b5 6a fc 09 92 3b 4c `t$.?...|.j...;L 00:20:33.909 00000160 7c bf c8 9e ce 72 ec 06 9c 02 80 7c 96 05 d4 46 |....r.....|...F 00:20:33.909 00000170 2e a6 f9 7f 8f a6 3b 54 ac 9a 91 15 49 60 cb 76 ......;T....I`.v 00:20:33.909 [2024-09-27 15:25:16.594529] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=2, dhgroup=2, seq=3428451760, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.909 [2024-09-27 15:25:16.594627] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.909 [2024-09-27 15:25:16.611078] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.909 [2024-09-27 15:25:16.611137] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.909 [2024-09-27 15:25:16.611147] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.909 [2024-09-27 15:25:16.611180] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.909 [2024-09-27 15:25:16.768241] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.909 [2024-09-27 15:25:16.768261] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.909 [2024-09-27 15:25:16.768267] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.909 [2024-09-27 15:25:16.768310] nvme_auth.c: 
163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.909 [2024-09-27 15:25:16.768332] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.909 ctrlr pubkey: 00:20:33.909 00000000 d8 d2 1a e7 08 60 3d d9 26 48 bf 8d e4 cc 95 fc .....`=.&H...... 00:20:33.909 00000010 db f4 f8 86 50 50 d6 7c 01 80 aa b5 5e 5b 81 41 ....PP.|....^[.A 00:20:33.909 00000020 a0 c5 06 3e 37 c7 b3 c0 8a 4a 5f 78 ea 35 fa e6 ...>7....J_x.5.. 00:20:33.909 00000030 c7 28 e6 4c e5 04 6d 3d 34 17 33 80 c0 d2 60 93 .(.L..m=4.3...`. 00:20:33.909 00000040 74 aa ba f7 2e 98 ca 4b 3a 14 60 aa 84 2a 80 76 t......K:.`..*.v 00:20:33.909 00000050 da 08 10 5e a7 30 d6 25 46 e6 8b 1e c3 93 96 a7 ...^.0.%F....... 00:20:33.909 00000060 3a 6b 49 11 c5 87 39 b3 b2 78 0a d0 f2 79 56 85 :kI...9..x...yV. 00:20:33.909 00000070 aa 9d ed 7b 3c 8d 19 12 ad 98 b8 a1 b8 1b 96 39 ...{<..........9 00:20:33.909 00000080 e6 5f 10 0c e8 05 02 a9 8f dc 4e 76 35 1f a7 36 ._........Nv5..6 00:20:33.909 00000090 97 e8 ee 82 1a 8f 2b 41 40 d5 ff 2b 3f 73 f4 e2 ......+A@..+?s.. 00:20:33.909 000000a0 04 96 07 69 33 51 a8 3e 1b a8 ce 99 4c f4 9c f2 ...i3Q.>....L... 00:20:33.909 000000b0 1b ba d1 a8 1f b6 93 a9 3f 21 7d 1c 75 b6 e2 c5 ........?!}.u... 00:20:33.909 000000c0 22 09 5f 24 e8 04 3b a6 65 79 a9 46 a0 d1 35 2f "._$..;.ey.F..5/ 00:20:33.909 000000d0 92 17 9d 4a 66 f3 1e 6b d7 0e be 58 06 8d 96 e3 ...Jf..k...X.... 00:20:33.909 000000e0 6e 2e bc 46 40 f6 55 7b 59 f8 f4 c2 d4 a0 e9 e0 n..F@.U{Y....... 00:20:33.909 000000f0 82 3a 29 6d c2 c6 79 d0 78 cd 3d ec ea 02 93 39 .:)m..y.x.=....9 00:20:33.909 00000100 08 4a 4c ba c0 02 8d 56 c2 64 3b a1 84 33 b8 3a .JL....V.d;..3.: 00:20:33.909 00000110 11 c1 69 99 a7 38 e9 75 bc 65 cf 44 50 22 e2 5f ..i..8.u.e.DP"._ 00:20:33.909 00000120 89 7b 47 76 3b cf 43 25 cb 36 a8 fd f6 87 49 bd .{Gv;.C%.6....I. 00:20:33.909 00000130 dd c5 bd 0e 51 46 62 d2 15 1a a1 9d 18 d1 4b 55 ....QFb.......KU 00:20:33.909 00000140 23 c9 7d c9 8a ce 28 1b 2a fb 7f df ea 52 0e f1 #.}...(.*....R.. 00:20:33.909 00000150 aa 71 83 7e 47 3f 64 5a 9d 50 64 99 27 13 2a 60 .q.~G?dZ.Pd.'.*` 00:20:33.909 00000160 1a e4 32 27 19 2c 40 39 20 95 d5 1c d3 8b ed 21 ..2'.,@9 ......! 00:20:33.909 00000170 ef 9d 96 02 cb ec 77 69 ed 72 36 11 3a 6e 37 3f ......wi.r6.:n7? 00:20:33.909 host pubkey: 00:20:33.909 00000000 3d ce 0f f9 77 e8 53 2a 3a dd 0d 9a 79 7f 8a 09 =...w.S*:...y... 00:20:33.909 00000010 67 89 2f 0d 52 c1 9a 2e f0 44 56 60 10 39 d6 16 g./.R....DV`.9.. 00:20:33.909 00000020 69 1e 54 b0 56 16 73 c6 a0 5e d7 a1 d6 e4 78 f6 i.T.V.s..^....x. 00:20:33.909 00000030 2d 02 75 b7 7a ae dd 7b 49 c4 7d a0 69 b7 ab 1f -.u.z..{I.}.i... 00:20:33.909 00000040 7d 7c 7c 85 7d ce 88 e6 aa f5 fc f3 b0 8f 71 12 }||.}.........q. 00:20:33.909 00000050 73 cb 09 e7 bd 11 e1 7c d1 56 d7 58 2d bc d3 18 s......|.V.X-... 00:20:33.909 00000060 05 d7 7a f8 4e 33 f1 ad bb f8 d9 2c 0b 14 9f e2 ..z.N3.....,.... 00:20:33.909 00000070 85 bf e7 3d 4a 20 49 ec 54 cf ae 89 57 87 23 8f ...=J I.T...W.#. 00:20:33.909 00000080 cd 44 72 de e0 dc 00 46 c6 f3 32 86 d0 d2 9a bb .Dr....F..2..... 00:20:33.909 00000090 a2 e6 6c 37 18 c0 ad 82 27 35 e2 40 1b d7 cb c3 ..l7....'5.@.... 00:20:33.909 000000a0 3a 17 d7 b1 10 1f 7a 48 1b 02 21 f8 fa c5 9d 65 :.....zH..!....e 00:20:33.909 000000b0 3e c5 dd 4e 2a d8 5e 4c e5 58 44 bf a9 27 15 8e >..N*.^L.XD..'.. 
00:20:33.909 000000c0 6a c5 b9 6d b3 8f f4 17 a1 e0 0d c8 92 93 d4 a5 j..m............ 00:20:33.909 000000d0 12 a8 98 78 7f f2 c8 d1 7e 1c c4 11 ad 6b 69 7b ...x....~....ki{ 00:20:33.909 000000e0 4b 2e 55 a9 18 91 a1 5c ac 5e 71 29 cc ab db dc K.U....\.^q).... 00:20:33.909 000000f0 a4 bf 13 4b 25 70 4c 9f 7d 07 23 99 0c 34 b8 8a ...K%pL.}.#..4.. 00:20:33.909 00000100 11 27 af f2 95 52 24 01 d8 0b 50 01 04 ec db e2 .'...R$...P..... 00:20:33.909 00000110 3a af 03 f5 76 ab 9d 99 08 3f d7 e6 01 35 f1 e6 :...v....?...5.. 00:20:33.910 00000120 3c 1b 76 f0 30 74 5e 28 5c 12 e1 a0 d9 63 2a 6a <.v.0t^(\....c*j 00:20:33.910 00000130 fa 38 6b 78 96 8f f5 a8 6c f8 1c 20 cc d4 7b 74 .8kx....l.. ..{t 00:20:33.910 00000140 99 84 55 72 0f 72 c3 91 6a 9b c4 ce 9d ab 11 f1 ..Ur.r..j....... 00:20:33.910 00000150 4f 2a 62 c5 c1 4d 83 b4 c3 7d 38 15 96 c5 e7 1e O*b..M...}8..... 00:20:33.910 00000160 c4 44 c1 8f e7 5b 03 65 51 c3 53 bc 67 fb e3 e6 .D...[.eQ.S.g... 00:20:33.910 00000170 fa 85 50 c5 86 1a 02 f1 7e 45 73 2b c8 b8 32 44 ..P.....~Es+..2D 00:20:33.910 dh secret: 00:20:33.910 00000000 28 46 03 d7 8a 6c 32 c1 af 85 ad 65 e4 ea 7d f4 (F...l2....e..}. 00:20:33.910 00000010 e3 d7 85 93 0d ee ae 70 02 5c d6 c8 28 af 5b 40 .......p.\..(.[@ 00:20:33.910 00000020 7c d4 1d 25 63 8b c5 38 c2 34 27 88 84 c6 ba 73 |..%c..8.4'....s 00:20:33.910 00000030 90 08 ce 96 7c 87 39 b4 60 3b 2c a8 2b db d2 d1 ....|.9.`;,.+... 00:20:33.910 00000040 24 a1 65 1a 27 7d 7d 31 17 e7 88 f7 88 e2 1f 9b $.e.'}}1........ 00:20:33.910 00000050 de 4f 65 52 04 cc 56 54 0d c2 54 85 0b 7d b1 6b .OeR..VT..T..}.k 00:20:33.910 00000060 36 bd 8b 00 7b 49 cd e6 f7 5c 0e 05 9b 6f f7 1f 6...{I...\...o.. 00:20:33.910 00000070 d2 79 f1 20 7e 12 ab a0 a5 bc ea 0d d3 80 0d 25 .y. ~..........% 00:20:33.910 00000080 e0 b3 7d 72 c8 1d 66 be 1e 3b 32 a6 ee 3a 1d 91 ..}r..f..;2..:.. 00:20:33.910 00000090 8c 48 2d b1 55 b8 da 30 6f 07 8a 77 80 ab 3e 7c .H-.U..0o..w..>| 00:20:33.910 000000a0 09 b6 76 6e 01 74 c8 0c ca fe 9a 1b 0c e9 aa 4d ..vn.t.........M 00:20:33.910 000000b0 4b 30 15 f5 b4 ea 0f 12 ec 5b d9 ba f4 23 a3 ad K0.......[...#.. 00:20:33.910 000000c0 89 32 bb d8 0d ac dc 18 e6 20 c9 8b 59 a4 28 25 .2....... ..Y.(% 00:20:33.910 000000d0 52 e3 20 e9 7d af cc 0b 47 21 27 a2 f7 56 b0 0a R. .}...G!'..V.. 00:20:33.910 000000e0 d4 80 31 4f 7e 59 82 26 b8 61 b0 c2 ef 71 92 87 ..1O~Y.&.a...q.. 00:20:33.910 000000f0 09 71 78 16 fd 4e 5b 9c 67 ab e2 57 81 dc 57 76 .qx..N[.g..W..Wv 00:20:33.910 00000100 4b 4c 3f e6 3c f5 b5 2d ea 3a 2e 0c b0 0a 67 71 KL?.<..-.:....gq 00:20:33.910 00000110 8f 9a b6 19 70 62 ed dd 73 ec 34 2d 99 4f 22 5c ....pb..s.4-.O"\ 00:20:33.910 00000120 cf 59 4b d8 95 91 0a b7 bc f7 1c fa a1 96 02 44 .YK............D 00:20:33.910 00000130 cc 58 cc 96 02 43 6a c2 e7 0c 4f 0d 14 d3 02 8e .X...Cj...O..... 00:20:33.910 00000140 0b f0 40 56 ac c9 df 93 80 19 17 de c7 36 d0 7f ..@V.........6.. 00:20:33.910 00000150 72 a8 e9 c8 d4 11 1e 05 52 f8 4e 93 70 22 9a c8 r.......R.N.p".. 00:20:33.910 00000160 cf 57 1a 7f 76 64 c1 98 d6 bd f9 51 63 2e 3b e0 .W..vd.....Qc.;. 00:20:33.910 00000170 b7 15 89 08 ea 06 52 5e 69 de 9c fe 4b 13 43 1c ......R^i...K.C. 
00:20:33.910 [2024-09-27 15:25:16.775716] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=2, dhgroup=2, seq=3428451761, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.910 [2024-09-27 15:25:16.780723] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.910 [2024-09-27 15:25:16.780758] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.910 [2024-09-27 15:25:16.780774] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.910 [2024-09-27 15:25:16.780789] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.910 [2024-09-27 15:25:16.780807] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.910 [2024-09-27 15:25:16.886981] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.910 [2024-09-27 15:25:16.886998] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.910 [2024-09-27 15:25:16.887005] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.910 [2024-09-27 15:25:16.887015] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.910 [2024-09-27 15:25:16.887069] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.910 ctrlr pubkey: 00:20:33.910 00000000 d8 d2 1a e7 08 60 3d d9 26 48 bf 8d e4 cc 95 fc .....`=.&H...... 00:20:33.910 00000010 db f4 f8 86 50 50 d6 7c 01 80 aa b5 5e 5b 81 41 ....PP.|....^[.A 00:20:33.910 00000020 a0 c5 06 3e 37 c7 b3 c0 8a 4a 5f 78 ea 35 fa e6 ...>7....J_x.5.. 00:20:33.910 00000030 c7 28 e6 4c e5 04 6d 3d 34 17 33 80 c0 d2 60 93 .(.L..m=4.3...`. 00:20:33.910 00000040 74 aa ba f7 2e 98 ca 4b 3a 14 60 aa 84 2a 80 76 t......K:.`..*.v 00:20:33.910 00000050 da 08 10 5e a7 30 d6 25 46 e6 8b 1e c3 93 96 a7 ...^.0.%F....... 00:20:33.910 00000060 3a 6b 49 11 c5 87 39 b3 b2 78 0a d0 f2 79 56 85 :kI...9..x...yV. 00:20:33.910 00000070 aa 9d ed 7b 3c 8d 19 12 ad 98 b8 a1 b8 1b 96 39 ...{<..........9 00:20:33.910 00000080 e6 5f 10 0c e8 05 02 a9 8f dc 4e 76 35 1f a7 36 ._........Nv5..6 00:20:33.910 00000090 97 e8 ee 82 1a 8f 2b 41 40 d5 ff 2b 3f 73 f4 e2 ......+A@..+?s.. 00:20:33.910 000000a0 04 96 07 69 33 51 a8 3e 1b a8 ce 99 4c f4 9c f2 ...i3Q.>....L... 00:20:33.910 000000b0 1b ba d1 a8 1f b6 93 a9 3f 21 7d 1c 75 b6 e2 c5 ........?!}.u... 00:20:33.910 000000c0 22 09 5f 24 e8 04 3b a6 65 79 a9 46 a0 d1 35 2f "._$..;.ey.F..5/ 00:20:33.910 000000d0 92 17 9d 4a 66 f3 1e 6b d7 0e be 58 06 8d 96 e3 ...Jf..k...X.... 00:20:33.910 000000e0 6e 2e bc 46 40 f6 55 7b 59 f8 f4 c2 d4 a0 e9 e0 n..F@.U{Y....... 
00:20:33.910 000000f0 82 3a 29 6d c2 c6 79 d0 78 cd 3d ec ea 02 93 39 .:)m..y.x.=....9 00:20:33.910 00000100 08 4a 4c ba c0 02 8d 56 c2 64 3b a1 84 33 b8 3a .JL....V.d;..3.: 00:20:33.910 00000110 11 c1 69 99 a7 38 e9 75 bc 65 cf 44 50 22 e2 5f ..i..8.u.e.DP"._ 00:20:33.910 00000120 89 7b 47 76 3b cf 43 25 cb 36 a8 fd f6 87 49 bd .{Gv;.C%.6....I. 00:20:33.910 00000130 dd c5 bd 0e 51 46 62 d2 15 1a a1 9d 18 d1 4b 55 ....QFb.......KU 00:20:33.910 00000140 23 c9 7d c9 8a ce 28 1b 2a fb 7f df ea 52 0e f1 #.}...(.*....R.. 00:20:33.910 00000150 aa 71 83 7e 47 3f 64 5a 9d 50 64 99 27 13 2a 60 .q.~G?dZ.Pd.'.*` 00:20:33.910 00000160 1a e4 32 27 19 2c 40 39 20 95 d5 1c d3 8b ed 21 ..2'.,@9 ......! 00:20:33.910 00000170 ef 9d 96 02 cb ec 77 69 ed 72 36 11 3a 6e 37 3f ......wi.r6.:n7? 00:20:33.910 host pubkey: 00:20:33.910 00000000 33 85 bf 88 5a 87 f7 de 59 3d 3e 8e eb 14 60 f1 3...Z...Y=>...`. 00:20:33.910 00000010 be 12 51 37 08 86 a5 32 bb bd fc f7 95 87 ec 7b ..Q7...2.......{ 00:20:33.910 00000020 2f fd 59 43 49 ea f4 31 76 72 90 eb 33 e3 12 34 /.YCI..1vr..3..4 00:20:33.910 00000030 ec 2e f2 7a 88 db 6f a6 85 12 8d c2 9b 59 c1 1c ...z..o......Y.. 00:20:33.910 00000040 ad fb 8d c1 d9 2c 9c 31 d0 04 81 5e 5f 40 36 00 .....,.1...^_@6. 00:20:33.910 00000050 e8 04 a5 a8 1b bb 85 71 b9 ff f0 2a b6 6c cc 22 .......q...*.l." 00:20:33.910 00000060 29 84 51 ba 3a f2 c2 d0 95 a0 64 35 27 f7 3b 8b ).Q.:.....d5'.;. 00:20:33.910 00000070 83 20 45 ca 71 a3 a3 79 9a b6 be 2a 1b 8d 02 1d . E.q..y...*.... 00:20:33.910 00000080 4b a4 85 b5 a7 ba 53 d3 06 b1 d0 0e 99 ac 01 bd K.....S......... 00:20:33.910 00000090 56 04 ef e1 57 46 69 04 b6 04 fd 0c b5 d9 05 0b V...WFi......... 00:20:33.910 000000a0 b9 61 03 f7 6c aa 94 b7 44 10 18 a4 32 9b 49 38 .a..l...D...2.I8 00:20:33.910 000000b0 74 c6 b7 df 2a 78 15 1f ff 3a 3a 21 c4 45 b1 35 t...*x...::!.E.5 00:20:33.910 000000c0 bf 04 eb 5f 29 bd a4 5e a7 eb e0 97 a6 e0 7c 3a ..._)..^......|: 00:20:33.910 000000d0 f8 5a 52 6e 70 25 79 8b 2e fc 99 2a 5e e7 16 cb .ZRnp%y....*^... 00:20:33.910 000000e0 d4 4b 33 56 b4 6e 5c 99 25 97 1d 61 71 93 6b bd .K3V.n\.%..aq.k. 00:20:33.910 000000f0 24 98 7c 00 3d 28 57 66 61 10 19 ff 2c d4 26 89 $.|.=(Wfa...,.&. 00:20:33.910 00000100 78 4f de 4c 9f 34 f3 da d2 10 21 59 68 f4 78 53 xO.L.4....!Yh.xS 00:20:33.910 00000110 e4 50 ab 7c 36 e3 5e 0e 57 7b f6 15 a0 b3 88 5a .P.|6.^.W{.....Z 00:20:33.910 00000120 d0 40 d6 22 b9 59 94 17 dd 55 02 85 37 ab 4b 89 .@.".Y...U..7.K. 00:20:33.910 00000130 66 70 8f 29 50 b9 29 4a 96 fb 0d 57 08 5a 4f f1 fp.)P.)J...W.ZO. 00:20:33.910 00000140 3a 89 43 6c 5f af 42 06 ed 94 c6 b5 f7 d6 28 f2 :.Cl_.B.......(. 00:20:33.910 00000150 e6 25 4e d2 58 ed f3 f7 2d 03 1f 6f d0 b2 4f c2 .%N.X...-..o..O. 00:20:33.910 00000160 c3 41 3b d4 b3 b5 4c 65 6d ac 38 e5 4b a4 37 6d .A;...Lem.8.K.7m 00:20:33.910 00000170 95 eb 36 62 4c 53 ff 2c 20 bd 9e 8b 05 a6 aa 14 ..6bLS., ....... 00:20:33.910 dh secret: 00:20:33.910 00000000 cc 61 d0 27 8b 68 2e f6 c8 b3 a7 f9 6c b9 fa c7 .a.'.h......l... 00:20:33.910 00000010 6c 6e 5c ab 85 62 e2 9e 15 59 fa 90 e1 e7 1d 09 ln\..b...Y...... 00:20:33.910 00000020 9c 8e cf 90 a4 58 2a 7f ac 1d f9 8b d3 dd 0f 60 .....X*........` 00:20:33.910 00000030 c0 f6 0d 2c 9c 1f d6 07 2c 96 8f da df f7 f3 2a ...,....,......* 00:20:33.910 00000040 e8 a0 79 0c 42 60 c0 9b c4 c1 3b f0 f8 2c 96 47 ..y.B`....;..,.G 00:20:33.910 00000050 9a 37 b1 68 b1 b0 67 86 3b b3 34 d6 86 54 03 8a .7.h..g.;.4..T.. 
00:20:33.910 00000060 3d 98 88 d5 35 d1 33 7d 13 0d 69 91 32 28 4a 50 =...5.3}..i.2(JP 00:20:33.910 00000070 13 05 48 43 49 6b 2e f5 43 8a 68 5d e0 8b 3c 59 ..HCIk..C.h]..\pJ 00:20:33.911 00000150 a4 8a ec 8d e3 d0 24 45 48 a4 1a 5a 5b 14 14 29 ......$EH..Z[..) 00:20:33.911 00000160 68 6e 93 8f 12 4b 3f 2b 73 bf 40 21 9c 55 82 33 hn...K?+s.@!.U.3 00:20:33.911 00000170 a3 eb d6 7d a5 bb 2a 95 b8 f4 ac 8e 2f 81 80 e6 ...}..*...../... 00:20:33.911 [2024-09-27 15:25:17.076873] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=2, dhgroup=2, seq=3428451763, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.911 [2024-09-27 15:25:17.082352] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.911 [2024-09-27 15:25:17.082379] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.911 [2024-09-27 15:25:17.082396] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.911 [2024-09-27 15:25:17.082402] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.911 [2024-09-27 15:25:17.188707] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.911 [2024-09-27 15:25:17.188726] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.911 [2024-09-27 15:25:17.188734] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.911 [2024-09-27 15:25:17.188743] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.911 [2024-09-27 15:25:17.188801] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.911 ctrlr pubkey: 00:20:33.911 00000000 01 8d 9e 29 04 ca c7 5d 23 af f2 ca a1 1f 0c 80 ...)...]#....... 00:20:33.911 00000010 33 eb c0 80 39 9a b7 e5 4a f9 bf 6e f7 47 03 2f 3...9...J..n.G./ 00:20:33.911 00000020 d4 14 00 83 fd 08 4f b4 0b 6a 4e d7 5d 3b 71 59 ......O..jN.];qY 00:20:33.911 00000030 7f 16 40 af 78 2c 19 c0 72 8c 88 36 51 ff da 04 ..@.x,..r..6Q... 00:20:33.911 00000040 29 1f 17 40 30 6f df 12 53 e9 58 16 4b 65 4f 59 )..@0o..S.X.KeOY 00:20:33.911 00000050 8c eb eb 22 ae aa f6 d8 96 df e4 d6 f5 ec d9 55 ..."...........U 00:20:33.911 00000060 1d b8 ed d4 8e 2e c2 f8 c3 57 0d b9 24 fd 9e fd .........W..$... 00:20:33.911 00000070 c1 88 c6 28 10 29 61 71 ca 66 34 5d f5 32 28 a7 ...(.)aq.f4].2(. 00:20:33.911 00000080 ca bc 62 c2 51 0e 11 a0 ac 48 55 58 77 15 31 5d ..b.Q....HUXw.1] 00:20:33.911 00000090 24 0c 30 c0 71 bd be 62 8b 52 54 0b 37 bd 69 4f $.0.q..b.RT.7.iO 00:20:33.911 000000a0 62 56 47 3f af a5 97 14 20 72 7a ed 85 9a 86 1e bVG?.... rz..... 00:20:33.911 000000b0 81 6f 24 bc 65 82 f5 e8 3a 94 e3 b3 30 bf 0a b0 .o$.e...:...0... 
00:20:33.911 000000c0 a8 d8 aa 16 f2 92 72 1b af ed 78 f9 3c f6 6a 49 ......r...x.<.jI 00:20:33.911 000000d0 d2 10 bc cc 76 5b 95 95 87 17 98 f0 65 4d 53 6d ....v[......eMSm 00:20:33.911 000000e0 c4 12 71 71 35 ac 09 ff 84 06 6f f5 11 57 dd 9c ..qq5.....o..W.. 00:20:33.911 000000f0 3f 21 eb 23 e4 e5 49 de 51 c1 4e cd df b1 3c 0f ?!.#..I.Q.N...<. 00:20:33.911 00000100 d8 a8 ac 6f 17 52 b9 de 18 c1 ee 97 5c 08 cf f4 ...o.R......\... 00:20:33.911 00000110 f4 15 83 9a 59 95 cd c1 d6 8d 0b 67 4b a6 e2 87 ....Y......gK... 00:20:33.911 00000120 8f 46 0e 61 9a 78 a8 f2 c3 d8 70 b6 34 da 86 16 .F.a.x....p.4... 00:20:33.911 00000130 fa a7 25 5a 9b ae 0a 60 3c 67 8f c3 2a 20 65 d7 ..%Z...` 00:20:33.912 00000160 98 11 92 0e 52 e3 e4 6b c9 ff d5 0e 61 31 e6 b5 ....R..k....a1.. 00:20:33.912 00000170 bc 54 05 78 09 c3 f2 e6 19 ec cb 7c e2 4f 5c a6 .T.x.......|.O\. 00:20:33.912 dh secret: 00:20:33.912 00000000 1c 17 d4 b3 bf 05 da f6 04 e4 5c d2 69 27 6e c2 ..........\.i'n. 00:20:33.912 00000010 b8 da 74 ae bf 16 3d 8e a6 6c 97 29 65 05 65 b3 ..t...=..l.)e.e. 00:20:33.912 00000020 e1 a4 19 f0 c3 6c 08 a0 61 8e 61 85 79 ab b2 a7 .....l..a.a.y... 00:20:33.912 00000030 cc fc 9d e1 b8 32 a0 32 95 9c 14 01 af 35 1a ee .....2.2.....5.. 00:20:33.912 00000040 58 01 ee 4a ea 7a 09 98 81 da e1 83 53 a5 36 51 X..J.z......S.6Q 00:20:33.912 00000050 2d f3 9a be 06 b7 45 53 f7 ed 8e 4a 70 eb 85 ed -.....ES...Jp... 00:20:33.912 00000060 2d c5 5e 89 b9 db ac 3b c5 34 f0 6a 07 d2 f9 6f -.^....;.4.j...o 00:20:33.912 00000070 39 0f 73 31 22 91 7f b1 91 ed c0 40 fa ca 13 6b 9.s1"......@...k 00:20:33.912 00000080 96 37 0e b4 1b 28 5c 32 71 e9 61 62 c3 47 ea 9f .7...(\2q.ab.G.. 00:20:33.912 00000090 9c cd 77 35 83 58 13 4c 9f 86 4d dc a0 bc ca 24 ..w5.X.L..M....$ 00:20:33.912 000000a0 49 f7 b3 79 72 e0 d7 90 a1 28 2d c8 86 2b c2 d7 I..yr....(-..+.. 00:20:33.912 000000b0 2c 0c 31 86 63 e5 55 5d 16 13 8c 8a 01 67 fd 8b ,.1.c.U].....g.. 00:20:33.912 000000c0 60 76 f9 6d a9 db 76 f4 92 ac ad 46 e8 0d a3 be `v.m..v....F.... 00:20:33.912 000000d0 78 2f dc 88 1c e8 70 ff 09 9c 70 0f e9 2d 9e 91 x/....p...p..-.. 00:20:33.912 000000e0 e3 9c d8 1c b8 d2 e2 d3 ec 49 28 32 95 bb f6 43 .........I(2...C 00:20:33.912 000000f0 5c 3d c7 cc 44 f2 33 ae f4 fe f5 41 68 20 1f 88 \=..D.3....Ah .. 00:20:33.912 00000100 8d f1 b7 38 45 64 bb ce 13 c8 54 4b f7 f4 ed f0 ...8Ed....TK.... 00:20:33.912 00000110 cb c3 87 17 b5 fd 43 07 b6 16 f4 f7 34 21 dd 46 ......C.....4!.F 00:20:33.912 00000120 f1 19 1f e7 5a 8b 4a e6 b3 2a 99 ba be 80 ce c7 ....Z.J..*...... 00:20:33.912 00000130 95 86 c9 db c1 ec f3 27 c0 54 71 1f 5b b0 d5 79 .......'.Tq.[..y 00:20:33.912 00000140 59 c9 ea be c3 63 1e ea 72 f5 18 3e 42 f5 f8 4b Y....c..r..>B..K 00:20:33.912 00000150 89 d9 67 8b 3f 0b 98 b8 e5 70 3f fe c8 67 e7 36 ..g.?....p?..g.6 00:20:33.912 00000160 5f 10 f6 7c 46 d0 37 e5 8c fd 4f 12 f1 27 18 19 _..|F.7...O..'.. 00:20:33.912 00000170 a6 11 c8 35 2f 3e dc e9 0e 83 06 cc 71 d0 cc ad ...5/>......q... 
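The dump above closes one complete DH-HMAC-CHAP exchange for this qpair. Throughout the transcript the "auth state" values printed by nvme_auth_set_state move through the same fixed sequence (negotiate, await-negotiate, await-challenge, await-reply, await-success1, await-success2, done), with await-success2 skipped on some exchanges. The short Python sketch below is illustrative only, not SPDK code; the state names are copied verbatim from the DEBUG lines, and the checker merely verifies that a captured sequence never moves backwards.

# Illustrative sketch (not SPDK code): verify that a list of "auth state"
# values, as printed by nvme_auth_set_state above, never moves backwards.
EXPECTED_ORDER = [
    "negotiate", "await-negotiate", "await-challenge",
    "await-reply", "await-success1", "await-success2", "done",
]

def in_expected_order(states):
    """True if every state is known and the sequence only moves forward
    (states may be skipped, as await-success2 sometimes is above)."""
    last = -1
    for state in states:
        if state not in EXPECTED_ORDER:
            return False
        idx = EXPECTED_ORDER.index(state)
        if idx < last:
            return False
        last = idx
    return True

# One of the qpair exchanges logged above:
assert in_expected_order([
    "await-reply", "await-success1", "await-success2", "done",
])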
00:20:33.912 [2024-09-27 15:25:17.196118] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=2, dhgroup=2, seq=3428451764, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.912 [2024-09-27 15:25:17.196183] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.912 [2024-09-27 15:25:17.214310] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.912 [2024-09-27 15:25:17.214349] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.912 [2024-09-27 15:25:17.214356] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.912 [2024-09-27 15:25:17.374227] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.912 [2024-09-27 15:25:17.374252] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.912 [2024-09-27 15:25:17.374259] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.912 [2024-09-27 15:25:17.374304] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.912 [2024-09-27 15:25:17.374328] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.912 ctrlr pubkey: 00:20:33.912 00000000 79 b0 32 6a 1f 73 1f a8 c1 29 d4 87 b6 86 53 d3 y.2j.s...)....S. 00:20:33.912 00000010 1f 4b 36 e4 2d f4 65 8a 1a dd d4 6e 42 6e 6f e3 .K6.-.e....nBno. 00:20:33.912 00000020 4e 71 25 f5 ec 6b b4 e6 76 97 30 06 7a 29 05 47 Nq%..k..v.0.z).G 00:20:33.912 00000030 f3 07 34 21 bb b7 59 e7 66 17 ac a3 fe cd 5d 0e ..4!..Y.f.....]. 00:20:33.912 00000040 7d 24 4d 5a 04 98 1c 33 1a fa 5b ce c0 7d 8d f9 }$MZ...3..[..}.. 00:20:33.912 00000050 bc 0b 3c b9 7b f6 e5 85 7e 23 10 3d e0 5d 6d 20 ..<.{...~#.=.]m 00:20:33.912 00000060 57 e1 c6 80 cd 0a 14 25 4f 2a 43 bb 82 69 cb 5f W......%O*C..i._ 00:20:33.912 00000070 95 7e 84 96 d0 7d e2 98 21 ae 70 1a 94 ba 44 47 .~...}..!.p...DG 00:20:33.912 00000080 9a 8b 60 84 76 e1 f2 3c f2 17 c2 ac 11 47 0f 4f ..`.v..<.....G.O 00:20:33.912 00000090 c1 e3 a5 ca 62 40 93 40 ef cd e4 7b 1d 7c 3f ad ....b@.@...{.|?. 00:20:33.912 000000a0 07 4e a9 51 e7 58 9a b9 14 36 d7 38 b0 19 54 c5 .N.Q.X...6.8..T. 00:20:33.912 000000b0 97 76 62 4d e1 17 71 04 5d 41 cf 6d 91 ac f5 13 .vbM..q.]A.m.... 00:20:33.912 000000c0 1e 83 c6 7d de d0 77 67 e8 eb 2f e6 d5 ee e3 04 ...}..wg../..... 00:20:33.912 000000d0 e4 5c fe 29 28 63 43 31 76 cf 54 39 01 c8 d8 9c .\.)(cC1v.T9.... 00:20:33.912 000000e0 05 8c 36 57 59 12 e0 0e 29 df 97 46 42 80 d6 fe ..6WY...)..FB... 00:20:33.912 000000f0 71 66 ed 38 d0 ec 3d 57 7e 04 49 0c 27 1b 79 b7 qf.8..=W~.I.'.y. 00:20:33.912 00000100 82 8f 16 57 bf 4d 96 b8 66 fc 40 ee 2a 93 8f f0 ...W.M..f.@.*... 00:20:33.912 00000110 31 17 f8 22 cb 01 7a 69 b3 e6 a6 f2 9d 6b 4c a0 1.."..zi.....kL. 
00:20:33.912 00000120 34 d8 68 58 cb 1c 69 50 c9 65 34 b8 aa 1f 7d 7c 4.hX..iP.e4...}| 00:20:33.912 00000130 e2 bf f6 e0 33 d2 f4 e9 03 26 25 88 5d 7d d7 90 ....3....&%.]}.. 00:20:33.912 00000140 64 8e 67 e0 4a b8 e1 3a fb c8 71 19 f9 0e 38 69 d.g.J..:..q...8i 00:20:33.912 00000150 0e e3 c7 8a 30 9a 8b 0f 33 83 fc e7 33 71 e7 f9 ....0...3...3q.. 00:20:33.912 00000160 4a 0e 7d cd e8 7f cd ab d1 11 f7 0a 23 3c 5b ef J.}.........#<[. 00:20:33.912 00000170 c0 32 39 bf ef 8d fb 8c 96 b2 25 54 bb 86 75 71 .29.......%T..uq 00:20:33.912 00000180 8e 56 70 2a 1f 0e 40 dc de a8 9e 6f c8 cf 96 93 .Vp*..@....o.... 00:20:33.912 00000190 d3 0d d0 5a 00 38 3e 87 16 50 a1 d4 b8 56 27 04 ...Z.8>..P...V'. 00:20:33.912 000001a0 16 bc 8d 46 37 19 6b a5 25 4f e4 14 29 57 eb 22 ...F7.k.%O..)W." 00:20:33.912 000001b0 6a 90 1a 58 35 f2 d8 f9 0a 31 f3 61 7d e6 b1 ee j..X5....1.a}... 00:20:33.912 000001c0 0b 6f b1 50 5b f0 61 63 2f 57 1b bb 6f 2f 22 9d .o.P[.ac/W..o/". 00:20:33.912 000001d0 39 b7 2d 47 07 9f c2 4a 88 f7 13 7b 61 9f 3b 79 9.-G...J...{a.;y 00:20:33.912 000001e0 72 75 33 6d 52 45 34 17 20 f9 38 af eb a5 ac 4c ru3mRE4. .8....L 00:20:33.912 000001f0 ca e2 c9 e5 13 22 91 0f 90 7c f4 e1 c5 7a b1 b2 ....."...|...z.. 00:20:33.912 host pubkey: 00:20:33.912 00000000 ac 94 e5 92 02 c1 2f e1 4a fb fb 44 ad 69 58 ea ....../.J..D.iX. 00:20:33.912 00000010 ff 81 f4 66 b4 69 72 ce b9 ff ca d8 00 5f 4e f2 ...f.ir......_N. 00:20:33.912 00000020 5b d1 52 85 77 0e fc af 45 5e 6d 71 9b f8 6d 1d [.R.w...E^mq..m. 00:20:33.912 00000030 5c 36 43 4f 4b f0 3c 27 cc 49 6e c3 37 a4 d7 4d \6COK.<'.In.7..M 00:20:33.912 00000040 bf 83 e1 36 4e af 3d e9 0e a3 29 4c 9e 9a 90 c0 ...6N.=...)L.... 00:20:33.912 00000050 e9 d2 1c b1 c1 52 35 13 ef 37 78 f8 1e 48 f2 d7 .....R5..7x..H.. 00:20:33.912 00000060 9d 19 f0 c0 4f d2 c0 54 93 93 31 01 bb 93 a5 d0 ....O..T..1..... 00:20:33.912 00000070 3f 79 57 ed c8 6a ac 61 0c 07 12 a0 af 43 1f 4b ?yW..j.a.....C.K 00:20:33.912 00000080 80 59 88 29 f0 6f c2 42 1a 43 8c f2 14 0a 64 e0 .Y.).o.B.C....d. 00:20:33.912 00000090 34 e4 8e 59 9a ce ab a3 7c 7e 5c b9 ee b1 06 bc 4..Y....|~\..... 00:20:33.912 000000a0 be 8f eb b1 6b f6 12 1d 76 33 1b 1b af fa c7 bf ....k...v3...... 00:20:33.912 000000b0 e6 ae 22 b1 a7 30 e8 da 56 02 f0 66 46 b0 97 09 .."..0..V..fF... 00:20:33.912 000000c0 a9 d7 c0 6c 36 01 ca be 38 86 b3 d5 a5 cb d3 21 ...l6...8......! 00:20:33.912 000000d0 96 39 51 6b fc ff 7a ff 7a 95 09 e9 84 89 48 27 .9Qk..z.z.....H' 00:20:33.912 000000e0 c6 13 4d dc 0d df 0b fe e3 7b 05 f6 4f 19 3e 27 ..M......{..O.>' 00:20:33.912 000000f0 e2 c4 70 05 41 bc c3 f7 ac 88 1d 4a 0e 85 57 23 ..p.A......J..W# 00:20:33.912 00000100 6c 68 cd 5f e6 4d c4 cd 39 8d f5 ab 85 d2 15 a9 lh._.M..9....... 00:20:33.912 00000110 aa 57 16 74 b7 31 ed 4f 93 04 c2 98 5c 52 0a c5 .W.t.1.O....\R.. 00:20:33.912 00000120 cd 6b ac 0a 49 33 48 db 74 51 3c 51 9f 6d 38 e9 .k..I3H.tQ.. 00:20:33.912 000001c0 83 b7 2d 64 ee 02 55 77 0b 00 d2 10 04 58 61 82 ..-d..Uw.....Xa. 00:20:33.912 000001d0 6e 75 35 be d4 26 d1 c6 91 7e 01 66 39 53 f9 58 nu5..&...~.f9S.X 00:20:33.912 000001e0 fc e8 90 03 4a 2e 02 29 62 2a cb 08 bc 0a 54 88 ....J..)b*....T. 00:20:33.912 000001f0 83 16 9e c4 30 eb bd 65 b7 26 f2 a9 29 f5 98 23 ....0..e.&..)..# 00:20:33.912 dh secret: 00:20:33.913 00000000 11 84 ab 29 f0 fb 17 36 27 61 72 09 6c b3 a8 2d ...)...6'ar.l..- 00:20:33.913 00000010 f9 ae 3e e4 38 2d 8a fa c9 52 18 f1 d7 39 7d 85 ..>.8-...R...9}. 
00:20:33.913 00000020 37 47 ac 00 75 0e c1 9d 10 40 ad 96 99 f6 6b 7e 7G..u....@....k~ 00:20:33.913 00000030 81 81 42 61 26 ab 56 c2 f6 39 32 8b 41 64 71 68 ..Ba&.V..92.Adqh 00:20:33.913 00000040 44 80 c5 98 e6 45 35 14 a8 77 bb 9f 5e 58 49 71 D....E5..w..^XIq 00:20:33.913 00000050 c5 17 78 9b f4 a5 2e 37 bf 3e 13 5c 6f 2a 57 70 ..x....7.>.\o*Wp 00:20:33.913 00000060 d2 ce 89 73 c6 2f 8f f7 d2 76 43 de 81 4c 27 5a ...s./...vC..L'Z 00:20:33.913 00000070 4b a4 2d 59 64 83 de ea 35 fb aa ee 4e 96 4f b3 K.-Yd...5...N.O. 00:20:33.913 00000080 b6 98 2a 52 a3 d6 83 46 bb 9b a8 5a 5c 65 36 e5 ..*R...F...Z\e6. 00:20:33.913 00000090 35 46 2d 75 9b 9a e0 83 e3 12 90 5c b4 3e 7a 10 5F-u.......\.>z. 00:20:33.913 000000a0 e5 54 20 8d da 51 84 f9 a2 5d d1 d0 c5 ea cd 3a .T ..Q...].....: 00:20:33.913 000000b0 c0 c9 17 f5 04 a3 33 4a 4d 80 f9 f4 fd d5 14 e7 ......3JM....... 00:20:33.913 000000c0 a0 28 47 b7 2b 99 9c 47 a5 22 f0 ab 20 59 55 58 .(G.+..G.".. YUX 00:20:33.913 000000d0 83 dd 0a 2d cd bf df 54 4c 20 9d 58 32 60 69 86 ...-...TL .X2`i. 00:20:33.913 000000e0 14 dc ee b8 35 89 57 8d 48 ad ab 55 58 e2 4f a8 ....5.W.H..UX.O. 00:20:33.913 000000f0 94 e3 61 f6 20 75 6a 0c de 21 8c ea c9 68 fd 39 ..a. uj..!...h.9 00:20:33.913 00000100 b8 30 88 33 a8 0e bc dd 25 f1 f1 91 0d ae ec 64 .0.3....%......d 00:20:33.913 00000110 89 da 90 98 bd 64 a4 3d c1 51 31 3f f5 f3 f7 aa .....d.=.Q1?.... 00:20:33.913 00000120 92 fa f6 59 db 30 7e 88 0c d9 b1 5c b1 5b d1 f1 ...Y.0~....\.[.. 00:20:33.913 00000130 91 5d 7c ad 45 35 94 04 c1 d9 eb c0 e8 b0 dd 17 .]|.E5.......... 00:20:33.913 00000140 86 ef 3d 19 a7 16 77 03 13 ce f5 1a 68 13 f3 c2 ..=...w.....h... 00:20:33.913 00000150 6b af 06 7b 40 7e df 43 73 4c 46 6d f5 65 a5 2b k..{@~.CsLFm.e.+ 00:20:33.913 00000160 d4 0f 06 6f d9 3d 35 5a 6d c7 76 57 a6 7c 5b 4e ...o.=5Zm.vW.|[N 00:20:33.913 00000170 b7 66 ea 54 70 34 98 d4 a4 ba 98 23 a2 67 ba 4d .f.Tp4.....#.g.M 00:20:33.913 00000180 f6 67 56 3f d6 9a 95 50 67 16 60 1a 3a ad 5c 92 .gV?...Pg.`.:.\. 00:20:33.913 00000190 0d 8d 00 c7 b1 e2 a2 ad 85 f2 ec c0 6d 06 83 d7 ............m... 00:20:33.913 000001a0 93 c4 91 9e 8e 0f bc 6b d3 38 74 9e 00 fb 7e 8d .......k.8t...~. 00:20:33.913 000001b0 89 4f 26 6c ed 62 fa 03 4a bd 42 3d 50 c3 eb 07 .O&l.b..J.B=P... 00:20:33.913 000001c0 bf 4e 4c 47 8d 95 5f eb 09 9b 3f 1e 93 00 36 de .NLG.._...?...6. 00:20:33.913 000001d0 e7 28 47 bb 97 68 e6 fa db 19 04 53 24 bd 70 1d .(G..h.....S$.p. 00:20:33.913 000001e0 5a f6 af 49 5b f3 94 5b 55 d9 de 25 52 d2 ae b0 Z..I[..[U..%R... 00:20:33.913 000001f0 e8 58 27 84 e4 be d0 32 5e da 13 fb 4c 7a 52 86 .X'....2^...LzR. 
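The reply records in this transcript print their parameters numerically (key, hash, dhgroup, seq, tid, len), and the negotiate lines already spell out what the numbers mean: digest 2 is sha384, dhgroup 2 is ffdhe3072, dhgroup 3 is ffdhe4096, and len=48 matches the 48-byte SHA-384 digest. Below is a minimal decoding sketch, deliberately limited to the values this log itself names (anything beyond that would be an assumption).

# Illustrative sketch: decode the numeric fields of the nvme_auth_send_reply
# DEBUG lines. Only mappings that the negotiate lines in this log spell out
# are included; other IDs are left unmapped on purpose.
import hashlib

DIGESTS = {2: "sha384"}                      # "digest: 2 (sha384)"
DHGROUPS = {2: "ffdhe3072", 3: "ffdhe4096"}  # "dhgroup: 2 (ffdhe3072)" / "3 (ffdhe4096)"

def describe_reply(hash_id, dhgroup_id, length):
    digest = DIGESTS.get(hash_id, f"unknown({hash_id})")
    group = DHGROUPS.get(dhgroup_id, f"unknown({dhgroup_id})")
    if digest == "sha384":
        # len=48 in the log equals SHA-384's digest size.
        assert length == hashlib.sha384().digest_size
    return f"hash={digest}, dhgroup={group}, response length={length} bytes"

# e.g. the "key=key0, hash=2, dhgroup=3, ..., len=48" record that follows:
print(describe_reply(2, 3, 48))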
00:20:33.913 [2024-09-27 15:25:17.390737] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=2, dhgroup=3, seq=3428451765, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.913 [2024-09-27 15:25:17.408890] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.913 [2024-09-27 15:25:17.408933] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.913 [2024-09-27 15:25:17.408951] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.913 [2024-09-27 15:25:17.408978] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.913 [2024-09-27 15:25:17.408989] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.913 [2024-09-27 15:25:17.515377] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.913 [2024-09-27 15:25:17.515396] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.913 [2024-09-27 15:25:17.515404] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.913 [2024-09-27 15:25:17.515414] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.913 [2024-09-27 15:25:17.515468] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.913 ctrlr pubkey: 00:20:33.913 00000000 79 b0 32 6a 1f 73 1f a8 c1 29 d4 87 b6 86 53 d3 y.2j.s...)....S. 00:20:33.913 00000010 1f 4b 36 e4 2d f4 65 8a 1a dd d4 6e 42 6e 6f e3 .K6.-.e....nBno. 00:20:33.913 00000020 4e 71 25 f5 ec 6b b4 e6 76 97 30 06 7a 29 05 47 Nq%..k..v.0.z).G 00:20:33.913 00000030 f3 07 34 21 bb b7 59 e7 66 17 ac a3 fe cd 5d 0e ..4!..Y.f.....]. 00:20:33.913 00000040 7d 24 4d 5a 04 98 1c 33 1a fa 5b ce c0 7d 8d f9 }$MZ...3..[..}.. 00:20:33.913 00000050 bc 0b 3c b9 7b f6 e5 85 7e 23 10 3d e0 5d 6d 20 ..<.{...~#.=.]m 00:20:33.913 00000060 57 e1 c6 80 cd 0a 14 25 4f 2a 43 bb 82 69 cb 5f W......%O*C..i._ 00:20:33.913 00000070 95 7e 84 96 d0 7d e2 98 21 ae 70 1a 94 ba 44 47 .~...}..!.p...DG 00:20:33.913 00000080 9a 8b 60 84 76 e1 f2 3c f2 17 c2 ac 11 47 0f 4f ..`.v..<.....G.O 00:20:33.913 00000090 c1 e3 a5 ca 62 40 93 40 ef cd e4 7b 1d 7c 3f ad ....b@.@...{.|?. 00:20:33.913 000000a0 07 4e a9 51 e7 58 9a b9 14 36 d7 38 b0 19 54 c5 .N.Q.X...6.8..T. 00:20:33.913 000000b0 97 76 62 4d e1 17 71 04 5d 41 cf 6d 91 ac f5 13 .vbM..q.]A.m.... 00:20:33.913 000000c0 1e 83 c6 7d de d0 77 67 e8 eb 2f e6 d5 ee e3 04 ...}..wg../..... 00:20:33.913 000000d0 e4 5c fe 29 28 63 43 31 76 cf 54 39 01 c8 d8 9c .\.)(cC1v.T9.... 00:20:33.913 000000e0 05 8c 36 57 59 12 e0 0e 29 df 97 46 42 80 d6 fe ..6WY...)..FB... 00:20:33.913 000000f0 71 66 ed 38 d0 ec 3d 57 7e 04 49 0c 27 1b 79 b7 qf.8..=W~.I.'.y. 00:20:33.913 00000100 82 8f 16 57 bf 4d 96 b8 66 fc 40 ee 2a 93 8f f0 ...W.M..f.@.*... 
00:20:33.913 00000110 31 17 f8 22 cb 01 7a 69 b3 e6 a6 f2 9d 6b 4c a0 1.."..zi.....kL. 00:20:33.913 00000120 34 d8 68 58 cb 1c 69 50 c9 65 34 b8 aa 1f 7d 7c 4.hX..iP.e4...}| 00:20:33.913 00000130 e2 bf f6 e0 33 d2 f4 e9 03 26 25 88 5d 7d d7 90 ....3....&%.]}.. 00:20:33.913 00000140 64 8e 67 e0 4a b8 e1 3a fb c8 71 19 f9 0e 38 69 d.g.J..:..q...8i 00:20:33.913 00000150 0e e3 c7 8a 30 9a 8b 0f 33 83 fc e7 33 71 e7 f9 ....0...3...3q.. 00:20:33.913 00000160 4a 0e 7d cd e8 7f cd ab d1 11 f7 0a 23 3c 5b ef J.}.........#<[. 00:20:33.913 00000170 c0 32 39 bf ef 8d fb 8c 96 b2 25 54 bb 86 75 71 .29.......%T..uq 00:20:33.913 00000180 8e 56 70 2a 1f 0e 40 dc de a8 9e 6f c8 cf 96 93 .Vp*..@....o.... 00:20:33.913 00000190 d3 0d d0 5a 00 38 3e 87 16 50 a1 d4 b8 56 27 04 ...Z.8>..P...V'. 00:20:33.913 000001a0 16 bc 8d 46 37 19 6b a5 25 4f e4 14 29 57 eb 22 ...F7.k.%O..)W." 00:20:33.913 000001b0 6a 90 1a 58 35 f2 d8 f9 0a 31 f3 61 7d e6 b1 ee j..X5....1.a}... 00:20:33.913 000001c0 0b 6f b1 50 5b f0 61 63 2f 57 1b bb 6f 2f 22 9d .o.P[.ac/W..o/". 00:20:33.913 000001d0 39 b7 2d 47 07 9f c2 4a 88 f7 13 7b 61 9f 3b 79 9.-G...J...{a.;y 00:20:33.913 000001e0 72 75 33 6d 52 45 34 17 20 f9 38 af eb a5 ac 4c ru3mRE4. .8....L 00:20:33.913 000001f0 ca e2 c9 e5 13 22 91 0f 90 7c f4 e1 c5 7a b1 b2 ....."...|...z.. 00:20:33.913 host pubkey: 00:20:33.913 00000000 08 c2 6d 3c bc 56 8e f4 47 5a 05 c1 4f 0f cc 8f ..m<.V..GZ..O... 00:20:33.913 00000010 eb 73 07 29 a5 60 88 79 02 13 5c 24 4f dd 85 52 .s.).`.y..\$O..R 00:20:33.913 00000020 52 2f 36 b4 83 2f 40 d7 71 96 4e b7 02 8e db ce R/6../@.q.N..... 00:20:33.913 00000030 98 be 38 60 10 59 57 38 ef c2 b3 a9 24 66 c7 8f ..8`.YW8....$f.. 00:20:33.913 00000040 8f bf 25 7c b2 55 37 3e c2 71 98 51 5c 81 a0 23 ..%|.U7>.q.Q\..# 00:20:33.913 00000050 0c b7 85 c2 33 4e 3c 5f ff 04 c3 21 a8 7b b0 ee ....3N<_...!.{.. 00:20:33.913 00000060 c9 99 f0 b9 d2 6a f7 6d 61 7e 3f 34 85 36 cb 6f .....j.ma~?4.6.o 00:20:33.913 00000070 41 cb b7 55 3c b6 fc 16 03 04 9c ab c7 7f 76 39 A..U<.........v9 00:20:33.913 00000080 bf 53 3d 28 bd a6 22 d6 ca e6 bf 1b 98 99 73 6c .S=(..".......sl 00:20:33.913 00000090 48 47 3c 61 0b de de 47 6b 82 b9 91 61 01 9c a8 HGv$...Mx.d 00:20:33.913 00000130 1c 34 3c d6 9c e6 68 e8 e9 1f 91 24 a5 43 35 88 .4<...h....$.C5. 00:20:33.913 00000140 9d ca 69 d1 a9 67 c0 1e 1e 54 69 18 86 b3 20 b0 ..i..g...Ti... . 00:20:33.913 00000150 53 ae 95 02 7c 3f 7c da 81 67 cc 2f 88 ca 58 4e S...|?|..g./..XN 00:20:33.913 00000160 4f 4d f1 d1 da 3d b5 e7 29 e5 9f 21 cb a3 3f 9b OM...=..)..!..?. 00:20:33.913 00000170 31 ce f4 cc db b7 13 04 57 e6 0b 78 1f 15 3c 69 1.......W..x..Sw 00:20:33.913 000001f0 85 39 34 b2 bc 3f 62 33 9a 18 35 86 da db d2 f2 .94..?b3..5..... 00:20:33.913 dh secret: 00:20:33.913 00000000 5c 14 75 cd 5e 09 ed 3f 4f bd 2d 69 a5 1f 54 3e \.u.^..?O.-i..T> 00:20:33.913 00000010 b6 88 93 90 b8 2d f7 d4 5b b5 60 21 c5 d7 a5 6c .....-..[.`!...l 00:20:33.913 00000020 42 3a d0 63 2a 7c 14 28 a4 94 f8 f7 7b b8 bc f0 B:.c*|.(....{... 00:20:33.913 00000030 cc 45 28 3b d5 19 a1 bc 66 12 35 b2 00 09 d4 ed .E(;....f.5..... 00:20:33.913 00000040 80 9f 0f 59 4e 0f 47 14 80 e7 43 85 fa ff dd 67 ...YN.G...C....g 00:20:33.913 00000050 c4 8b b7 29 0e 74 6f 80 f1 5b c9 db 9b cb 83 39 ...).to..[.....9 00:20:33.913 00000060 46 91 ff 74 a3 f9 ef 15 f0 a7 a4 b4 ef bf 01 ef F..t............ 00:20:33.913 00000070 3e fa ed f6 11 d7 4d 92 46 b1 cf 57 16 36 26 1b >.....M.F..W.6&. 00:20:33.913 00000080 10 79 74 10 d7 57 ff 44 ac c0 df 5d 92 c0 a5 e2 .yt..W.D...].... 
00:20:33.913 00000090 af db c0 54 bb 1a 30 20 c4 87 cb ce ec 67 73 87 ...T..0 .....gs. 00:20:33.914 000000a0 e8 d2 ca f3 76 89 85 21 2f 6b 2d f3 9d 20 20 48 ....v..!/k-.. H 00:20:33.914 000000b0 81 db 48 8f c6 1f 40 0b ea 7e ec 79 ce fc 4d 4b ..H...@..~.y..MK 00:20:33.914 000000c0 57 66 a1 eb e8 3a 50 93 d5 d9 04 aa f9 a9 a0 6b Wf...:P........k 00:20:33.914 000000d0 b3 92 77 94 75 dc 92 9e 48 26 3c 95 fc d5 b7 61 ..w.u...H&<....a 00:20:33.914 000000e0 9c 2b 95 33 4d 56 86 b2 44 8f 68 a3 e0 08 9a 03 .+.3MV..D.h..... 00:20:33.914 000000f0 60 91 97 22 00 1b 23 75 62 ba 9d a6 f2 a0 bd a8 `.."..#ub....... 00:20:33.914 00000100 03 e5 59 63 9e 94 a1 cb 49 17 bb de 94 e7 04 f5 ..Yc....I....... 00:20:33.914 00000110 ad ca b9 ff f1 f7 39 a4 4c 9a 83 68 1d db 59 9f ......9.L..h..Y. 00:20:33.914 00000120 4b 2a 51 8b ca fe 88 59 75 a0 a5 a5 24 58 ed 13 K*Q....Yu...$X.. 00:20:33.914 00000130 72 9e 23 f9 d8 6a da d9 fc 80 d3 26 4b ca c5 0f r.#..j.....&K... 00:20:33.914 00000140 a3 f7 78 5d d2 dc 75 39 7d ff 90 17 01 96 eb c1 ..x]..u9}....... 00:20:33.914 00000150 74 65 74 59 48 06 b3 3e 0d 48 e7 57 86 27 c4 c1 tetYH..>.H.W.'.. 00:20:33.914 00000160 fc 86 5d 2d 6a f6 e0 73 63 3a 2c dd d1 94 a4 c7 ..]-j..sc:,..... 00:20:33.914 00000170 7f 3a 91 31 de f9 c4 02 15 fb 26 af 7c 90 7d a5 .:.1......&.|.}. 00:20:33.914 00000180 46 a8 88 85 20 ac 6a c2 fa ad 61 dd 2a 03 f6 53 F... .j...a.*..S 00:20:33.914 00000190 e9 2d af 37 00 41 04 59 d8 17 f2 27 af ba 13 27 .-.7.A.Y...'...' 00:20:33.914 000001a0 b4 73 fb 66 63 57 2e be 43 e2 d6 8b 51 31 b6 19 .s.fcW..C...Q1.. 00:20:33.914 000001b0 2d a4 37 52 bf 17 fe 36 3f d6 25 7b fb 50 50 4c -.7R...6?.%{.PPL 00:20:33.914 000001c0 0f 16 f1 55 9d d2 e4 d9 d9 b3 ac 3b e4 3b 77 92 ...U.......;.;w. 00:20:33.914 000001d0 27 26 82 9d b5 4b 54 e9 97 dd ae 92 b1 05 9a 66 '&...KT........f 00:20:33.914 000001e0 5c ee ce 97 5f 13 ad c9 07 c4 bd 3e 10 51 05 d2 \..._......>.Q.. 00:20:33.914 000001f0 ec 5c 79 6d 24 c1 a9 1a c5 cb 01 9a 3a 03 2a ba .\ym$.......:.*. 
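The pubkey and dh secret dumps track the size of the negotiated FFDHE group: the ffdhe3072 exchanges end at offset 0x170 (0x180, i.e. 384 bytes) while the ffdhe4096 exchanges, like the one just dumped, end at 0x1f0 (0x200, i.e. 512 bytes), one modulus width in both cases. A quick sanity check, assuming only that the RFC 7919 group name encodes the modulus size in bits:

# Illustrative sketch: expected width of the pubkey / dh secret hexdumps
# above, derived from the RFC 7919 group name (modulus size in bits).
GROUP_BITS = {"ffdhe3072": 3072, "ffdhe4096": 4096}

def expected_dump_bytes(group_name):
    return GROUP_BITS[group_name] // 8

assert expected_dump_bytes("ffdhe3072") == 0x180  # 384 bytes, last row at 0x170
assert expected_dump_bytes("ffdhe4096") == 0x200  # 512 bytes, last row at 0x1f0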
00:20:33.914 [2024-09-27 15:25:17.531444] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=2, dhgroup=3, seq=3428451766, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.914 [2024-09-27 15:25:17.531544] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.914 [2024-09-27 15:25:17.567828] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.914 [2024-09-27 15:25:17.567870] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.914 [2024-09-27 15:25:17.567881] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.914 [2024-09-27 15:25:17.567907] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.914 [2024-09-27 15:25:17.732970] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.914 [2024-09-27 15:25:17.732991] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.914 [2024-09-27 15:25:17.732999] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.914 [2024-09-27 15:25:17.733044] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.914 [2024-09-27 15:25:17.733067] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.914 ctrlr pubkey: 00:20:33.914 00000000 4f 77 de 85 ff a8 ba a6 3c bd 4f b2 8f a6 c6 9c Ow......<.O..... 00:20:33.914 00000010 d2 42 78 ac c5 f6 d7 34 0c df 39 c3 ed 50 7a c8 .Bx....4..9..Pz. 00:20:33.914 00000020 83 8e b0 ad 79 15 d4 31 7b 2a 8f 90 4a f9 cf 74 ....y..1{*..J..t 00:20:33.914 00000030 90 f8 54 c6 2a f4 a8 3a c0 6c 37 55 a3 0b f5 4a ..T.*..:.l7U...J 00:20:33.914 00000040 90 f8 df 27 1a 36 01 e9 3a 05 d6 e4 ed 28 77 68 ...'.6..:....(wh 00:20:33.914 00000050 c3 f8 43 7e 63 7a db fc 0e fc 80 b9 37 54 15 8f ..C~cz......7T.. 00:20:33.914 00000060 5a 9e da 7e 1d ce 51 bd d7 69 b9 c7 7a 4d e4 e4 Z..~..Q..i..zM.. 00:20:33.914 00000070 36 c4 0e 87 fc 6c 96 08 36 90 39 c6 c7 74 0d e3 6....l..6.9..t.. 00:20:33.914 00000080 40 dc 87 a3 12 ca fb 0d 21 62 e5 5a 4b b0 d2 8e @.......!b.ZK... 00:20:33.914 00000090 b4 ae 83 bd b4 c2 67 6d 87 e6 8e 10 0f 1a df 7d ......gm.......} 00:20:33.914 000000a0 a1 0d 79 48 80 90 4d 66 4c c2 af e9 15 36 1e bb ..yH..MfL....6.. 00:20:33.914 000000b0 94 c9 b9 02 31 2b 24 c1 7f 29 a5 83 59 82 f5 f2 ....1+$..)..Y... 00:20:33.914 000000c0 ac aa 3a 66 71 a5 7d 62 ae f9 26 e6 3f ac d8 52 ..:fq.}b..&.?..R 00:20:33.914 000000d0 49 ec 31 14 80 f8 44 0e 39 7d 75 06 11 9e 53 f6 I.1...D.9}u...S. 00:20:33.914 000000e0 3c 67 20 e3 8e a2 cb b1 4e 8a 94 5a 6f f3 5a 57 "[tR.... 00:20:33.914 00000080 e5 4c 87 2c 97 fe 17 50 08 1d d5 49 63 e9 bd 6a .L.,...P...Ic..j 00:20:33.914 00000090 b8 20 8e 63 38 d1 e0 9f f1 93 fd 7f 39 ce 56 e7 . .c8.......9.V. 
00:20:33.914 000000a0 4a d0 f5 82 fa 00 30 16 95 e5 86 69 1e e2 f3 d3 J.....0....i.... 00:20:33.914 000000b0 cb df 13 81 c1 32 62 f0 4c b7 8b bb 30 dd c8 77 .....2b.L...0..w 00:20:33.914 000000c0 f6 cb 41 17 8d 32 12 a3 50 d0 44 16 45 31 93 17 ..A..2..P.D.E1.. 00:20:33.914 000000d0 8f 79 7b ae 57 dd 3d 1f 17 a3 9c a9 c1 92 b2 60 .y{.W.=........` 00:20:33.914 000000e0 20 34 b0 c3 54 37 49 ec 06 40 d3 11 0c 0d b2 8f 4..T7I..@...... 00:20:33.914 000000f0 1a de 67 0a 0a 35 31 51 22 1e 0f 7c 66 f7 69 ee ..g..51Q"..|f.i. 00:20:33.914 00000100 36 85 55 12 f8 1c d6 0b 2d 4f c6 9d 61 35 da a6 6.U.....-O..a5.. 00:20:33.914 00000110 8d 82 cc 36 b7 03 fd e2 6c 60 11 b1 94 ba 13 74 ...6....l`.....t 00:20:33.914 00000120 41 78 e8 3a e5 1f 1a 9a 78 2f b8 11 b2 97 e4 9e Ax.:....x/...... 00:20:33.914 00000130 f3 d0 82 11 29 ca ef 3a 05 84 08 f9 08 3e 9e 3d ....)..:.....>.= 00:20:33.914 00000140 ff 79 aa 65 70 23 72 d5 98 e7 84 27 51 6d 58 32 .y.ep#r....'QmX2 00:20:33.914 00000150 e1 2b b6 4f ae 2e 19 91 58 86 55 09 1b cb b5 ed .+.O....X.U..... 00:20:33.914 00000160 29 fc f3 94 eb bc a1 e0 af 2f 21 70 1a b2 67 4c )......../!p..gL 00:20:33.914 00000170 53 67 47 0c 20 80 cb ba f8 3d cc 65 4f 43 45 e4 SgG. ....=.eOCE. 00:20:33.914 00000180 f9 d9 d7 03 3b 10 eb eb 46 3d b8 d9 f1 87 78 a7 ....;...F=....x. 00:20:33.914 00000190 4f 65 f0 27 86 ed 91 c7 37 2e a8 22 f1 aa 37 2c Oe.'....7.."..7, 00:20:33.914 000001a0 91 19 49 8c 7e a3 7d b8 8f 65 82 7f a7 22 84 47 ..I.~.}..e...".G 00:20:33.914 000001b0 c5 f3 af 62 c4 41 22 c6 f0 ac 36 59 2a f3 3d 6f ...b.A"...6Y*.=o 00:20:33.914 000001c0 53 a7 40 54 bf b3 c8 46 d6 bc 44 7f 14 72 34 aa S.@T...F..D..r4. 00:20:33.914 000001d0 a6 c6 14 14 03 cd 33 bc 9d 84 df 9b 27 6a c4 46 ......3.....'j.F 00:20:33.914 000001e0 46 80 d3 64 a3 83 46 fc 07 ef fd 4a e4 e7 12 15 F..d..F....J.... 00:20:33.914 000001f0 f8 53 f4 35 40 88 4a 9e 72 09 6d 9d b4 bc f5 e4 .S.5@.J.r.m..... 00:20:33.914 dh secret: 00:20:33.914 00000000 98 d9 43 31 60 fe ed 67 d1 a7 c0 09 b4 c2 09 2b ..C1`..g.......+ 00:20:33.914 00000010 ac 08 1f 94 61 c6 e0 c5 a3 51 58 19 7d ce d0 35 ....a....QX.}..5 00:20:33.914 00000020 0b 54 15 57 28 ad 14 0e 10 3e c2 c0 e0 66 52 f7 .T.W(....>...fR. 00:20:33.914 00000030 cc 39 84 19 34 04 02 73 9c c6 f7 39 f1 ad 7d 1f .9..4..s...9..}. 00:20:33.914 00000040 f2 d3 58 55 00 f5 c2 94 65 bf e3 f4 38 97 51 a7 ..XU....e...8.Q. 00:20:33.914 00000050 34 ca 10 63 49 84 6f 9c 42 49 97 1d 27 e4 c3 1c 4..cI.o.BI..'... 00:20:33.914 00000060 bb b2 c9 c8 59 c1 67 bf 6b ab f6 9d 70 81 0c 6f ....Y.g.k...p..o 00:20:33.914 00000070 ad 47 f8 09 91 7d a6 79 90 23 ee dc 19 76 4a 75 .G...}.y.#...vJu 00:20:33.914 00000080 3b f9 40 c5 51 ba 2a 8b e3 48 74 ef ba 89 98 9a ;.@.Q.*..Ht..... 00:20:33.915 00000090 05 b7 88 96 ca 83 ee 42 28 13 7a 14 ca d8 e0 b4 .......B(.z..... 00:20:33.915 000000a0 6f ef 32 42 0e ed ac 45 1a 6a 38 d0 7b 5e e5 f2 o.2B...E.j8.{^.. 00:20:33.915 000000b0 19 59 77 54 34 a9 86 66 99 c5 b0 92 1f 85 7f 6e .YwT4..f.......n 00:20:33.915 000000c0 20 8b 99 ac 82 8b 76 34 3a 16 a8 53 19 28 75 3c .....v4:..S.(u< 00:20:33.915 000000d0 a3 e8 3c 37 92 d0 8a 22 c1 ae 8f 92 9d cc 69 ba ..<7..."......i. 00:20:33.915 000000e0 db d5 d3 3f 91 5b 18 3c 5e ec b5 e5 a8 ac 39 e1 ...?.[.<^.....9. 00:20:33.915 000000f0 ac 91 92 7a 5e 1a 09 f6 b3 8c 8b 2b 27 51 69 af ...z^......+'Qi. 00:20:33.915 00000100 ae c9 78 10 b6 50 10 8b f6 7f ca ad 4d 51 54 64 ..x..P......MQTd 00:20:33.915 00000110 54 fd 2b f9 44 99 0d bc 15 b1 b0 56 e4 5f 38 00 T.+.D......V._8. 
00:20:33.915 00000120 01 7f e5 48 da 20 b0 f7 e2 6c d2 0b 4a 80 92 a7 ...H. ...l..J... 00:20:33.915 00000130 c0 75 94 f4 9c ad 0b 31 de 00 19 ae 72 5f b9 a3 .u.....1....r_.. 00:20:33.915 00000140 c1 79 9b 7d d0 4d 5f 77 66 d0 62 eb 5f e9 5b df .y.}.M_wf.b._.[. 00:20:33.915 00000150 b0 92 c7 ab 69 79 33 e0 28 8e bd 03 b6 4d ad d5 ....iy3.(....M.. 00:20:33.915 00000160 eb ad ac 4d b5 eb b7 6a b8 2b 0d 6d ba 22 48 0c ...M...j.+.m."H. 00:20:33.915 00000170 2a 52 e4 9c 43 18 70 9a ec 55 15 f6 2a 0d a1 b1 *R..C.p..U..*... 00:20:33.915 00000180 b4 1f 51 5c 20 35 59 66 f2 cc 24 77 7f 88 e2 b3 ..Q\ 5Yf..$w.... 00:20:33.915 00000190 cd 0c ea c3 89 b6 e1 55 f8 9e 24 84 7a 7e 4d 44 .......U..$.z~MD 00:20:33.915 000001a0 19 7b 87 3f 31 2a 2f 0d 93 cc 44 d1 0e 74 ae 43 .{.?1*/...D..t.C 00:20:33.915 000001b0 0d c2 37 38 cc e2 9e c5 8e 4d eb 98 70 5d 23 36 ..78.....M..p]#6 00:20:33.915 000001c0 63 ac 1f 57 b4 7c 86 6c 4a cd 0d 53 96 3c 1f dd c..W.|.lJ..S.<.. 00:20:33.915 000001d0 b1 13 1d ee 08 21 c9 d3 b6 c7 dc 2a 1f bb 06 cf .....!.....*.... 00:20:33.915 000001e0 a2 4e 90 23 7a 8e e2 79 42 cc c7 7f 6f 7b 09 1b .N.#z..yB...o{.. 00:20:33.915 000001f0 66 71 43 44 11 99 b1 c3 e9 2c ba a6 48 24 42 2b fqCD.....,..H$B+ 00:20:33.915 [2024-09-27 15:25:17.749024] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=2, dhgroup=3, seq=3428451767, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.915 [2024-09-27 15:25:17.765425] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.915 [2024-09-27 15:25:17.765467] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.915 [2024-09-27 15:25:17.765483] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.915 [2024-09-27 15:25:17.765502] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.915 [2024-09-27 15:25:17.765517] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.915 [2024-09-27 15:25:17.871296] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.915 [2024-09-27 15:25:17.871315] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.915 [2024-09-27 15:25:17.871323] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.915 [2024-09-27 15:25:17.871333] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.915 [2024-09-27 15:25:17.871391] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.915 ctrlr pubkey: 00:20:33.915 00000000 4f 77 de 85 ff a8 ba a6 3c bd 4f b2 8f a6 c6 9c Ow......<.O..... 00:20:33.915 00000010 d2 42 78 ac c5 f6 d7 34 0c df 39 c3 ed 50 7a c8 .Bx....4..9..Pz. 
00:20:33.915 00000020 83 8e b0 ad 79 15 d4 31 7b 2a 8f 90 4a f9 cf 74 ....y..1{*..J..t 00:20:33.915 00000030 90 f8 54 c6 2a f4 a8 3a c0 6c 37 55 a3 0b f5 4a ..T.*..:.l7U...J 00:20:33.915 00000040 90 f8 df 27 1a 36 01 e9 3a 05 d6 e4 ed 28 77 68 ...'.6..:....(wh 00:20:33.915 00000050 c3 f8 43 7e 63 7a db fc 0e fc 80 b9 37 54 15 8f ..C~cz......7T.. 00:20:33.915 00000060 5a 9e da 7e 1d ce 51 bd d7 69 b9 c7 7a 4d e4 e4 Z..~..Q..i..zM.. 00:20:33.915 00000070 36 c4 0e 87 fc 6c 96 08 36 90 39 c6 c7 74 0d e3 6....l..6.9..t.. 00:20:33.915 00000080 40 dc 87 a3 12 ca fb 0d 21 62 e5 5a 4b b0 d2 8e @.......!b.ZK... 00:20:33.915 00000090 b4 ae 83 bd b4 c2 67 6d 87 e6 8e 10 0f 1a df 7d ......gm.......} 00:20:33.915 000000a0 a1 0d 79 48 80 90 4d 66 4c c2 af e9 15 36 1e bb ..yH..MfL....6.. 00:20:33.915 000000b0 94 c9 b9 02 31 2b 24 c1 7f 29 a5 83 59 82 f5 f2 ....1+$..)..Y... 00:20:33.915 000000c0 ac aa 3a 66 71 a5 7d 62 ae f9 26 e6 3f ac d8 52 ..:fq.}b..&.?..R 00:20:33.915 000000d0 49 ec 31 14 80 f8 44 0e 39 7d 75 06 11 9e 53 f6 I.1...D.9}u...S. 00:20:33.915 000000e0 3c 67 20 e3 8e a2 cb b1 4e 8a 94 5a 6f f3 5a 57 3.Op.$.kE*4 00:20:33.915 00000070 02 5e 8c 52 02 f9 2c 86 e4 7d 7b 54 69 37 48 b1 .^.R..,..}{Ti7H. 00:20:33.915 00000080 75 c1 2d ca 2b da cd ce a6 5c cb 5a 89 ad 17 f4 u.-.+....\.Z.... 00:20:33.915 00000090 42 a6 dd 20 5d 93 4b 00 9e 83 2a bc 5e f6 0e ce B.. ].K...*.^... 00:20:33.915 000000a0 ea a1 22 35 e3 f2 1f 4c bd 72 88 36 ca 97 46 a3 .."5...L.r.6..F. 00:20:33.915 000000b0 b1 88 e5 30 63 3e 92 52 bd 9e 10 1b 9e 4b b8 9c ...0c>.R.....K.. 00:20:33.915 000000c0 9d fd f1 f6 3d 05 d9 5b 14 74 95 3b e6 a7 0f 03 ....=..[.t.;.... 00:20:33.915 000000d0 be 45 46 fb 51 10 10 ab 88 2f 4e 90 db cf 59 bb .EF.Q..../N...Y. 00:20:33.915 000000e0 72 14 17 b7 8f fa 81 b6 ee cf 02 f7 7a 11 0d 6c r...........z..l 00:20:33.915 000000f0 d8 f6 49 df 15 4e b3 b5 0d 5e 22 46 ee a5 03 ff ..I..N...^"F.... 00:20:33.915 00000100 5b 3f 81 01 f5 0f 80 aa 3d b7 d6 36 a5 71 0c df [?......=..6.q.. 00:20:33.915 00000110 28 4b 0a 75 9a 64 7a 51 46 29 c0 ee b4 b6 1b 06 (K.u.dzQF)...... 00:20:33.915 00000120 d5 c7 2a 07 7f 65 1b 60 29 20 a1 6e 9b f6 f7 6c ..*..e.`) .n...l 00:20:33.915 00000130 7e f5 23 49 f3 91 bf c4 a7 c2 a6 24 0e e6 96 9f ~.#I.......$.... 00:20:33.915 00000140 f0 1d ce 4e 89 ce 28 d6 f9 43 9f c4 98 4d 91 bb ...N..(..C...M.. 00:20:33.915 00000150 9c 06 4d aa f3 af 7b f8 8f c2 c0 74 88 9b 15 6f ..M...{....t...o 00:20:33.915 00000160 e5 74 00 9e c9 85 4e 67 31 2d 38 5f 04 42 56 67 .t....Ng1-8_.BVg 00:20:33.915 00000170 8f ee 32 e7 fc 10 95 47 0f fb 28 b4 0b 5d 2a c3 ..2....G..(..]*. 00:20:33.915 00000180 8d fb 39 23 e0 d3 f0 9e d8 1b d1 73 05 9d 42 df ..9#.......s..B. 00:20:33.915 00000190 5a 0a 0f 45 31 60 bb 42 ae 4b 27 6a ca 19 f1 c4 Z..E1`.B.K'j.... 00:20:33.915 000001a0 20 0d f3 2c db c8 08 d2 62 67 09 70 88 5d a4 7a ..,....bg.p.].z 00:20:33.915 000001b0 75 8b 2c 07 5d b2 34 dd b6 a5 1b 09 c2 56 0d 19 u.,.].4......V.. 00:20:33.915 000001c0 86 3a 34 27 4b ca c2 d8 18 3b 96 4e b0 3b 2d 97 .:4'K....;.N.;-. 00:20:33.915 000001d0 3a a0 95 b7 af c8 4f 62 10 12 25 9d 06 69 22 59 :.....Ob..%..i"Y 00:20:33.915 000001e0 1b c3 d9 2f 0a a5 83 88 ee 23 48 f2 63 ba 14 c6 .../.....#H.c... 00:20:33.915 000001f0 69 4b ca be de 71 00 0f a7 68 ea c8 2d d3 f9 f6 iK...q...h..-... 00:20:33.915 dh secret: 00:20:33.915 00000000 1c b8 9e 32 3b ae 88 df fe ea 96 90 52 96 74 c6 ...2;.......R.t. 00:20:33.915 00000010 27 91 b9 ff dd 81 86 9a 3b 82 09 98 15 4c 95 c4 '.......;....L.. 
00:20:33.915 00000020 ad cd 7e 49 5e 66 55 4a 82 00 2e 49 9d c6 dd 2e ..~I^fUJ...I.... 00:20:33.915 00000030 18 2a fb fc 9d 9c 21 7d d1 52 e7 d5 8f 86 08 e6 .*....!}.R...... 00:20:33.915 00000040 bc 8c 99 86 e8 22 15 01 12 b1 5c 9e 20 78 5d 0f ....."....\. x]. 00:20:33.915 00000050 43 b1 57 89 5d e6 ec 18 d6 64 8a b5 d7 4f e0 bf C.W.]....d...O.. 00:20:33.915 00000060 94 ba 2f 0b aa 21 a2 97 c2 e2 e2 86 7c 07 44 4f ../..!......|.DO 00:20:33.915 00000070 19 ac ef 73 0a 71 68 d9 4b 7c 15 28 f5 80 df 0b ...s.qh.K|.(.... 00:20:33.915 00000080 eb 0f 54 a1 f5 ee cb 9e 90 0c 0b 73 fd 69 41 50 ..T........s.iAP 00:20:33.915 00000090 b7 e5 96 8f 42 b2 06 e8 a3 b2 7e c1 05 85 92 e7 ....B.....~..... 00:20:33.915 000000a0 24 e2 54 76 12 84 8e f6 5b b0 ff f3 cf 6f 6b 4e $.Tv....[....okN 00:20:33.915 000000b0 1d 87 dd 7b 29 ca 6f 6b 6a 2b ff f0 61 e2 e5 c1 ...{).okj+..a... 00:20:33.915 000000c0 1e 52 7b ac 4b 5e 67 2e 62 d4 8a 72 a9 37 10 15 .R{.K^g.b..r.7.. 00:20:33.915 000000d0 39 82 fa 52 26 ce a4 4d ab 06 14 06 37 e0 ab 5d 9..R&..M....7..] 00:20:33.915 000000e0 f2 e2 95 86 fd 7c a0 3d 8a 6c 1d ab 31 a7 ab ac .....|.=.l..1... 00:20:33.915 000000f0 cd 1a 8a b5 1b a7 46 22 1a fc 0b f6 fa 9f 5a 05 ......F"......Z. 00:20:33.915 00000100 8b 28 c2 78 dc f9 d7 2c b5 05 53 4f b9 95 1c ba .(.x...,..SO.... 00:20:33.915 00000110 ee bc 45 47 08 7b 64 36 6c 70 2d 03 e1 c2 96 b7 ..EG.{d6lp-..... 00:20:33.915 00000120 c4 fc b1 b9 48 7a 25 93 fb 7f 96 15 b1 98 d1 39 ....Hz%........9 00:20:33.915 00000130 1f a8 e6 4f e0 9e 64 54 86 1c 34 82 92 70 1a 05 ...O..dT..4..p.. 00:20:33.915 00000140 78 20 fb 90 f4 a6 91 3c 99 e1 25 40 1f 26 b1 fe x .....<..%@.&.. 00:20:33.915 00000150 7f e0 e2 ba bb 85 ac 82 0c 1c bf c2 7d b8 2d 74 ............}.-t 00:20:33.916 00000160 0f af f8 1c 6e 31 78 c6 f1 59 f4 aa ab 1e ad bd ....n1x..Y...... 00:20:33.916 00000170 cc 35 83 6f d4 b8 d2 68 d9 00 2d ea ee aa d7 b5 .5.o...h..-..... 00:20:33.916 00000180 1b 08 bc 7d a0 cd 9c 72 d9 8b d4 5f 2d d0 fa 76 ...}...r..._-..v 00:20:33.916 00000190 89 7c cc 6c 61 de 77 f5 4d 1b f2 6d f7 a7 90 cc .|.la.w.M..m.... 00:20:33.916 000001a0 fe 12 08 e2 a2 3d 46 35 d0 f8 6d 08 15 60 a7 45 .....=F5..m..`.E 00:20:33.916 000001b0 f2 cd 58 65 e6 1f 16 3f 6b 1b c8 de b8 fd 44 8e ..Xe...?k.....D. 00:20:33.916 000001c0 dc e7 26 5f 12 4b 9d de c3 e6 58 19 ed 1d c4 a8 ..&_.K....X..... 00:20:33.916 000001d0 a3 eb 8d 9c f7 bf 78 00 b6 1a f6 6f 05 05 d6 f0 ......x....o.... 00:20:33.916 000001e0 13 42 c1 a1 f2 82 c0 b1 3d b6 f5 0b 93 63 c1 0c .B......=....c.. 00:20:33.916 000001f0 b2 f1 1b f3 af 43 19 89 58 8b 7a bb f1 a2 80 d3 .....C..X.z..... 
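Every text record in this section shares the same shape: a timestamp, the source location in nvme_auth.c, the function name, the *DEBUG* level, a bracketed subnqn:hostnqn:qid triple, and the message. The following is a best-effort parsing sketch; the layout is read off the log itself, not taken from any SPDK interface, and the sample line is shortened.

# Illustrative sketch: best-effort parse of one DEBUG record from this log.
import re

LINE = re.compile(
    r"^\[(?P<ts>[^\]]+)\]\s+(?P<src>\S+:\s*\d+):(?P<func>\w+):\s+\*DEBUG\*:\s+"
    r"\[(?P<qpair>[^\]]+)\]\s*(?P<msg>.*)$"
)

def parse_debug(line):
    m = LINE.match(line)
    if not m:
        return None
    # The bracketed triple is "<subnqn>:<hostnqn>:<qid>"; only the trailing
    # qid can be split off safely, since the NQNs themselves contain colons.
    qpair, qid = m.group("qpair").rsplit(":", 1)
    return {"ts": m.group("ts"), "src": m.group("src"),
            "func": m.group("func"), "qid": int(qid), "msg": m.group("msg")}

sample = ("[2024-09-27 15:25:17.887224] nvme_auth.c: 950:nvme_auth_send_reply: "
          "*DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] "
          "key=key1, hash=2, dhgroup=3, seq=3428451768, tid=1, len=48")
record = parse_debug(sample)
print(record["func"], record["qid"])  # nvme_auth_send_reply 1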
00:20:33.916 [2024-09-27 15:25:17.887224] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=2, dhgroup=3, seq=3428451768, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.916 [2024-09-27 15:25:17.887319] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.916 [2024-09-27 15:25:17.921246] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.916 [2024-09-27 15:25:17.921287] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.916 [2024-09-27 15:25:17.921298] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.916 [2024-09-27 15:25:17.921327] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.916 [2024-09-27 15:25:18.079151] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.916 [2024-09-27 15:25:18.079171] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.916 [2024-09-27 15:25:18.079178] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.916 [2024-09-27 15:25:18.079223] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.916 [2024-09-27 15:25:18.079246] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.916 ctrlr pubkey: 00:20:33.916 00000000 a6 35 5b 34 38 89 eb 52 fc 3a 81 6d b2 cf 43 14 .5[48..R.:.m..C. 00:20:33.916 00000010 55 03 da e3 5c b0 85 db ff 51 19 99 8e c2 69 2d U...\....Q....i- 00:20:33.916 00000020 7c c3 ee d8 ba 6d 63 17 f1 fc 26 90 5f 9d a9 f2 |....mc...&._... 00:20:33.916 00000030 7e 1e 3c 36 a2 29 67 e8 9b 70 95 64 99 fc 00 38 ~.<6.)g..p.d...8 00:20:33.916 00000040 3f 17 0d 3f d5 86 73 79 0c 3d c5 22 51 35 bd 7a ?..?..sy.=."Q5.z 00:20:33.916 00000050 19 86 4e e3 3c 11 8e 65 9a 5e 19 6b 4d 33 7f 20 ..N.<..e.^.kM3. 00:20:33.916 00000060 9b 09 d0 71 ff f9 32 73 fe fc 89 d5 15 d1 be 54 ...q..2s.......T 00:20:33.916 00000070 f6 05 72 fe 27 a5 d2 cb ef c2 a9 32 d6 ad 11 9d ..r.'......2.... 00:20:33.916 00000080 92 4d e7 e8 6c 84 3d 8c 0d 89 96 b4 08 cc bb 15 .M..l.=......... 00:20:33.916 00000090 61 38 b5 aa d8 b0 8b 1a 13 0c ef 41 3d 04 b1 79 a8.........A=..y 00:20:33.916 000000a0 b0 8b 6a e3 78 77 bf 55 ad 16 72 c4 12 3f 32 f6 ..j.xw.U..r..?2. 00:20:33.916 000000b0 43 99 03 bf 7f 75 00 0e e7 1a de 63 03 ed e2 46 C....u.....c...F 00:20:33.916 000000c0 60 aa c7 e6 a2 c2 d6 4b 82 ea 66 dc f4 7d 04 9f `......K..f..}.. 00:20:33.916 000000d0 1a 86 b0 9a 37 55 98 8f 90 43 c9 37 41 ac 66 0a ....7U...C.7A.f. 00:20:33.916 000000e0 e1 5c 48 98 04 57 c1 78 d6 83 9c 28 c4 02 11 4f .\H..W.x...(...O 00:20:33.916 000000f0 ed 6d 91 bb 81 75 b4 c1 a3 69 14 07 bd c4 96 00 .m...u...i...... 
00:20:33.916 00000100 f6 f6 1a 16 dc ba 10 c2 98 58 df 41 76 b0 64 62 .........X.Av.db 00:20:33.916 00000110 50 31 14 6f cf 8b b6 df 7a b8 cd 04 9a d6 cb 89 P1.o....z....... 00:20:33.916 00000120 4b 7d ab 37 29 74 40 12 8a 1a 2c fb c2 09 a3 0c K}.7)t@...,..... 00:20:33.916 00000130 3c 33 68 3b 8b 9a 3a 8a c2 0a 1a cd 53 6a ec b0 <3h;..:.....Sj.. 00:20:33.916 00000140 ff 87 c9 8f 88 0f 14 82 a7 61 72 73 b7 d7 4f 72 .........ars..Or 00:20:33.916 00000150 7b 39 85 00 c2 ac 05 e0 42 5b af c0 8d f6 af 71 {9......B[.....q 00:20:33.916 00000160 02 7b d6 54 87 23 a0 d3 55 a3 1a 3b e7 64 42 e4 .{.T.#..U..;.dB. 00:20:33.916 00000170 86 73 20 ce d8 f5 4b 4f 1b ba db 95 59 81 09 db .s ...KO....Y... 00:20:33.916 00000180 d2 91 f6 6d 07 7c db e4 2e cf ef b5 0b 7c c8 39 ...m.|.......|.9 00:20:33.916 00000190 66 93 3b 3b 1e a0 76 cb e6 06 0f c1 3e e8 59 82 f.;;..v.....>.Y. 00:20:33.916 000001a0 e4 80 55 b4 c6 97 87 ac 99 f5 1c 51 71 c3 33 45 ..U........Qq.3E 00:20:33.916 000001b0 0a 57 8d 9c 96 78 4b 12 a3 8a 02 d6 dc 68 6c 2e .W...xK......hl. 00:20:33.916 000001c0 ed 31 a3 c3 4d 68 18 55 e6 42 db 06 e0 15 ed a1 .1..Mh.U.B...... 00:20:33.916 000001d0 9c d0 e1 8d ad ad 90 d4 53 f4 19 4e 88 3e 9b 5b ........S..N.>.[ 00:20:33.916 000001e0 d9 9f 8f c1 76 67 5d 00 7e 94 e3 8c 9a ec 7b 01 ....vg].~.....{. 00:20:33.916 000001f0 d0 1a 48 bf 0a e7 3e ef d9 ec cd 55 77 43 2c c0 ..H...>....UwC,. 00:20:33.916 host pubkey: 00:20:33.916 00000000 c6 cb 49 27 30 5b 18 3a cd 2b 3f ea 58 b1 a4 51 ..I'0[.:.+?.X..Q 00:20:33.916 00000010 04 16 26 11 03 b8 1f 58 86 86 b0 3e 47 38 53 bb ..&....X...>G8S. 00:20:33.916 00000020 1c 6c 92 40 0e 7e 3e b9 2b 8c 7a 0b f6 c0 db e7 .l.@.~>.+.z..... 00:20:33.916 00000030 49 93 a5 6d c7 54 ab bf fa 9e 39 5c 5b 2b 66 a9 I..m.T....9\[+f. 00:20:33.916 00000040 5a b5 8b 14 01 ba 63 46 2a 7b 72 b1 86 33 25 0c Z.....cF*{r..3%. 00:20:33.916 00000050 fa 18 29 e1 79 b5 27 92 13 a0 21 71 41 62 27 c4 ..).y.'...!qAb'. 00:20:33.916 00000060 9e 38 f9 27 a6 d3 8b 13 de 95 d9 d4 96 02 6a 3d .8.'..........j= 00:20:33.916 00000070 7e 13 0d 73 0c 96 10 da 43 8f 83 a1 e5 3d 05 04 ~..s....C....=.. 00:20:33.916 00000080 5f 93 c4 d9 65 c2 a4 9c bf 00 5c 0e 08 7b 09 7e _...e.....\..{.~ 00:20:33.916 00000090 cd ce 8b de 05 0a 7f 12 62 df 5b a9 92 16 5e 9e ........b.[...^. 00:20:33.916 000000a0 2b 7e e7 45 3b b7 64 c8 23 bf bf ac 29 ed 7d 46 +~.E;.d.#...).}F 00:20:33.916 000000b0 92 09 94 09 d0 d6 51 60 8b a4 41 df 5e 91 61 46 ......Q`..A.^.aF 00:20:33.916 000000c0 fa 91 f5 bd 68 78 82 88 fd 9e b8 ed 48 bb 16 b2 ....hx......H... 00:20:33.916 000000d0 59 8f e9 84 90 ee 24 ec 51 97 72 ef 7e ff dd 54 Y.....$.Q.r.~..T 00:20:33.916 000000e0 74 31 29 f9 90 00 cd d1 fe fe 73 cd 0f ff 16 36 t1).......s....6 00:20:33.916 000000f0 67 09 17 b6 6a bc 01 cb 6d f5 23 f4 58 e2 da 4b g...j...m.#.X..K 00:20:33.916 00000100 8b 46 03 b5 e9 86 65 31 f0 03 fb 92 84 ff 4d 46 .F....e1......MF 00:20:33.916 00000110 5b 34 a8 1c bb e2 e1 b8 24 17 21 69 2d e3 63 81 [4......$.!i-.c. 00:20:33.916 00000120 ad 2d ff db c0 72 74 2d a7 52 44 76 09 2b 3b 7e .-...rt-.RDv.+;~ 00:20:33.916 00000130 82 67 2c ca af a6 d9 44 70 0b 4f 2b e3 e0 69 f6 .g,....Dp.O+..i. 00:20:33.916 00000140 82 c0 be 5b 04 28 3c b5 11 1a 41 f5 31 03 dc 6a ...[.(<...A.1..j 00:20:33.916 00000150 59 ae ff d6 8d 46 c7 42 a6 52 eb 20 15 7c b8 8c Y....F.B.R. .|.. 00:20:33.916 00000160 cb 3f 67 e7 99 f6 a5 cb cb 92 4c bd 6b 47 71 72 .?g.......L.kGqr 00:20:33.916 00000170 b4 d1 57 8a 3a 9c 60 75 ef 99 4f f8 4a 4a 0e 08 ..W.:.`u..O.JJ.. 
00:20:33.916 00000180 6b 4e ee 1e 3d 16 ba 32 a6 d5 59 68 dc bb b7 5a kN..=..2..Yh...Z 00:20:33.916 00000190 d3 d7 99 b8 22 1c 2f cc 82 1f dc 0c 02 a8 25 1f ...."./.......%. 00:20:33.916 000001a0 50 4d a6 3e 63 93 d4 62 ac a1 51 b9 90 0e 22 2d PM.>c..b..Q..."- 00:20:33.916 000001b0 be c7 05 dc 9a 03 99 ec 96 47 da 7e 18 52 c0 a1 .........G.~.R.. 00:20:33.916 000001c0 06 ff 34 09 0c 5e 9e c8 14 d2 79 b1 54 fe 3a 9b ..4..^....y.T.:. 00:20:33.916 000001d0 60 e1 25 88 59 68 e5 a1 9e fb fa 44 05 e0 c1 41 `.%.Yh.....D...A 00:20:33.916 000001e0 d1 cd dd c4 76 56 a8 06 df 70 05 92 04 c4 37 af ....vV...p....7. 00:20:33.916 000001f0 c0 36 4f 96 c5 f0 f5 4b ef 33 2a b1 8a a8 d1 2b .6O....K.3*....+ 00:20:33.916 dh secret: 00:20:33.916 00000000 e6 cc a1 b8 73 1c 21 e2 05 c1 a2 25 97 1f 54 61 ....s.!....%..Ta 00:20:33.916 00000010 9f 43 40 49 ab de 53 d3 b7 10 1b 32 b7 66 45 3e .C@I..S....2.fE> 00:20:33.916 00000020 15 76 fc ac 26 ca f9 33 70 ed ef 3a d7 15 6e 49 .v..&..3p..:..nI 00:20:33.916 00000030 71 17 ed 51 f0 5f 13 63 22 09 76 cd 5c 0e 5a d0 q..Q._.c".v.\.Z. 00:20:33.916 00000040 35 1e 0e de b5 f8 e3 de ac 82 8d 4b 2c 17 8c ec 5..........K,... 00:20:33.916 00000050 02 20 c9 f1 9b cd b2 3e 3f 87 48 28 10 fe a4 c8 . .....>?.H(.... 00:20:33.916 00000060 b5 61 66 33 f9 0f 53 2f 83 cc d3 e4 1b eb 18 ac .af3..S/........ 00:20:33.916 00000070 dd d4 84 72 0e 91 d0 c8 05 33 8b df a9 c3 60 92 ...r.....3....`. 00:20:33.916 00000080 68 c5 d2 4c dc 06 cd b8 84 30 df 89 bb 35 46 1f h..L.....0...5F. 00:20:33.916 00000090 47 43 de a3 70 2b 42 01 f0 ca e9 39 c8 00 32 83 GC..p+B....9..2. 00:20:33.916 000000a0 05 ca b2 be 35 a3 aa bd 43 c4 8c 37 f3 89 fa b7 ....5...C..7.... 00:20:33.916 000000b0 ea a8 5e 93 54 3f fb 49 7a 23 17 69 13 2d 25 2f ..^.T?.Iz#.i.-%/ 00:20:33.916 000000c0 01 45 9d e1 5f 3e d5 e0 3d d0 5e ab fc fd c9 e1 .E.._>..=.^..... 00:20:33.916 000000d0 f0 7a 4c e5 1f 32 7b 6c b0 f9 53 fd b3 ea 1e 6d .zL..2{l..S....m 00:20:33.916 000000e0 fe e9 88 0f b7 59 3c 94 be 75 f5 7c 33 4e 3c 24 .....Y<..u.|3N<$ 00:20:33.916 000000f0 1b 19 0d c5 01 88 ce e4 75 d8 3b a9 36 fe c7 9d ........u.;.6... 00:20:33.916 00000100 29 81 4e 89 5d 23 ee 5c 40 cc cc 0b df 09 ee 4c ).N.]#.\@......L 00:20:33.916 00000110 0b fc da a3 cf de 54 00 c1 d6 ec 51 a0 2c 30 d0 ......T....Q.,0. 00:20:33.916 00000120 57 ef 95 e8 98 07 30 e5 ff 45 ca 23 79 0a d3 82 W.....0..E.#y... 00:20:33.916 00000130 68 32 9a fa 7d ea 33 f8 9a 5e 07 3e bf 07 a9 6b h2..}.3..^.>...k 00:20:33.916 00000140 e5 82 c6 44 e7 a1 a0 5e 29 77 3c 30 2a d4 70 b4 ...D...^)w<0*.p. 00:20:33.916 00000150 49 6a a8 f7 7f cd 74 db 2f a9 84 74 b6 dd ef a4 Ij....t./..t.... 00:20:33.916 00000160 50 d3 1b cd 89 83 67 88 5e 66 c2 45 b7 b8 04 d7 P.....g.^f.E.... 00:20:33.916 00000170 5f f9 53 cb 21 02 ee be 8a d9 d3 e1 82 14 4d 85 _.S.!.........M. 00:20:33.916 00000180 0a 9a 22 60 85 2e 01 9a bb e9 0a 59 ae 46 97 a6 .."`.......Y.F.. 00:20:33.916 00000190 df 17 9d 3a d1 0e 27 2b 1d e7 83 30 c0 63 ed 16 ...:..'+...0.c.. 00:20:33.916 000001a0 f5 4d 0c 21 b4 b2 de 7c 25 b2 06 3c 64 50 ce 67 .M.!...|%..d.Q.Q.*BO1.h.f 00:20:33.916 000001c0 40 ce 22 f9 56 0f 21 f2 98 67 2e cf f0 49 5c df @.".V.!..g...I\. 00:20:33.916 000001d0 09 fd 43 d0 99 eb 25 cc 94 2f 9a 33 f6 75 49 e8 ..C...%../.3.uI. 00:20:33.916 000001e0 41 51 86 49 8b 23 9e 32 28 d5 90 70 b8 56 39 46 AQ.I.#.2(..p.V9F 00:20:33.916 000001f0 20 9e 4d e4 12 5e 3d de 2f d2 fc b2 f9 99 9c 2e .M..^=./....... 
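The hexdumps themselves are straightforward to post-process, and a few rows in this capture were truncated, which shows up as the byte offset jumping instead of advancing row by row. Below is a best-effort sketch that parses the offset and hex-byte columns and reports such gaps; it is illustrative only, and the sample rows are made up rather than key material from this run.

# Illustrative sketch: parse hexdump rows like the ones above (optional
# elapsed-time prefix, 8-digit offset, up to 16 hex bytes, ASCII column)
# and report whether any row's offset fails to pick up where the previous
# row ended.
import re

ROW = re.compile(
    r"^(?:\d{2}:\d{2}:\d{2}\.\d+ )?([0-9a-fA-F]{8})((?: [0-9a-fA-F]{2}){1,16})"
)

def rows_to_offsets(rows):
    """Yield (offset, byte_count) for every row the pattern recognises."""
    for row in rows:
        m = ROW.match(row.strip())
        if m:
            yield int(m.group(1), 16), len(m.group(2).split())

def has_gaps(rows):
    prev_end = None
    for offset, count in rows_to_offsets(rows):
        if prev_end is not None and offset != prev_end:
            return True
        prev_end = offset + count
    return False

# Made-up rows: the second offset skips ahead, so a gap is reported.
print(has_gaps(["00000000 cc 61 d0 27", "00000020 9c 8e cf 90"]))  # True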
00:20:33.916 [2024-09-27 15:25:18.095229] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=2, dhgroup=3, seq=3428451769, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.916 [2024-09-27 15:25:18.112290] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.917 [2024-09-27 15:25:18.112329] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.917 [2024-09-27 15:25:18.112350] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.917 [2024-09-27 15:25:18.112371] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.917 [2024-09-27 15:25:18.112385] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.917 [2024-09-27 15:25:18.218403] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.917 [2024-09-27 15:25:18.218450] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.917 [2024-09-27 15:25:18.218474] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.917 [2024-09-27 15:25:18.218506] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.917 [2024-09-27 15:25:18.218575] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.917 ctrlr pubkey: 00:20:33.917 00000000 a6 35 5b 34 38 89 eb 52 fc 3a 81 6d b2 cf 43 14 .5[48..R.:.m..C. 00:20:33.917 00000010 55 03 da e3 5c b0 85 db ff 51 19 99 8e c2 69 2d U...\....Q....i- 00:20:33.917 00000020 7c c3 ee d8 ba 6d 63 17 f1 fc 26 90 5f 9d a9 f2 |....mc...&._... 00:20:33.917 00000030 7e 1e 3c 36 a2 29 67 e8 9b 70 95 64 99 fc 00 38 ~.<6.)g..p.d...8 00:20:33.917 00000040 3f 17 0d 3f d5 86 73 79 0c 3d c5 22 51 35 bd 7a ?..?..sy.=."Q5.z 00:20:33.917 00000050 19 86 4e e3 3c 11 8e 65 9a 5e 19 6b 4d 33 7f 20 ..N.<..e.^.kM3. 00:20:33.917 00000060 9b 09 d0 71 ff f9 32 73 fe fc 89 d5 15 d1 be 54 ...q..2s.......T 00:20:33.917 00000070 f6 05 72 fe 27 a5 d2 cb ef c2 a9 32 d6 ad 11 9d ..r.'......2.... 00:20:33.917 00000080 92 4d e7 e8 6c 84 3d 8c 0d 89 96 b4 08 cc bb 15 .M..l.=......... 00:20:33.917 00000090 61 38 b5 aa d8 b0 8b 1a 13 0c ef 41 3d 04 b1 79 a8.........A=..y 00:20:33.917 000000a0 b0 8b 6a e3 78 77 bf 55 ad 16 72 c4 12 3f 32 f6 ..j.xw.U..r..?2. 00:20:33.917 000000b0 43 99 03 bf 7f 75 00 0e e7 1a de 63 03 ed e2 46 C....u.....c...F 00:20:33.917 000000c0 60 aa c7 e6 a2 c2 d6 4b 82 ea 66 dc f4 7d 04 9f `......K..f..}.. 00:20:33.917 000000d0 1a 86 b0 9a 37 55 98 8f 90 43 c9 37 41 ac 66 0a ....7U...C.7A.f. 00:20:33.917 000000e0 e1 5c 48 98 04 57 c1 78 d6 83 9c 28 c4 02 11 4f .\H..W.x...(...O 00:20:33.917 000000f0 ed 6d 91 bb 81 75 b4 c1 a3 69 14 07 bd c4 96 00 .m...u...i...... 
00:20:33.917 00000100 f6 f6 1a 16 dc ba 10 c2 98 58 df 41 76 b0 64 62 .........X.Av.db 00:20:33.917 00000110 50 31 14 6f cf 8b b6 df 7a b8 cd 04 9a d6 cb 89 P1.o....z....... 00:20:33.917 00000120 4b 7d ab 37 29 74 40 12 8a 1a 2c fb c2 09 a3 0c K}.7)t@...,..... 00:20:33.917 00000130 3c 33 68 3b 8b 9a 3a 8a c2 0a 1a cd 53 6a ec b0 <3h;..:.....Sj.. 00:20:33.917 00000140 ff 87 c9 8f 88 0f 14 82 a7 61 72 73 b7 d7 4f 72 .........ars..Or 00:20:33.917 00000150 7b 39 85 00 c2 ac 05 e0 42 5b af c0 8d f6 af 71 {9......B[.....q 00:20:33.917 00000160 02 7b d6 54 87 23 a0 d3 55 a3 1a 3b e7 64 42 e4 .{.T.#..U..;.dB. 00:20:33.917 00000170 86 73 20 ce d8 f5 4b 4f 1b ba db 95 59 81 09 db .s ...KO....Y... 00:20:33.917 00000180 d2 91 f6 6d 07 7c db e4 2e cf ef b5 0b 7c c8 39 ...m.|.......|.9 00:20:33.917 00000190 66 93 3b 3b 1e a0 76 cb e6 06 0f c1 3e e8 59 82 f.;;..v.....>.Y. 00:20:33.917 000001a0 e4 80 55 b4 c6 97 87 ac 99 f5 1c 51 71 c3 33 45 ..U........Qq.3E 00:20:33.917 000001b0 0a 57 8d 9c 96 78 4b 12 a3 8a 02 d6 dc 68 6c 2e .W...xK......hl. 00:20:33.917 000001c0 ed 31 a3 c3 4d 68 18 55 e6 42 db 06 e0 15 ed a1 .1..Mh.U.B...... 00:20:33.917 000001d0 9c d0 e1 8d ad ad 90 d4 53 f4 19 4e 88 3e 9b 5b ........S..N.>.[ 00:20:33.917 000001e0 d9 9f 8f c1 76 67 5d 00 7e 94 e3 8c 9a ec 7b 01 ....vg].~.....{. 00:20:33.917 000001f0 d0 1a 48 bf 0a e7 3e ef d9 ec cd 55 77 43 2c c0 ..H...>....UwC,. 00:20:33.917 host pubkey: 00:20:33.917 00000000 92 fe e5 8c 98 80 1a 11 7e c4 62 37 aa 3d 9f 79 ........~.b7.=.y 00:20:33.917 00000010 2d bf b4 01 75 33 10 ba 97 a1 76 58 ab c9 49 a2 -...u3....vX..I. 00:20:33.917 00000020 45 dd e3 5b 82 49 f2 f5 fb 41 f9 0e 94 fe 63 0c E..[.I...A....c. 00:20:33.917 00000030 ba 3f 5f 28 9e 73 72 1c a9 67 34 42 91 53 86 79 .?_(.sr..g4B.S.y 00:20:33.917 00000040 b4 7a 6f 28 9f ad 46 43 4a ff 61 81 8d 9b eb 18 .zo(..FCJ.a..... 00:20:33.917 00000050 51 50 db 2a 84 0c fb 85 d1 28 99 11 69 96 be 23 QP.*.....(..i..# 00:20:33.917 00000060 b8 28 e3 d2 64 5d d2 c9 06 73 53 cd 29 f9 cf b1 .(..d]...sS.)... 00:20:33.917 00000070 ed 5f c3 b4 c1 b0 ac 08 1b d2 9a e3 62 57 2e 08 ._..........bW.. 00:20:33.917 00000080 c3 61 29 4f a3 c3 88 50 d3 67 4c b7 29 2a 07 47 .a)O...P.gL.)*.G 00:20:33.917 00000090 d0 25 ba 52 7e 46 68 49 60 c9 30 84 83 69 ba fa .%.R~FhI`.0..i.. 00:20:33.917 000000a0 05 83 4f 29 45 4e 3a ea bd b8 9a 7c 44 76 36 cb ..O)EN:....|Dv6. 00:20:33.917 000000b0 e5 88 ef 89 6c 66 ac 2e 48 9e cc 06 d9 57 cc 3d ....lf..H....W.= 00:20:33.917 000000c0 3b 5f 19 91 0e 1b b4 b2 e6 8e be f6 bb 9e aa ee ;_.............. 00:20:33.917 000000d0 a8 e6 18 1b 43 a3 b8 b4 01 82 23 26 41 ca e9 a2 ....C.....#&A... 00:20:33.917 000000e0 43 33 95 4b e5 c1 7e da 6e 1f 26 f5 90 81 eb 18 C3.K..~.n.&..... 00:20:33.917 000000f0 f5 be d4 b5 45 0e 10 9d 99 37 3a ba d4 6a 36 63 ....E....7:..j6c 00:20:33.917 00000100 a4 e8 d4 c5 e1 31 d9 d0 3f f3 c2 8a a3 76 d4 c4 .....1..?....v.. 00:20:33.917 00000110 68 6f 07 aa fb 9d ac 54 ca 8c 82 b6 ae f1 f5 38 ho.....T.......8 00:20:33.917 00000120 60 f7 66 1a 99 a0 33 f1 de d5 eb eb 6b 29 5c 81 `.f...3.....k)\. 00:20:33.917 00000130 c1 d7 ee ab 64 73 05 4b ee ff 33 48 cf fb 5d f6 ....ds.K..3H..]. 00:20:33.917 00000140 00 bb af 3c 6d 28 86 64 c9 a6 53 c1 9f ab 89 d4 ...{.. 00:20:33.918 000000e0 4f 0d 82 4a 27 98 5a c8 e1 b0 73 af fa e5 f8 60 O..J'.Z...s....` 00:20:33.918 000000f0 b2 9f 7f b4 67 7f af 42 8f 71 76 7b 68 ae 3e 88 ....g..B.qv{h.>. 00:20:33.918 00000100 a2 cc fd 2f 6e 6c 8f ca 45 18 50 fd b0 e4 ab d4 .../nl..E.P..... 
00:20:33.918 00000110 74 f5 7f 56 0c e3 23 1e ab 5a 3b b1 a6 97 c7 bb t..V..#..Z;..... 00:20:33.918 00000120 fe 30 9c 15 37 db d5 45 5a 30 a6 48 c0 d7 f8 48 .0..7..EZ0.H...H 00:20:33.918 00000130 1b b6 0c 74 80 28 c3 92 8d 06 ff e9 68 56 36 de ...t.(......hV6. 00:20:33.918 00000140 3c b4 f9 52 7a 14 09 9c 2a c6 4a de 3d 65 37 77 <..Rz...*.J.=e7w 00:20:33.918 00000150 34 aa e5 1c f3 14 71 c6 c2 ff 02 c1 00 c1 e6 7b 4.....q........{ 00:20:33.918 00000160 59 c0 bf e3 0d 4d 23 e8 5d 49 3e d8 cf bf d2 04 Y....M#.]I>..... 00:20:33.918 00000170 84 8e e7 0a 43 10 56 47 10 0c ff a0 10 8b 30 7e ....C.VG......0~ 00:20:33.918 00000180 45 09 40 c2 75 74 2f b4 21 96 d4 e6 dc 7c eb b3 E.@.ut/.!....|.. 00:20:33.918 00000190 29 c4 64 5f e4 89 0f 15 cb f2 5f b4 f2 06 12 cc ).d_......_..... 00:20:33.918 000001a0 01 8c 04 55 74 0a 04 10 de ae 74 00 ca e1 aa 4c ...Ut.....t....L 00:20:33.918 000001b0 8a 4f 93 e5 ff 69 d3 ec b4 a9 69 0a d5 41 63 e6 .O...i....i..Ac. 00:20:33.918 000001c0 e3 6b 71 fb 79 4e b5 82 95 da 09 92 73 03 71 e4 .kq.yN......s.q. 00:20:33.918 000001d0 62 da 81 50 23 a3 5f 28 92 b3 d2 a1 95 6b 1a 08 b..P#._(.....k.. 00:20:33.918 000001e0 fc 3a f4 c7 67 53 b7 b5 51 99 58 c2 71 70 f9 97 .:..gS..Q.X.qp.. 00:20:33.918 000001f0 0a 59 3d 2f 1d bd 16 ef 9a 28 2a 03 dc 4c 4f db .Y=/.....(*..LO. 00:20:33.918 host pubkey: 00:20:33.918 00000000 ea 36 ba a3 5b 6d 07 c5 5f 8d bb cc b4 d6 5f 43 .6..[m.._....._C 00:20:33.918 00000010 9a e2 ed c5 52 bc 3e 21 71 b6 99 57 85 60 ad cc ....R.>!q..W.`.. 00:20:33.918 00000020 ef e8 67 c0 98 34 8c 6f af b5 1d c5 64 88 a9 8d ..g..4.o....d... 00:20:33.918 00000030 a3 43 d9 43 67 1a 15 18 70 2e b5 96 19 37 05 26 .C.Cg...p....7.& 00:20:33.918 00000040 1b 9b 4c ea 27 42 7d 65 7a 86 7e b6 f3 46 76 ca ..L.'B}ez.~..Fv. 00:20:33.918 00000050 08 61 51 f3 8e dd 7e 7f 94 5b d1 df af 9c 7c 0b .aQ...~..[....|. 00:20:33.918 00000060 85 08 39 a3 18 6d b8 e9 eb 79 09 5e ad ff ad d9 ..9..m...y.^.... 00:20:33.918 00000070 8c 43 87 57 73 94 40 8b 0a 6e 38 92 27 5c c1 f7 .C.Ws.@..n8.'\.. 00:20:33.918 00000080 72 dd 6b 8c da 1b a7 53 33 2d 43 d8 03 83 73 36 r.k....S3-C...s6 00:20:33.918 00000090 ce 55 42 53 13 b0 06 51 7b 50 a6 0b 86 0c b4 cc .UBS...Q{P...... 00:20:33.918 000000a0 65 43 fb d2 fd b1 2e 8b 28 46 bf f0 9f ca 7f b9 eC......(F...... 00:20:33.918 000000b0 91 14 eb 99 03 37 0a 8b b2 c9 0d 89 0f 95 e0 8f .....7.......... 00:20:33.918 000000c0 39 6c 3d 45 b0 60 1e 21 9c 2e 99 72 01 d4 fb 2e 9l=E.`.!...r.... 00:20:33.918 000000d0 45 2f 0b f5 d6 97 57 c3 be 5a a8 63 e2 df 0c c6 E/....W..Z.c.... 00:20:33.918 000000e0 63 bd 08 6b 23 64 96 b9 4a 8e fb ce e4 4b c7 c6 c..k#d..J....K.. 00:20:33.918 000000f0 f2 79 87 16 60 58 fc 3c 95 b6 ef d4 c5 eb 79 c5 .y..`X.<......y. 00:20:33.918 00000100 4d cf 75 7b 0e 63 8e 28 3c 5d 3a 8c 4c 31 cb e5 M.u{.c.(<]:.L1.. 00:20:33.918 00000110 86 01 28 dc da 9e db 05 20 b9 31 e6 c3 4f 9e d7 ..(..... .1..O.. 00:20:33.918 00000120 44 73 b9 74 83 ea 51 f5 bd de 74 17 00 29 9f e0 Ds.t..Q...t..).. 00:20:33.918 00000130 82 41 08 66 9f a6 67 73 91 c3 e9 59 e6 a6 05 66 .A.f..gs...Y...f 00:20:33.918 00000140 56 a2 42 84 bf 1a 86 ff de d5 3a 5b 80 70 c3 74 V.B.......:[.p.t 00:20:33.918 00000150 85 99 71 b9 80 db 9d 85 74 da b6 e5 6e df b9 72 ..q.....t...n..r 00:20:33.918 00000160 77 50 b6 3a 81 8d 2c 0a ee 2e b6 67 6f 97 35 ea wP.:..,....go.5. 00:20:33.918 00000170 1c 0d e6 30 7b c9 49 e1 bb 3e f1 67 cd 19 9e d7 ...0{.I..>.g.... 00:20:33.918 00000180 6c 24 e3 72 ce b2 54 1a a3 f3 36 21 c1 cb e6 0a l$.r..T...6!.... 
00:20:33.918 00000190 2e 79 36 0a c4 0b 1d 11 d3 32 62 2d b4 a1 f9 5e .y6......2b-...^ 00:20:33.918 000001a0 2e 96 47 4d 8b c3 09 c8 b0 a6 21 de 66 37 99 bd ..GM......!.f7.. 00:20:33.918 000001b0 4b dd 22 67 4d 7b 5d aa 5a f0 e4 ea 06 7f 8f 48 K."gM{].Z......H 00:20:33.918 000001c0 21 a0 d2 a4 21 16 8c 21 a0 3a d7 45 6b b3 bd 72 !...!..!.:.Ek..r 00:20:33.918 000001d0 21 fa ae 75 f1 91 1e a4 53 a9 32 e9 5e b9 15 78 !..u....S.2.^..x 00:20:33.918 000001e0 3b b7 f5 37 09 30 77 ef 16 b2 17 13 01 0e 7d d1 ;..7.0w.......}. 00:20:33.918 000001f0 fc 81 90 97 0f 3e 01 35 3a 05 fc 43 54 6a 37 a4 .....>.5:..CTj7. 00:20:33.918 dh secret: 00:20:33.918 00000000 4a c7 56 29 0d 36 7d f9 e4 59 76 34 25 24 a6 38 J.V).6}..Yv4%$.8 00:20:33.918 00000010 e2 0d f3 cc 04 c8 28 f2 76 31 f8 a9 60 3e 85 1e ......(.v1..`>.. 00:20:33.918 00000020 fd d5 36 45 d5 3d 9c 8b f2 ae 8a 80 06 2f d2 3b ..6E.=......./.; 00:20:33.918 00000030 19 5e d4 5c f8 dd 89 2d 7c 43 00 fc ce 5f b8 4f .^.\...-|C..._.O 00:20:33.918 00000040 99 dc f3 0c 33 12 a2 46 57 da 30 cc d8 10 b2 97 ....3..FW.0..... 00:20:33.918 00000050 70 2a 29 4e 4c b7 0e 13 9a 9b 36 8f 5f fa 7e 2c p*)NL.....6._.~, 00:20:33.918 00000060 8f 92 be 30 bc 3a a6 de 7c 1f cb 0c 59 ba e2 d9 ...0.:..|...Y... 00:20:33.918 00000070 33 d5 7a 36 3f 46 39 f3 03 cc 77 82 7d e1 55 42 3.z6?F9...w.}.UB 00:20:33.918 00000080 16 3b 79 59 93 74 eb 88 05 3a 2b 0a 3d 10 79 68 .;yY.t...:+.=.yh 00:20:33.918 00000090 bc c9 60 26 78 20 3c 9d e0 1f 3b 20 56 95 c3 93 ..`&x <...; V... 00:20:33.918 000000a0 c6 bf fb 1e 09 ad ae b8 98 47 1b 6a 69 75 1a a3 .........G.jiu.. 00:20:33.918 000000b0 82 14 9e 66 65 fc ac 60 b7 bf 1b 24 75 f6 9d 26 ...fe..`...$u..& 00:20:33.918 000000c0 54 5a 4f f3 91 e0 26 db 7d d4 66 16 93 4c 36 fa TZO...&.}.f..L6. 00:20:33.918 000000d0 d5 b6 b4 9f 03 ab 7c a4 3a 64 4a 85 ed c0 14 d8 ......|.:dJ..... 00:20:33.918 000000e0 59 c8 17 58 cf 75 40 30 e0 e7 e6 ec 27 03 41 7a Y..X.u@0....'.Az 00:20:33.918 000000f0 9a 9e 4a ed fa 44 18 73 d1 b9 f7 4a 74 23 bb 37 ..J..D.s...Jt#.7 00:20:33.918 00000100 f2 ae 28 42 75 f9 da ea 73 c2 41 24 6b 56 ab d0 ..(Bu...s.A$kV.. 00:20:33.918 00000110 4f 62 43 5e ff e0 0b 13 03 76 43 e0 2d 45 a7 25 ObC^.....vC.-E.% 00:20:33.918 00000120 a9 43 cb ac a6 3a cb d7 f8 b7 25 ad b5 22 87 31 .C...:....%..".1 00:20:33.918 00000130 7f 1f 41 e0 94 fc 10 5c dc f0 ba 3a 4e 82 6f 82 ..A....\...:N.o. 00:20:33.918 00000140 75 39 6e c9 be 3e 01 e4 fd 6a 8f df 39 20 89 50 u9n..>...j..9 .P 00:20:33.918 00000150 cc a7 12 6a eb e4 1b e8 1a 6b 00 39 e8 36 5e f3 ...j.....k.9.6^. 00:20:33.918 00000160 80 ef c1 3d 9e 5f de 19 fc e5 e8 da d1 a8 50 20 ...=._........P 00:20:33.918 00000170 01 93 07 af 2d 91 48 66 f3 0d 37 78 9a 76 ad e3 ....-.Hf..7x.v.. 00:20:33.918 00000180 1d 9f 23 9f 83 ab ba b8 38 78 9b 0c 25 2a 54 74 ..#.....8x..%*Tt 00:20:33.918 00000190 d1 73 54 ff 2b 11 11 1b 06 92 79 ee b4 a7 cd 53 .sT.+.....y....S 00:20:33.918 000001a0 6f 9e ab 06 3f 75 13 32 e9 84 d4 1d 5f eb ce 1a o...?u.2...._... 00:20:33.918 000001b0 93 29 0a 70 01 3e 11 46 36 5d 53 2e 1e 6c 7e aa .).p.>.F6]S..l~. 00:20:33.918 000001c0 fe 0a 71 6f e3 1a e6 f6 be 81 e0 27 d0 16 a1 b0 ..qo.......'.... 00:20:33.918 000001d0 2c 94 15 d8 69 8d 55 fa 81 55 80 ea d8 06 d4 e3 ,...i.U..U...... 00:20:33.918 000001e0 39 5f 4d 8c c7 e6 2d e5 c5 46 0b 0b 90 cb dd 5d 9_M...-..F.....] 
00:20:33.918 000001f0 16 f1 18 41 12 54 8b a4 34 a9 b5 2f 6f 1c 28 29 ...A.T..4../o.() 00:20:33.918 [2024-09-27 15:25:18.451202] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=2, dhgroup=3, seq=3428451771, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.918 [2024-09-27 15:25:18.467928] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.918 [2024-09-27 15:25:18.467966] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.918 [2024-09-27 15:25:18.467981] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.918 [2024-09-27 15:25:18.468001] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.918 [2024-09-27 15:25:18.468014] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.918 [2024-09-27 15:25:18.573987] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.918 [2024-09-27 15:25:18.574005] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.918 [2024-09-27 15:25:18.574012] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.918 [2024-09-27 15:25:18.574025] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.918 [2024-09-27 15:25:18.574079] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.918 ctrlr pubkey: 00:20:33.918 00000000 27 52 eb ce bd 65 d2 e1 db 1a 53 20 ac 04 43 c3 'R...e....S ..C. 00:20:33.918 00000010 ee dd b0 a3 86 e9 82 9f a2 a5 32 d9 6a 76 31 74 ..........2.jv1t 00:20:33.918 00000020 fe 62 b8 ee 0a 35 5e f2 c1 a3 c4 df 27 50 dd a2 .b...5^.....'P.. 00:20:33.918 00000030 e4 94 f9 a3 71 a3 10 22 11 e3 d9 77 1f fe f0 fc ....q.."...w.... 00:20:33.918 00000040 81 c8 34 43 b8 db 8a 6e e2 91 32 14 47 83 98 d0 ..4C...n..2.G... 00:20:33.918 00000050 5b 5d f7 60 ff 2f 13 13 f6 3a 1d 61 3a 7f 32 6c [].`./...:.a:.2l 00:20:33.918 00000060 dd b1 6e 26 ac e7 5f 39 a1 11 1d 8a d3 16 7d c4 ..n&.._9......}. 00:20:33.918 00000070 96 43 e4 8c 88 1f 55 89 4d 06 ad 24 ed 34 fe 93 .C....U.M..$.4.. 00:20:33.918 00000080 9a 4d 87 e1 99 b3 2f 7b 18 3f cb 1c 0a da 86 df .M..../{.?...... 00:20:33.918 00000090 b2 ae 88 5e 15 ae 3b 99 85 ea 58 9e 53 d4 f7 e2 ...^..;...X.S... 00:20:33.918 000000a0 02 a3 47 93 30 3b 1f 0e 0a a9 65 79 19 be 58 c2 ..G.0;....ey..X. 00:20:33.919 000000b0 15 6d f2 69 72 af 4b d4 a7 5f 25 44 fd 12 ba ac .m.ir.K.._%D.... 00:20:33.919 000000c0 3f 0f f0 d9 0c ea 1f 52 a5 e3 2b 6a 39 46 d7 57 ?......R..+j9F.W 00:20:33.919 000000d0 6c b8 fa 77 f2 f4 22 03 71 52 59 4f 3e 7b 9b 16 l..w..".qRYO>{.. 00:20:33.919 000000e0 4f 0d 82 4a 27 98 5a c8 e1 b0 73 af fa e5 f8 60 O..J'.Z...s....` 00:20:33.919 000000f0 b2 9f 7f b4 67 7f af 42 8f 71 76 7b 68 ae 3e 88 ....g..B.qv{h.>. 
00:20:33.919 00000100 a2 cc fd 2f 6e 6c 8f ca 45 18 50 fd b0 e4 ab d4 .../nl..E.P..... 00:20:33.919 00000110 74 f5 7f 56 0c e3 23 1e ab 5a 3b b1 a6 97 c7 bb t..V..#..Z;..... 00:20:33.919 00000120 fe 30 9c 15 37 db d5 45 5a 30 a6 48 c0 d7 f8 48 .0..7..EZ0.H...H 00:20:33.919 00000130 1b b6 0c 74 80 28 c3 92 8d 06 ff e9 68 56 36 de ...t.(......hV6. 00:20:33.919 00000140 3c b4 f9 52 7a 14 09 9c 2a c6 4a de 3d 65 37 77 <..Rz...*.J.=e7w 00:20:33.919 00000150 34 aa e5 1c f3 14 71 c6 c2 ff 02 c1 00 c1 e6 7b 4.....q........{ 00:20:33.919 00000160 59 c0 bf e3 0d 4d 23 e8 5d 49 3e d8 cf bf d2 04 Y....M#.]I>..... 00:20:33.919 00000170 84 8e e7 0a 43 10 56 47 10 0c ff a0 10 8b 30 7e ....C.VG......0~ 00:20:33.919 00000180 45 09 40 c2 75 74 2f b4 21 96 d4 e6 dc 7c eb b3 E.@.ut/.!....|.. 00:20:33.919 00000190 29 c4 64 5f e4 89 0f 15 cb f2 5f b4 f2 06 12 cc ).d_......_..... 00:20:33.919 000001a0 01 8c 04 55 74 0a 04 10 de ae 74 00 ca e1 aa 4c ...Ut.....t....L 00:20:33.919 000001b0 8a 4f 93 e5 ff 69 d3 ec b4 a9 69 0a d5 41 63 e6 .O...i....i..Ac. 00:20:33.919 000001c0 e3 6b 71 fb 79 4e b5 82 95 da 09 92 73 03 71 e4 .kq.yN......s.q. 00:20:33.919 000001d0 62 da 81 50 23 a3 5f 28 92 b3 d2 a1 95 6b 1a 08 b..P#._(.....k.. 00:20:33.919 000001e0 fc 3a f4 c7 67 53 b7 b5 51 99 58 c2 71 70 f9 97 .:..gS..Q.X.qp.. 00:20:33.919 000001f0 0a 59 3d 2f 1d bd 16 ef 9a 28 2a 03 dc 4c 4f db .Y=/.....(*..LO. 00:20:33.919 host pubkey: 00:20:33.919 00000000 40 2c 4e 27 a6 64 ac 8c c0 02 a2 68 75 68 4c bd @,N'.d.....huhL. 00:20:33.919 00000010 ca 89 76 bc 04 81 76 9a 14 d9 17 49 e0 c8 04 ab ..v...v....I.... 00:20:33.919 00000020 0d fa 42 d0 90 6c 34 76 6f e5 5d 5d 72 5b 8c a6 ..B..l4vo.]]r[.. 00:20:33.919 00000030 7a 1f 33 bb 22 1c 82 8b d7 f8 94 eb b4 c2 9d ae z.3."........... 00:20:33.919 00000040 f8 67 7f b8 83 02 2b e9 33 dd ad 57 f4 67 6d 47 .g....+.3..W.gmG 00:20:33.919 00000050 d6 74 33 37 7a 2b 82 6c fe b8 ad 1d a1 d3 46 01 .t37z+.l......F. 00:20:33.919 00000060 ab 4b c0 9e 12 28 94 11 1e 20 78 e6 ab df 1f ef .K...(... x..... 00:20:33.919 00000070 92 3d 19 a9 10 a5 d9 e7 0b ed be b0 cf 53 04 36 .=...........S.6 00:20:33.919 00000080 e6 05 fe 9b 00 31 29 eb 8b 53 cc dd ba d1 ab 5b .....1)..S.....[ 00:20:33.919 00000090 0a 18 c7 ef 75 41 67 57 e5 af 44 45 c1 dd bb 58 ....uAgW..DE...X 00:20:33.919 000000a0 bc 09 2b 40 ef 7f 7d b8 e2 08 60 7a c4 51 a9 d2 ..+@..}...`z.Q.. 00:20:33.919 000000b0 f2 29 4f 4e a4 22 6f 9d 48 2a 30 11 00 72 c7 b2 .)ON."o.H*0..r.. 00:20:33.919 000000c0 7e c4 9e cd cd 5e 83 fb 9f 36 c7 20 b0 a0 f5 7d ~....^...6. ...} 00:20:33.919 000000d0 9c 11 27 5a ca fd 82 51 2f 2c 20 0c d0 38 72 68 ..'Z...Q/, ..8rh 00:20:33.919 000000e0 46 be 76 02 e3 d3 ba 35 f7 7c 89 0d 93 5f 1c 3c F.v....5.|..._.< 00:20:33.919 000000f0 73 ca 0b 42 2d b4 4c 41 28 b1 51 88 29 07 ef 77 s..B-.LA(.Q.)..w 00:20:33.919 00000100 68 65 39 68 1d b8 ba 57 05 e8 f9 22 fa fa 93 6b he9h...W..."...k 00:20:33.919 00000110 3c d2 d3 e1 3b 40 d5 a9 d6 2c a1 ab fd a9 88 0f <...;@...,...... 00:20:33.919 00000120 8f e0 87 4b cc 91 d8 2a 25 72 e7 66 60 04 fc b6 ...K...*%r.f`... 00:20:33.919 00000130 6d 13 d4 cf a7 ab c5 02 43 bd c6 85 9e 7c 26 fb m.......C....|&. 00:20:33.919 00000140 53 37 e7 10 41 e5 dd e6 c1 7a 2c 6d 72 d6 a4 be S7..A....z,mr... 00:20:33.919 00000150 5b f0 17 0b c7 7a 42 c3 01 11 8a b5 8e da c2 b2 [....zB......... 00:20:33.919 00000160 d4 1a 3b 00 ed b6 72 65 54 40 68 78 71 37 35 79 ..;...reT@hxq75y 00:20:33.919 00000170 69 b3 ea ca 0f 1b f3 5e c5 bf cf c3 64 52 b0 19 i......^....dR.. 
00:20:33.919 00000180 d1 d8 e9 c8 c7 49 dc 82 f8 fc eb 50 5a e3 a1 7d .....I.....PZ..} 00:20:33.919 00000190 2d 88 93 76 5f 69 5f 11 d0 d2 2d 68 e7 9a 84 52 -..v_i_...-h...R 00:20:33.919 000001a0 0e d2 87 7b 03 32 18 65 0b 43 ce 81 9f f0 00 f9 ...{.2.e.C...... 00:20:33.919 000001b0 ba 83 20 05 ac ab 9b bc a8 c3 04 15 fb 13 b3 17 .. ............. 00:20:33.919 000001c0 91 50 2b a1 f0 ab 71 5a a1 9b 95 83 43 90 ce 8a .P+...qZ....C... 00:20:33.919 000001d0 af a0 6a 72 2e dd 6f a9 98 f4 db fb 60 b1 02 62 ..jr..o.....`..b 00:20:33.919 000001e0 68 6a 1c b6 b7 56 ef a1 01 aa 83 30 98 50 0f 85 hj...V.....0.P.. 00:20:33.919 000001f0 4b 1c 5a e2 e9 f9 ae e6 3d b4 69 e2 5b 30 e3 d4 K.Z.....=.i.[0.. 00:20:33.919 dh secret: 00:20:33.919 00000000 10 04 66 df ad f3 cb eb ac 2c d8 9f 3d da 6b f7 ..f......,..=.k. 00:20:33.919 00000010 0d c4 78 59 80 fb b3 6a f8 93 1d 1b ee 1b 5e a3 ..xY...j......^. 00:20:33.919 00000020 ab 44 c6 30 45 92 60 7b 32 01 44 7e 6d 53 c3 89 .D.0E.`{2.D~mS.. 00:20:33.919 00000030 dc 1b 8b 9f 94 3a 7c 3e 85 ba 65 d0 bb 2e 52 8b .....:|>..e...R. 00:20:33.919 00000040 e2 c0 78 50 32 46 22 49 e2 b6 c7 a2 92 4d 91 46 ..xP2F"I.....M.F 00:20:33.919 00000050 0c f4 3e a9 ba a4 f4 4c cd fb 23 ea e9 42 2f a8 ..>....L..#..B/. 00:20:33.919 00000060 47 0c b5 a3 9b b5 a7 78 5a f0 55 fd f3 01 37 77 G......xZ.U...7w 00:20:33.919 00000070 11 d3 9f 87 f2 1d 56 92 91 97 0b c9 cf 3d d1 24 ......V......=.$ 00:20:33.919 00000080 3f 32 d4 1f d3 d6 3a f8 ad 4c da fb 28 b1 04 c0 ?2....:..L..(... 00:20:33.919 00000090 26 a3 77 25 2a e0 21 50 d0 4f 8e 6d eb 25 a5 10 &.w%*.!P.O.m.%.. 00:20:33.919 000000a0 aa ca 06 f1 d7 fa 0d c2 c7 27 29 68 72 c4 58 9e .........')hr.X. 00:20:33.919 000000b0 8a 75 23 aa 2e b7 eb 53 44 01 87 26 43 47 70 a2 .u#....SD..&CGp. 00:20:33.919 000000c0 8d e6 cc 7d 7e a5 bd ee 54 5d 10 4b 2a 97 a4 2b ...}~...T].K*..+ 00:20:33.919 000000d0 28 4c 42 15 c2 ad 3f 47 69 9c f6 07 b2 ad 4e 7b (LB...?Gi.....N{ 00:20:33.919 000000e0 0d a2 49 f2 78 47 26 87 30 22 17 35 c6 f1 43 14 ..I.xG&.0".5..C. 00:20:33.919 000000f0 f2 ec 79 e6 1a 63 7f c3 33 f0 df 2f 30 74 a6 7c ..y..c..3../0t.| 00:20:33.919 00000100 5a 40 2f 88 99 34 06 79 7f a6 83 47 19 8e 32 0b Z@/..4.y...G..2. 00:20:33.919 00000110 14 c6 8c dd c6 68 54 87 1a 8d b3 e4 a2 2a 2b e3 .....hT......*+. 00:20:33.919 00000120 6c c9 aa 8d 96 30 8a 12 b1 4a e2 84 c5 f4 06 45 l....0...J.....E 00:20:33.919 00000130 7e d6 2d a7 84 03 b2 14 c4 ef 0e a5 7f ce 33 55 ~.-...........3U 00:20:33.919 00000140 fe 70 b0 12 a3 23 f9 38 03 a2 82 9e 90 0f 94 ee .p...#.8........ 00:20:33.919 00000150 b3 00 f3 72 39 5b 9e b9 41 cf 39 eb 60 ae 53 47 ...r9[..A.9.`.SG 00:20:33.919 00000160 b3 56 d3 5f 2e 04 57 17 a3 32 93 9b 11 20 d1 66 .V._..W..2... .f 00:20:33.919 00000170 39 24 0b ec 41 4f 1c 2d 36 18 c4 b1 19 ea ac cf 9$..AO.-6....... 00:20:33.919 00000180 85 0e c4 ca 43 28 da 6f 8e d5 c5 65 a3 b4 40 b0 ....C(.o...e..@. 00:20:33.919 00000190 6a e1 db e0 47 81 73 68 60 01 de a8 66 d9 17 41 j...G.sh`...f..A 00:20:33.919 000001a0 ed b3 c1 6d cc b4 96 3e 2c d0 e9 63 63 c8 9d 7f ...m...>,..cc... 00:20:33.919 000001b0 b2 9c 18 23 3c a8 45 7d 70 12 a3 a8 50 f8 54 11 ...#<.E}p...P.T. 00:20:33.919 000001c0 83 0d b8 e3 40 4e a7 59 7b a9 f6 1e 5a a1 af f6 ....@N.Y{...Z... 00:20:33.919 000001d0 fd 71 4c 01 5f d8 10 9e e8 d9 0c d0 be 50 49 10 .qL._........PI. 00:20:33.919 000001e0 68 be 82 ac cf c9 06 4c 6d 36 4a 3b 33 f5 5a e4 h......Lm6J;3.Z. 00:20:33.919 000001f0 43 08 28 ce 85 68 f4 99 8b dd 8c 4b ca 46 b2 97 C.(..h.....K.F.. 
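Note on the repeated pattern in this trace: the dh secret dump above closes one ffdhe4096 exchange, and the log then repeats the same DH-HMAC-CHAP state progression for the next qpair — negotiate, await-negotiate, await-challenge, await-reply, await-success1, await-success2, done (a few exchanges further down, e.g. the key4/tid=0 one, go straight from await-success1 to done, so the success2 step is not always printed). The sketch below is purely illustrative: it is not SPDK's nvme_auth.c, the enum and function names are invented for this note, and it only mirrors the order of the "auth state:" lines seen here.

/* Illustrative only: a toy walk over the "auth state:" sequence printed
 * in the trace. SPDK's real implementation lives in nvme_auth.c; the
 * names below are made up for this sketch. */
#include <stdio.h>

enum auth_state {
	AUTH_NEGOTIATE,
	AUTH_AWAIT_NEGOTIATE,
	AUTH_AWAIT_CHALLENGE,
	AUTH_AWAIT_REPLY,
	AUTH_AWAIT_SUCCESS1,
	AUTH_AWAIT_SUCCESS2,
	AUTH_DONE
};

static const char *auth_state_name[] = {
	"negotiate", "await-negotiate", "await-challenge",
	"await-reply", "await-success1", "await-success2", "done"
};

int main(void)
{
	/* Walk the states in the order the debug lines report them. */
	for (int s = AUTH_NEGOTIATE; s <= AUTH_DONE; s++)
		printf("auth state: %s\n", auth_state_name[s]);
	return 0;
}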
00:20:33.919 [2024-09-27 15:25:18.589771] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=2, dhgroup=3, seq=3428451772, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.919 [2024-09-27 15:25:18.589874] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.919 [2024-09-27 15:25:18.627065] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.919 [2024-09-27 15:25:18.627107] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.919 [2024-09-27 15:25:18.627117] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.919 [2024-09-27 15:25:18.627143] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.919 [2024-09-27 15:25:18.784460] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.919 [2024-09-27 15:25:18.784479] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.919 [2024-09-27 15:25:18.784487] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.919 [2024-09-27 15:25:18.784534] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.919 [2024-09-27 15:25:18.784558] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.919 ctrlr pubkey: 00:20:33.919 00000000 83 52 a7 d3 18 e0 a5 55 5a 9a 32 ed a9 a7 e8 c7 .R.....UZ.2..... 00:20:33.919 00000010 05 0b e0 bf 63 15 67 3f f3 c5 36 bd 6c 08 65 7b ....c.g?..6.l.e{ 00:20:33.919 00000020 23 0d 29 06 fe ea b7 ab ac 12 1f 4d 4f 35 0e 4c #.)........MO5.L 00:20:33.919 00000030 fa dc 3a 58 06 9f b4 85 ce 52 7d 79 a8 46 29 0b ..:X.....R}y.F). 00:20:33.919 00000040 ae e5 c0 d9 ad 29 3f bb 6b 17 1a eb 77 49 dd 10 .....)?.k...wI.. 00:20:33.919 00000050 df 3b 95 f1 93 75 04 79 4d a7 87 86 82 87 54 d3 .;...u.yM.....T. 00:20:33.919 00000060 eb 69 5e 4d 17 7b 53 22 5c 9a fa cf 1b 6d 55 dd .i^M.{S"\....mU. 00:20:33.919 00000070 cc 73 c3 5c 83 e4 b3 bc 41 0e df 5d 0a 4b 20 d5 .s.\....A..].K . 00:20:33.919 00000080 09 e3 03 24 2d 16 6b 26 47 5d 75 ba 5b 7c 9f 09 ...$-.k&G]u.[|.. 00:20:33.919 00000090 9f 91 a6 45 c2 89 96 17 e6 e1 ad 77 4b 71 1a a1 ...E.......wKq.. 00:20:33.919 000000a0 7e 92 d1 71 a2 f6 ec cc 7e ba 01 e3 eb 46 14 9d ~..q....~....F.. 00:20:33.919 000000b0 5e 2b ca 28 56 df d0 8e 2d 11 ec c9 ca 26 77 88 ^+.(V...-....&w. 00:20:33.920 000000c0 05 c3 5b 19 af 8e dc e8 f5 ea 94 6f f9 4a 38 3f ..[........o.J8? 00:20:33.920 000000d0 09 37 c5 b8 a4 21 1e 5a ec 77 94 db dc b2 87 49 .7...!.Z.w.....I 00:20:33.920 000000e0 11 b6 9c 54 54 9a d5 6a 14 ef 39 63 a8 2e 9e 3c ...TT..j..9c...< 00:20:33.920 000000f0 e1 8c 5c c5 d2 84 ad a5 a0 3f 70 ba 78 66 2e a9 ..\......?p.xf.. 
00:20:33.920 00000100 df 9a c2 9e 80 47 06 cb df 3f e2 39 26 4c e8 76 .....G...?.9&L.v 00:20:33.920 00000110 80 10 e5 1b a6 7c cf 36 2d 63 45 a1 71 62 2f f3 .....|.6-cE.qb/. 00:20:33.920 00000120 73 73 e0 d5 43 7a ff aa ba e1 49 5d 4e 98 d2 34 ss..Cz....I]N..4 00:20:33.920 00000130 4a 2c 16 51 ac 86 c2 00 fb 3c 6a 1b b7 61 90 ea J,.Q.....,.....(.'.S@a 00:20:33.920 000001d0 42 95 7f aa ae c5 d7 13 9e 49 34 9a 21 e5 08 61 B........I4.!..a 00:20:33.920 000001e0 ce 68 04 f9 45 06 8b 09 19 82 74 67 9d 97 6e 11 .h..E.....tg..n. 00:20:33.920 000001f0 d5 8a e2 5f fc 60 76 06 54 a5 5d d0 36 ee c6 e5 ..._.`v.T.].6... 00:20:33.920 host pubkey: 00:20:33.920 00000000 75 b3 b8 29 81 90 73 62 1c 56 77 e1 ab 73 a7 04 u..)..sb.Vw..s.. 00:20:33.920 00000010 0d 86 a8 2b 6d 0b 2c 7b 89 1a 1a eb a4 c2 c2 78 ...+m.,{.......x 00:20:33.920 00000020 90 52 ca 7f f2 04 c1 95 49 66 1e 70 67 a0 ed eb .R......If.pg... 00:20:33.920 00000030 59 cc e5 89 b9 37 2c 50 0a ed fe 95 ac 75 51 3b Y....7,P.....uQ; 00:20:33.920 00000040 e7 d6 f7 fb 39 33 ee b1 d3 95 ab d4 8d 59 97 3a ....93.......Y.: 00:20:33.920 00000050 37 24 95 8c 5e c1 af 4b fa 45 22 f2 c8 47 d5 2a 7$..^..K.E"..G.* 00:20:33.920 00000060 cd 7c ae b1 d4 a3 4f 32 9c 89 30 ce 04 3b cb 77 .|....O2..0..;.w 00:20:33.920 00000070 68 0b 05 25 0d 35 41 1d 5e a3 ae 9e c3 81 87 f1 h..%.5A.^....... 00:20:33.920 00000080 1b f6 5e 5f 02 99 6e 34 03 71 4b 39 4e 73 d6 ef ..^_..n4.qK9Ns.. 00:20:33.920 00000090 33 45 d7 58 b2 41 42 04 97 16 7b bf 42 9b 3e 36 3E.X.AB...{.B.>6 00:20:33.920 000000a0 60 a7 55 51 80 b8 90 d5 5f a3 c3 16 67 94 d5 18 `.UQ...._...g... 00:20:33.920 000000b0 40 f1 13 7f f5 7c 71 86 30 dd 60 4a ad c3 85 e3 @....|q.0.`J.... 00:20:33.920 000000c0 aa 1f 26 e3 08 a9 b0 76 da 76 1b 88 05 12 28 29 ..&....v.v....() 00:20:33.920 000000d0 8d f0 9f 66 98 38 0f 73 2a db 52 14 d7 e8 69 db ...f.8.s*.R...i. 00:20:33.920 000000e0 da 79 02 d5 f2 db a8 fc 37 19 07 9f b7 52 41 e0 .y......7....RA. 00:20:33.920 000000f0 8c 5b 73 97 5a 39 27 a5 23 94 c1 ed 00 86 1b 43 .[s.Z9'.#......C 00:20:33.920 00000100 0a 56 85 f8 b0 eb 32 87 82 a5 a9 2a 14 11 46 bc .V....2....*..F. 00:20:33.920 00000110 e8 85 e9 ab 34 7c c4 c9 93 36 21 c1 3d 7d cd 53 ....4|...6!.=}.S 00:20:33.920 00000120 ba e2 5c ab 90 28 8c a1 13 39 88 29 f3 da 54 f4 ..\..(...9.)..T. 00:20:33.920 00000130 f9 38 45 53 32 44 e7 e0 bc c4 ed 78 f4 be ff 70 .8ES2D.....x...p 00:20:33.920 00000140 06 66 85 50 65 6b 61 fc 5a 42 05 29 8c a5 e5 dc .f.Peka.ZB.).... 00:20:33.920 00000150 96 1e 7d 21 3d fb e5 85 3d b5 21 d4 9a b1 31 58 ..}!=...=.!...1X 00:20:33.920 00000160 6b b7 84 f0 3b bf d0 48 95 b2 f7 26 49 09 83 cc k...;..H...&I... 00:20:33.920 00000170 35 a7 50 c8 64 9d 7e 64 98 6d 6f 18 ef 5c 89 d7 5.P.d.~d.mo..\.. 00:20:33.920 00000180 0b 74 a8 b1 98 18 f4 2e 54 7f a7 e3 db a9 d0 0f .t......T....... 00:20:33.920 00000190 0b 79 96 7a 0e 17 dc 60 37 e2 fb 8e 49 a1 b8 e0 .y.z...`7...I... 00:20:33.920 000001a0 e3 91 a5 05 b3 b5 d8 c0 d8 47 87 3e 7b 91 1e 65 .........G.>{..e 00:20:33.920 000001b0 2e 8a 2c a0 30 ec 95 bc e1 29 99 d0 22 2a 18 c9 ..,.0....).."*.. 00:20:33.920 000001c0 14 f9 89 7d ce 90 d6 1a 96 29 29 ba fd 4e 01 79 ...}.....))..N.y 00:20:33.920 000001d0 01 45 9c 79 4c a3 5d 98 57 f3 96 5e f7 bc 3c 68 .E.yL.].W..^... 00:20:33.920 00000070 90 28 2e 2e 99 57 14 5b 5a 4d 6f fb 15 32 87 10 .(...W.[ZMo..2.. 00:20:33.920 00000080 10 15 c8 18 51 c0 1d c0 ca 94 22 9e 1b 99 e4 89 ....Q....."..... 
00:20:33.920 00000090 1c 4d 96 5a 96 be 31 8a 3c d7 dd c5 0d f0 a7 62 .M.Z..1.<......b 00:20:33.920 000000a0 b2 b3 7b 72 bd 07 2c 0c 11 eb 74 af 52 d1 82 09 ..{r..,...t.R... 00:20:33.920 000000b0 72 a0 8e ea 13 e6 a5 47 57 9f 6e c3 d6 81 d9 81 r......GW.n..... 00:20:33.920 000000c0 77 9e 04 e2 c2 cd 05 c5 b2 51 1c 1a 18 f2 38 66 w........Q....8f 00:20:33.920 000000d0 b5 3e 03 87 01 ae 1b b2 90 13 c9 cb 89 f1 3f 7d .>............?} 00:20:33.920 000000e0 61 ac 7d 40 4c 6d 1f 52 5c 4a c4 fe 70 fc 2b 9d a.}@Lm.R\J..p.+. 00:20:33.920 000000f0 a5 1f b2 a3 0d 81 d9 fa 8b 17 67 58 df 17 0c 52 ..........gX...R 00:20:33.920 00000100 66 61 7e 67 ef a0 27 00 cb aa 3e c5 5c 4b de 1b fa~g..'...>.\K.. 00:20:33.920 00000110 23 51 64 b3 6f 3e 61 70 12 aa 6d 77 93 6e 90 d3 #Qd.o>ap..mw.n.. 00:20:33.920 00000120 e9 f8 32 19 5a f9 3d f9 c8 25 f4 e9 57 e1 d7 f4 ..2.Z.=..%..W... 00:20:33.920 00000130 a5 7d e2 ae e6 1a 9b 6b 4d ae f7 be 3d 33 06 f5 .}.....kM...=3.. 00:20:33.920 00000140 84 fb 13 67 9d cd 3d 39 89 db b4 8a 09 6e e1 4c ...g..=9.....n.L 00:20:33.920 00000150 90 f6 25 e7 30 e7 3f f7 4d 35 fe 62 a3 21 e5 1c ..%.0.?.M5.b.!.. 00:20:33.920 00000160 7e c8 5c a2 75 41 32 6d ca 9d 4b 54 cc 7b a8 93 ~.\.uA2m..KT.{.. 00:20:33.920 00000170 de 25 1b 3b 68 b6 c3 73 62 a4 fe 3b e5 bc ca 72 .%.;h..sb..;...r 00:20:33.920 00000180 d7 67 4d da c5 6e ba 4d d9 e9 a7 3a 9c 23 3c 4a .gM..n.M...:.#,...}.>.P. 00:20:33.920 000001a0 1a 2f 24 ad 15 db c7 38 62 d4 94 f9 cb 51 a8 6c ./$....8b....Q.l 00:20:33.920 000001b0 50 01 4c 7e e9 7e 2f e6 60 0f 6d c1 46 85 24 ab P.L~.~/.`.m.F.$. 00:20:33.920 000001c0 34 6d 72 0b d2 91 94 f3 42 fa 4c df aa 00 cd 63 4mr.....B.L....c 00:20:33.920 000001d0 39 08 e2 99 a9 f8 20 61 be b0 bb bf b3 ff 64 e9 9..... a......d. 00:20:33.920 000001e0 f1 7a 71 a8 ca 78 e9 fd 3f 6c a4 ab 6a b5 4b c8 .zq..x..?l..j.K. 00:20:33.920 000001f0 5c 0d 43 f0 58 cb 52 3b 46 9a be a7 d9 d0 f8 af \.C.X.R;F....... 
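The nvme_auth_send_negotiate lines decode the numeric parameters in place: digest 2 is sha384 and dhgroup 3 is ffdhe4096 in the exchanges above, with dhgroup 4 (ffdhe6144) appearing once the test moves to the larger group further down. A minimal lookup covering only the IDs that actually occur in this trace could look like the following; it is a reading aid for the log, not SPDK code, and the other digest/dhgroup IDs defined by the spec are deliberately omitted.

/* Sketch: map the numeric IDs printed in this trace to the names the
 * log prints next to them. Only values seen in this log are included. */
#include <stdio.h>

static const char *digest_name(int id)
{
	return id == 2 ? "sha384" : "unknown";
}

static const char *dhgroup_name(int id)
{
	switch (id) {
	case 3: return "ffdhe4096";
	case 4: return "ffdhe6144";
	default: return "unknown";
	}
}

int main(void)
{
	printf("digest: %d (%s)\n", 2, digest_name(2));
	printf("dhgroup: %d (%s)\n", 3, dhgroup_name(3));
	printf("dhgroup: %d (%s)\n", 4, dhgroup_name(4));
	return 0;
}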
00:20:33.920 [2024-09-27 15:25:18.801073] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=2, dhgroup=3, seq=3428451773, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.920 [2024-09-27 15:25:18.817613] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.920 [2024-09-27 15:25:18.817643] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.920 [2024-09-27 15:25:18.817659] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.920 [2024-09-27 15:25:18.817665] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.920 [2024-09-27 15:25:18.923213] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.920 [2024-09-27 15:25:18.923230] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.920 [2024-09-27 15:25:18.923238] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.920 [2024-09-27 15:25:18.923247] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.920 [2024-09-27 15:25:18.923301] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.920 ctrlr pubkey: 00:20:33.920 00000000 83 52 a7 d3 18 e0 a5 55 5a 9a 32 ed a9 a7 e8 c7 .R.....UZ.2..... 00:20:33.920 00000010 05 0b e0 bf 63 15 67 3f f3 c5 36 bd 6c 08 65 7b ....c.g?..6.l.e{ 00:20:33.920 00000020 23 0d 29 06 fe ea b7 ab ac 12 1f 4d 4f 35 0e 4c #.)........MO5.L 00:20:33.920 00000030 fa dc 3a 58 06 9f b4 85 ce 52 7d 79 a8 46 29 0b ..:X.....R}y.F). 00:20:33.920 00000040 ae e5 c0 d9 ad 29 3f bb 6b 17 1a eb 77 49 dd 10 .....)?.k...wI.. 00:20:33.920 00000050 df 3b 95 f1 93 75 04 79 4d a7 87 86 82 87 54 d3 .;...u.yM.....T. 00:20:33.920 00000060 eb 69 5e 4d 17 7b 53 22 5c 9a fa cf 1b 6d 55 dd .i^M.{S"\....mU. 00:20:33.920 00000070 cc 73 c3 5c 83 e4 b3 bc 41 0e df 5d 0a 4b 20 d5 .s.\....A..].K . 00:20:33.920 00000080 09 e3 03 24 2d 16 6b 26 47 5d 75 ba 5b 7c 9f 09 ...$-.k&G]u.[|.. 00:20:33.920 00000090 9f 91 a6 45 c2 89 96 17 e6 e1 ad 77 4b 71 1a a1 ...E.......wKq.. 00:20:33.920 000000a0 7e 92 d1 71 a2 f6 ec cc 7e ba 01 e3 eb 46 14 9d ~..q....~....F.. 00:20:33.920 000000b0 5e 2b ca 28 56 df d0 8e 2d 11 ec c9 ca 26 77 88 ^+.(V...-....&w. 00:20:33.920 000000c0 05 c3 5b 19 af 8e dc e8 f5 ea 94 6f f9 4a 38 3f ..[........o.J8? 00:20:33.920 000000d0 09 37 c5 b8 a4 21 1e 5a ec 77 94 db dc b2 87 49 .7...!.Z.w.....I 00:20:33.920 000000e0 11 b6 9c 54 54 9a d5 6a 14 ef 39 63 a8 2e 9e 3c ...TT..j..9c...< 00:20:33.920 000000f0 e1 8c 5c c5 d2 84 ad a5 a0 3f 70 ba 78 66 2e a9 ..\......?p.xf.. 00:20:33.920 00000100 df 9a c2 9e 80 47 06 cb df 3f e2 39 26 4c e8 76 .....G...?.9&L.v 00:20:33.920 00000110 80 10 e5 1b a6 7c cf 36 2d 63 45 a1 71 62 2f f3 .....|.6-cE.qb/. 
00:20:33.920 00000120 73 73 e0 d5 43 7a ff aa ba e1 49 5d 4e 98 d2 34 ss..Cz....I]N..4 00:20:33.920 00000130 4a 2c 16 51 ac 86 c2 00 fb 3c 6a 1b b7 61 90 ea J,.Q.....,.....(.'.S@a 00:20:33.921 000001d0 42 95 7f aa ae c5 d7 13 9e 49 34 9a 21 e5 08 61 B........I4.!..a 00:20:33.921 000001e0 ce 68 04 f9 45 06 8b 09 19 82 74 67 9d 97 6e 11 .h..E.....tg..n. 00:20:33.921 000001f0 d5 8a e2 5f fc 60 76 06 54 a5 5d d0 36 ee c6 e5 ..._.`v.T.].6... 00:20:33.921 host pubkey: 00:20:33.921 00000000 c0 32 b9 74 38 ff ef e3 f4 bb e5 0b ab ac 2c ca .2.t8.........,. 00:20:33.921 00000010 e0 b2 55 f3 30 67 5f 39 dc 61 36 dc c4 a8 6a b0 ..U.0g_9.a6...j. 00:20:33.921 00000020 5c 7f 8a b8 17 81 72 72 0c 61 20 33 cf 93 bf 31 \.....rr.a 3...1 00:20:33.921 00000030 90 4a 01 08 9a 1e 4a 5c c1 ed 0e ad 5c 3a 9f c2 .J....J\....\:.. 00:20:33.921 00000040 2c 86 b8 d6 5e ea 6b 79 87 73 cb 18 1e d4 43 b5 ,...^.ky.s....C. 00:20:33.921 00000050 bb 05 95 e8 f2 9e 56 4b fe 6f d5 ed f3 a9 16 80 ......VK.o...... 00:20:33.921 00000060 90 31 71 83 08 50 03 f3 9d 92 46 b3 25 26 29 07 .1q..P....F.%&). 00:20:33.921 00000070 35 9a 27 5d 1b df 80 b6 76 3d ce 78 03 4d c2 71 5.']....v=.x.M.q 00:20:33.921 00000080 68 ff a7 23 4c ea b4 93 4a fd 8c 3c e2 19 79 09 h..#L...J..<..y. 00:20:33.921 00000090 f6 85 7a c1 29 27 a7 83 16 42 40 45 e8 31 75 e2 ..z.)'...B@E.1u. 00:20:33.921 000000a0 77 f5 a1 0d 71 02 fa e9 f4 ce c2 4f ee ae de 99 w...q......O.... 00:20:33.921 000000b0 7a 8d 72 b4 ed f3 d5 fb 76 e3 43 95 7e d1 17 66 z.r.....v.C.~..f 00:20:33.921 000000c0 94 05 ff 78 db 8e e8 60 d6 2f ab 9e 3a 76 9e 76 ...x...`./..:v.v 00:20:33.921 000000d0 fe 40 57 f2 cc cf 18 8c 86 74 1c 71 07 52 45 9d .@W......t.q.RE. 00:20:33.921 000000e0 f2 be a5 da 68 05 08 f6 f0 28 e5 00 25 27 df df ....h....(..%'.. 00:20:33.921 000000f0 d9 29 f5 b3 59 fa e1 23 99 8e c3 91 72 e8 0f 88 .)..Y..#....r... 00:20:33.921 00000100 ff a5 6f 5a 7d 2f bc 2f ba 5a d0 64 8a a3 f8 a4 ..oZ}/./.Z.d.... 00:20:33.921 00000110 4c 15 81 49 1f 79 d8 d2 3c 7a 3b 18 67 1f 82 92 L..I.y..M.Hy.T 00:20:33.921 00000170 a6 70 81 98 b5 24 0c df 0d 64 96 ba ea 84 2d 02 .p...$...d....-. 00:20:33.921 00000180 10 7b e2 93 f0 74 6d 52 63 c0 52 be 04 1c 1e 51 .{...tmRc.R....Q 00:20:33.921 00000190 70 3b c4 49 17 a8 a2 b6 d2 0d 8e bc 67 b6 76 ba p;.I........g.v. 00:20:33.921 000001a0 44 3e 9c 63 ec e6 56 30 1c 37 e3 d7 d4 a1 a8 f5 D>.c..V0.7...... 00:20:33.921 000001b0 d0 01 cb 66 7f 03 9d a1 fc 74 5e ba bf 6a cb f2 ...f.....t^..j.. 00:20:33.921 000001c0 c1 66 ff 54 7c 53 bf b9 4e 65 42 ee d3 f3 2d 2d .f.T|S..NeB...-- 00:20:33.921 000001d0 b1 6c 22 ed 01 14 74 28 51 5e 55 b4 1f 1d 7f ae .l"...t(Q^U..... 00:20:33.921 000001e0 4a 71 77 9c 03 2c 04 7b fa 61 ec 73 a8 f1 d2 a7 Jqw..,.{.a.s.... 00:20:33.921 000001f0 13 65 e2 74 de 8c e1 64 02 9e bc e9 2c 0e 0d 79 .e.t...d....,..y 00:20:33.921 dh secret: 00:20:33.921 00000000 c4 8b 6f 5f 0d 44 35 9a 32 7e ce 87 34 9b d5 7f ..o_.D5.2~..4... 00:20:33.921 00000010 5a 21 1b 97 2e 00 1e 80 72 e9 9c 93 cc fb a8 96 Z!......r....... 00:20:33.921 00000020 5b 25 59 e8 df b2 e9 80 a7 28 3c b2 dd 3b 01 8a [%Y......(<..;.. 00:20:33.921 00000030 b5 5c c9 29 2a 13 43 f4 e5 8c 92 f2 ef eb b8 3b .\.)*.C........; 00:20:33.921 00000040 bf 77 21 33 f8 cd 6c fe a6 56 e1 6b 41 38 6b 92 .w!3..l..V.kA8k. 00:20:33.921 00000050 f8 4c 9f d0 25 e2 90 5e 89 45 b1 81 ce ac ec b8 .L..%..^.E...... 00:20:33.921 00000060 4a a4 4d 32 0b 19 9b d7 01 4c f5 de 73 0b 80 6d J.M2.....L..s..m 00:20:33.921 00000070 83 39 47 0d c1 98 04 84 06 29 bb 93 99 9f 3e fb .9G......)....>. 
00:20:33.921 00000080 75 eb 69 b6 9a 46 fe 13 74 86 48 d5 ab 71 cb a7 u.i..F..t.H..q.. 00:20:33.921 00000090 52 1a ab bc 57 27 5d 11 99 01 14 ad 90 92 b4 38 R...W']........8 00:20:33.921 000000a0 7e 31 8c f5 ee b7 a3 65 27 e1 b4 3d c8 4f 48 c9 ~1.....e'..=.OH. 00:20:33.921 000000b0 b8 e6 83 ea dc 5a 61 41 0d 24 24 d2 a6 8f cd 85 .....ZaA.$$..... 00:20:33.921 000000c0 ba 86 c4 a9 5a 6b 05 c0 9d 26 57 ec be ee 22 93 ....Zk...&W...". 00:20:33.921 000000d0 ff 27 b6 49 dc fd 14 4e ea 7a 2d 6d 93 2f d3 2e .'.I...N.z-m./.. 00:20:33.921 000000e0 de 12 06 4d d6 dd 16 f4 e8 5c f9 05 9b 85 3b bc ...M.....\....;. 00:20:33.921 000000f0 06 e5 00 e8 5e 56 65 4b b1 4d 1d b3 ea 10 85 2a ....^VeK.M.....* 00:20:33.921 00000100 49 3b f4 63 f9 60 54 d2 ce bc 4f ef 28 25 bc 33 I;.c.`T...O.(%.3 00:20:33.921 00000110 06 52 9f 8a be 12 9c 55 59 0f 92 30 02 79 67 91 .R.....UY..0.yg. 00:20:33.921 00000120 1f 89 f9 76 ac 15 68 79 c9 ed b2 0e ff ee 68 f9 ...v..hy......h. 00:20:33.921 00000130 17 d1 37 59 eb a3 d2 65 92 07 9d e7 ee db 99 a0 ..7Y...e........ 00:20:33.921 00000140 7a 8c ff 33 7a c8 bd 81 e4 4d 5b 99 62 6f 7d af z..3z....M[.bo}. 00:20:33.921 00000150 97 d8 99 c8 cf 3a 1f b9 90 d5 68 07 36 fe 9d fa .....:....h.6... 00:20:33.921 00000160 72 4e 4c e4 ba ba d7 a1 ae 93 88 41 a8 3f c9 03 rNL........A.?.. 00:20:33.921 00000170 83 26 f8 5e d2 58 7d 91 52 77 12 dc 69 46 98 9a .&.^.X}.Rw..iF.. 00:20:33.921 00000180 e1 3e 33 8b 1c 45 b4 15 60 b6 a9 ed 9b 79 f8 5c .>3..E..`....y.\ 00:20:33.921 00000190 5f 77 d9 d9 77 fc 0a d6 28 d1 1a 08 36 c3 73 8c _w..w...(...6.s. 00:20:33.921 000001a0 b7 0c c6 5b 4d 40 7b b1 0b 6c 2f da 15 64 85 b3 ...[M@{..l/..d.. 00:20:33.921 000001b0 e6 3d 6a ec de 94 b1 9e a4 a4 6e dc da 93 d9 f7 .=j.......n..... 00:20:33.921 000001c0 08 56 5d 4a b6 87 49 5b db 56 0d 51 b4 c9 1a c3 .V]J..I[.V.Q.... 00:20:33.921 000001d0 0d 22 15 8d e8 b4 48 89 4d 49 2c 83 b7 f3 5f 24 ."....H.MI,..._$ 00:20:33.921 000001e0 db 63 cb 60 c3 29 66 be e6 35 90 5b f9 76 5e a8 .c.`.)f..5.[.v^. 00:20:33.921 000001f0 63 3d fc 32 c5 f2 58 4a 05 f5 cd bc 7f 9d c4 83 c=.2..XJ........ 
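Each exchange in this trace dumps a ctrlr pubkey, a host pubkey, and the resulting dh secret; the dump length tracks the negotiated group (0x200 bytes for ffdhe4096 above, 0x300 bytes for ffdhe6144 in the exchanges that follow), and both sides must derive the same secret before the challenge/reply hashes can agree. The toy program below only illustrates that agreement with a small prime — the real groups are the RFC 7919 FFDHE groups, and the modexp helper is an assumption made for this demonstration, not the code under test.

/* Toy Diffie-Hellman agreement, for illustration only. Real DH-HMAC-CHAP
 * uses 512/768-byte ffdhe4096/ffdhe6144 values, matching the hexdump
 * sizes above; the 32-bit prime here is nothing like those groups. */
#include <stdint.h>
#include <stdio.h>

static uint64_t modexp(uint64_t base, uint64_t exp, uint64_t mod)
{
	uint64_t result = 1;

	base %= mod;
	while (exp > 0) {
		if (exp & 1)
			result = (result * base) % mod;
		base = (base * base) % mod;
		exp >>= 1;
	}
	return result;
}

int main(void)
{
	const uint64_t p = 4294967291ULL; /* toy prime (2^32 - 5) */
	const uint64_t g = 5;
	uint64_t host_priv = 123457, ctrlr_priv = 987643;

	uint64_t host_pub = modexp(g, host_priv, p);    /* "host pubkey"  */
	uint64_t ctrlr_pub = modexp(g, ctrlr_priv, p);  /* "ctrlr pubkey" */

	/* Both sides compute the same "dh secret". */
	uint64_t secret_host = modexp(ctrlr_pub, host_priv, p);
	uint64_t secret_ctrlr = modexp(host_pub, ctrlr_priv, p);

	printf("secrets match: %s\n",
	       secret_host == secret_ctrlr ? "yes" : "no");
	return 0;
}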
00:20:33.921 [2024-09-27 15:25:18.939512] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=2, dhgroup=3, seq=3428451774, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.921 [2024-09-27 15:25:18.939575] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.921 [2024-09-27 15:25:18.976383] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.921 [2024-09-27 15:25:18.976411] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.921 [2024-09-27 15:25:18.976417] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.921 [2024-09-27 15:25:19.154393] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.921 [2024-09-27 15:25:19.154414] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.921 [2024-09-27 15:25:19.154422] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.921 [2024-09-27 15:25:19.154467] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.921 [2024-09-27 15:25:19.154490] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.921 ctrlr pubkey: 00:20:33.921 00000000 50 4c 64 05 aa 47 b2 4f 71 9e b7 29 4b d4 f1 17 PLd..G.Oq..)K... 00:20:33.921 00000010 49 c9 2c 67 26 3f 72 7d a7 d1 27 bf c7 cc ed f6 I.,g&?r}..'..... 00:20:33.921 00000020 ce 5c ee 20 c6 00 a1 db 1f 31 d6 a4 5a 7f b3 3f .\. .....1..Z..? 00:20:33.921 00000030 75 be cb c9 fe 13 6e 7c 0e d6 23 62 34 54 c0 6f u.....n|..#b4T.o 00:20:33.921 00000040 07 15 64 6e be 40 75 0e cd d7 d3 23 b0 47 65 93 ..dn.@u....#.Ge. 00:20:33.921 00000050 dc fc 42 a5 b5 1f 1a b3 a1 b6 8d 34 11 42 ec 63 ..B........4.B.c 00:20:33.921 00000060 5a 4e f5 46 31 dc 86 6a a1 37 9c 0c dc bd b8 80 ZN.F1..j.7...... 00:20:33.921 00000070 5f ed f4 9c c4 c7 e3 f6 12 08 c7 0d 84 5a 23 25 _............Z#% 00:20:33.921 00000080 72 c1 c6 48 6d b8 b4 8b ba 38 fe 0f f2 da 8c ec r..Hm....8...... 00:20:33.921 00000090 80 f7 2e 45 71 a1 45 2e 75 73 32 22 8a a6 bc e9 ...Eq.E.us2".... 00:20:33.921 000000a0 36 0e 55 43 c5 8a 61 ab 2b 54 96 40 1f e4 52 ef 6.UC..a.+T.@..R. 00:20:33.921 000000b0 de c5 a4 d8 38 2b bc 51 10 25 aa ba fc 73 31 76 ....8+.Q.%...s1v 00:20:33.921 000000c0 cd 8a ab 9b 04 f5 4e 32 9b e2 79 91 f9 4b 37 01 ......N2..y..K7. 00:20:33.921 000000d0 36 69 ae f0 43 b7 b1 5c 27 ab e0 d4 0d 0e 2a 9e 6i..C..\'.....*. 00:20:33.921 000000e0 24 85 6e 1d 0b cf 39 08 87 f6 42 58 d0 13 ff d1 $.n...9...BX.... 00:20:33.921 000000f0 26 dd af 0a 60 1a 89 6d eb a4 39 f2 b8 9a 7b 93 &...`..m..9...{. 00:20:33.921 00000100 67 50 9d 18 1f 07 55 8b a8 9f ec 73 f2 6f 84 4b gP....U....s.o.K 00:20:33.921 00000110 b2 4e 43 1e 55 6b b9 96 cf dd ac a9 99 27 da 28 .NC.Uk.......'.( 00:20:33.921 00000120 58 03 e0 67 8b 94 dd 39 6c 22 fb ca 91 ae d2 f9 X..g...9l"...... 
00:20:33.921 00000130 e8 32 51 6d 36 6b 29 89 24 9b 51 02 be 61 7c 53 .2Qm6k).$.Q..a|S 00:20:33.921 00000140 1f cc 52 9a 86 8c 6b 74 4d 3d d9 55 a3 a4 8e b7 ..R...ktM=.U.... 00:20:33.921 00000150 f3 0b 0f 96 97 5f cd 39 e2 4c ec b7 30 90 ba 8c ....._.9.L..0... 00:20:33.921 00000160 ad 7d 70 b9 63 d4 37 4b ff 9b d5 82 e1 81 c2 f8 .}p.c.7K........ 00:20:33.921 00000170 76 f6 9c cd 8e e8 f5 5a 37 46 13 83 70 14 fb 08 v......Z7F..p... 00:20:33.921 00000180 ce a3 32 a8 bd 6a 6d e3 8a 6a 55 d4 c5 e2 1d 5f ..2..jm..jU...._ 00:20:33.921 00000190 31 ea 30 e1 d1 92 66 50 b5 60 ab 50 14 33 9b 98 1.0...fP.`.P.3.. 00:20:33.921 000001a0 e1 a6 5b c2 20 79 7d c0 c2 2c 22 1c ef c8 03 a6 ..[. y}..,"..... 00:20:33.921 000001b0 91 98 13 34 22 db 98 30 c1 4c be 8f 78 82 3d d7 ...4"..0.L..x.=. 00:20:33.921 000001c0 ca de 8a 71 ab 64 da eb d9 6f 04 f1 c0 74 ee c3 ...q.d...o...t.. 00:20:33.921 000001d0 e1 9d 11 cd d4 60 e6 5f 08 fe d0 ff 8a 93 b0 49 .....`._.......I 00:20:33.921 000001e0 b9 bf 34 50 b9 eb 5d f9 df 55 88 8e a7 71 98 c5 ..4P..]..U...q.. 00:20:33.921 000001f0 60 6c 1e 0f b9 0b 2a 52 44 92 95 84 5e 7a 7a 52 `l....*RD...^zzR 00:20:33.921 00000200 fa b8 e9 38 97 c6 a6 42 19 d3 87 d6 c9 44 58 f9 ...8...B.....DX. 00:20:33.921 00000210 6f c9 b9 d2 1a ba ad dc bd 45 f5 e9 8c 74 9d db o........E...t.. 00:20:33.921 00000220 8d 6f 82 6a 34 28 ab 45 40 99 84 71 16 29 a8 e0 .o.j4(.E@..q.).. 00:20:33.921 00000230 cc b8 90 84 71 11 1d 5f 4f 67 fa 16 e6 53 6e d7 ....q.._Og...Sn. 00:20:33.921 00000240 2e 1e 4b 98 bd c6 cd 1f ed 2a 9a 54 29 46 15 f8 ..K......*.T)F.. 00:20:33.921 00000250 64 83 c5 80 43 2b 7a e3 4e 02 81 8c 56 d7 13 42 d...C+z.N...V..B 00:20:33.921 00000260 28 c4 f4 a1 25 04 36 39 2c 7b 58 5b 79 0e 3f 7f (...%.69,{X[y.?. 00:20:33.921 00000270 df 98 4c 5f c8 3d 78 b2 ae e4 75 02 66 30 bc b7 ..L_.=x...u.f0.. 00:20:33.921 00000280 da df 75 13 de f9 c2 d1 06 c3 cc b0 49 17 bb 17 ..u.........I... 00:20:33.921 00000290 61 0a 86 f6 c0 ac b6 aa af 09 63 75 6d 25 b5 b9 a.........cum%.. 00:20:33.921 000002a0 87 fd e1 05 25 bc c4 2e fd f6 d6 f5 a5 84 ba e0 ....%........... 00:20:33.921 000002b0 54 14 74 e6 b3 ba 67 00 9e 5c 46 06 a1 bc 6b bb T.t...g..\F...k. 00:20:33.922 000002c0 dd dd 50 c9 2c ef 82 cf be 64 cf c3 66 37 d4 db ..P.,....d..f7.. 00:20:33.922 000002d0 c8 8d 46 4c a6 5b d1 08 06 78 d7 11 15 4b 74 f1 ..FL.[...x...Kt. 00:20:33.922 000002e0 a4 57 43 48 61 84 ea d4 7f 54 01 94 da a4 83 e2 .WCHa....T...... 00:20:33.922 000002f0 fc 85 4f 46 45 1e 7a 78 c4 93 61 44 a9 60 f8 ba ..OFE.zx..aD.`.. 00:20:33.922 host pubkey: 00:20:33.922 00000000 2d 7e d6 03 27 4a 1f f7 f4 19 56 f5 c5 a5 3b a9 -~..'J....V...;. 00:20:33.922 00000010 ad 48 55 13 28 c4 a9 6f 79 d3 be ab c0 91 31 c6 .HU.(..oy.....1. 00:20:33.922 00000020 ae 7b be 0e b9 8b 31 18 00 72 9c 36 e2 cf 6e 02 .{....1..r.6..n. 00:20:33.922 00000030 18 56 26 24 0a 45 83 76 1b 16 a5 26 58 2c ae 39 .V&$.E.v...&X,.9 00:20:33.922 00000040 f9 d6 b8 a4 a0 74 22 f7 17 58 b6 f4 0b 67 2d 39 .....t"..X...g-9 00:20:33.922 00000050 c4 9a e2 78 2b 35 a6 23 23 68 92 58 b2 75 99 1f ...x+5.##h.X.u.. 00:20:33.922 00000060 92 73 5e c5 fd bf 3b 94 a6 84 d6 26 62 eb 33 d8 .s^...;....&b.3. 00:20:33.922 00000070 fd a0 e7 2f 1d 01 72 44 4a f2 e3 83 72 56 f5 78 .../..rDJ...rV.x 00:20:33.922 00000080 ff 74 0b 4f 46 87 60 ec 77 9e 0b fe b7 fe 9a a0 .t.OF.`.w....... 00:20:33.922 00000090 92 cc 02 ca 7c 90 df 2a e9 45 e7 fb 9b 17 fd 7c ....|..*.E.....| 00:20:33.922 000000a0 3d fa d5 6d ee 1e 11 08 dc 87 b8 13 2c e9 11 af =..m........,... 
00:20:33.922 000000b0 ee e2 c9 83 0b 95 91 25 67 eb 3b 02 34 b9 61 be .......%g.;.4.a. 00:20:33.922 000000c0 02 90 94 18 11 4f 8b ad 83 4a db 5b ff 20 d5 9e .....O...J.[. .. 00:20:33.922 000000d0 a3 a2 b9 35 01 99 a5 25 2c 58 fe c2 5f 22 c8 1b ...5...%,X.._".. 00:20:33.922 000000e0 74 a1 2b e3 cb 07 68 4a e1 53 38 d4 cf 67 a0 ad t.+...hJ.S8..g.. 00:20:33.922 000000f0 02 ee 2a de 65 d6 02 86 3e 4a b1 ae 69 06 22 4e ..*.e...>J..i."N 00:20:33.922 00000100 7a 99 43 a8 b8 84 04 5e b3 91 21 64 78 79 86 9b z.C....^..!dxy.. 00:20:33.922 00000110 3a 30 6a 7b 54 9c e0 a6 01 39 c4 60 47 fb c1 22 :0j{T....9.`G.." 00:20:33.922 00000120 fa ca ec 0f 42 6a 4d 51 e2 e4 e4 10 d1 3d a3 60 ....BjMQ.....=.` 00:20:33.922 00000130 7a 7c f7 5a 59 e7 73 a4 d7 c0 31 62 06 7b 06 27 z|.ZY.s...1b.{.' 00:20:33.922 00000140 76 a9 f0 a9 a3 b4 7b 99 6a 33 c3 cf 6f 36 f9 88 v.....{.j3..o6.. 00:20:33.922 00000150 9b c5 9d 9e 08 6a 49 ae 44 b5 7a 12 31 d7 ab 09 .....jI.D.z.1... 00:20:33.922 00000160 a9 ed 5b 5e 09 8e 86 d6 7e 6c 8f 9c 6d 75 09 6d ..[^....~l..mu.m 00:20:33.922 00000170 1a 37 00 42 7c e5 d9 e9 f8 41 b4 fd f6 d3 ee 9f .7.B|....A...... 00:20:33.922 00000180 34 72 3b 18 ef e7 16 7b 00 cc bc 0d 5d 5f c5 10 4r;....{....]_.. 00:20:33.922 00000190 e0 81 88 94 fc c2 ac be 0c 2b 93 43 1a ed 51 e8 .........+.C..Q. 00:20:33.922 000001a0 b0 31 a1 40 ad 6f b6 17 e2 69 7f 93 81 ca 5c 4a .1.@.o...i....\J 00:20:33.922 000001b0 b0 36 c7 93 a9 c7 85 da c2 c5 d9 d2 bc 7a cb 5d .6...........z.] 00:20:33.922 000001c0 f2 df 5c 28 f6 43 72 30 ad be 83 4a 0b db c3 13 ..\(.Cr0...J.... 00:20:33.922 000001d0 b8 6c 18 1f dd 8d 7f 04 30 28 e4 37 41 76 98 6b .l......0(.7Av.k 00:20:33.922 000001e0 1d 2b 64 93 28 93 ab 26 3e 7a 9b 4e 2d 21 19 b5 .+d.(..&>z.N-!.. 00:20:33.922 000001f0 80 c4 ca b0 33 4d df d3 d9 8d 3b 0d 3d 19 a5 51 ....3M....;.=..Q 00:20:33.922 00000200 38 74 52 27 5c f6 39 f5 73 22 fd b8 96 76 a3 67 8tR'\.9.s"...v.g 00:20:33.922 00000210 4d 9d 52 b0 a0 21 12 a4 61 2c f1 a9 d7 5a 1d 29 M.R..!..a,...Z.) 00:20:33.922 00000220 24 52 48 a9 a2 c4 dd 82 c1 29 17 99 8a 2e e3 be $RH......)...... 00:20:33.922 00000230 b9 e0 a5 f8 b3 78 98 8a 50 fb 0a ad 49 18 cd 18 .....x..P...I... 00:20:33.922 00000240 54 0e c0 90 83 be e8 2e 03 d2 49 06 cc d7 02 7d T.........I....} 00:20:33.922 00000250 80 d1 4e 4d 80 d1 fc 4d fb ff 08 29 92 02 dd a8 ..NM...M...).... 00:20:33.922 00000260 f6 de f9 8e 63 70 ff de f0 b8 27 b6 12 52 39 68 ....cp....'..R9h 00:20:33.922 00000270 96 4f b8 2f 27 55 62 b2 d7 fd 02 80 30 87 21 a2 .O./'Ub.....0.!. 00:20:33.922 00000280 06 9f da 33 b4 52 3f 32 cb 58 51 bd 83 43 aa e0 ...3.R?2.XQ..C.. 00:20:33.922 00000290 07 71 be 7c 62 89 87 96 45 04 42 ed 0a b6 05 c7 .q.|b...E.B..... 00:20:33.922 000002a0 b4 d9 16 07 57 0d 53 b4 89 59 a9 90 aa 00 53 5e ....W.S..Y....S^ 00:20:33.922 000002b0 6c 5c f1 ac 7c 7c f7 61 cf 0f 39 83 60 c3 2a cf l\..||.a..9.`.*. 00:20:33.922 000002c0 e4 fd 22 fd 70 71 1e 74 9c 2f 28 85 f4 c6 8d 28 ..".pq.t./(....( 00:20:33.922 000002d0 50 10 ca 21 5c e5 8f 5b 71 23 63 9a 6e 7e c3 41 P..!\..[q#c.n~.A 00:20:33.922 000002e0 3a fa 7e a7 7f ab 4c f6 c0 62 4c 5c e6 b4 e4 f6 :.~...L..bL\.... 00:20:33.922 000002f0 11 77 a4 93 c5 90 7a d3 71 1c 5e 7e 4e a8 46 11 .w....z.q.^~N.F. 00:20:33.922 dh secret: 00:20:33.922 00000000 52 1b 89 5e b4 c9 f3 e5 56 66 01 5e 4d c6 f6 73 R..^....Vf.^M..s 00:20:33.922 00000010 5a df da c0 74 87 08 9e a0 4d 73 b2 2b 8b af 4b Z...t....Ms.+..K 00:20:33.922 00000020 c7 fa 70 9e 94 63 95 70 af d3 7c c2 09 b5 b8 97 ..p..c.p..|..... 
00:20:33.922 00000030 6b 1d 91 aa b1 02 26 aa 9c 1a 3f d2 53 46 5e 12 k.....&...?.SF^. 00:20:33.922 00000040 04 92 d2 71 ad 51 a4 3b 5c db d6 e6 05 5b b7 ef ...q.Q.;\....[.. 00:20:33.922 00000050 f5 85 e9 76 f0 19 cf 59 09 99 aa 2f c6 33 b1 96 ...v...Y.../.3.. 00:20:33.922 00000060 e1 f6 4a e6 83 d7 d0 ed 57 5d fa 3a f7 63 7a c1 ..J.....W].:.cz. 00:20:33.922 00000070 c7 8a 8a 16 f0 25 c4 ac ed b4 78 31 e2 73 3e 69 .....%....x1.s>i 00:20:33.922 00000080 a2 72 4a f9 c5 22 26 48 6e 3f 10 69 32 a3 fd 33 .rJ.."&Hn?.i2..3 00:20:33.922 00000090 c4 c3 2c d6 a2 a1 44 42 72 5a 99 bb 20 2e 85 92 ..,...DBrZ.. ... 00:20:33.922 000000a0 23 74 d0 84 3c f8 57 02 47 24 aa 9a f6 d1 7d 91 #t..<.W.G$....}. 00:20:33.922 000000b0 0b 6f 88 e0 4f c3 51 5b 6b 6a 45 db 63 a0 9b e6 .o..O.Q[kjE.c... 00:20:33.922 000000c0 6d c8 11 fd 35 83 78 f3 3f 85 b5 78 74 79 1f 03 m...5.x.?..xty.. 00:20:33.922 000000d0 ed 84 56 b0 f4 c1 fb d8 28 41 bb 54 c6 ae 7a c2 ..V.....(A.T..z. 00:20:33.922 000000e0 ca b2 fd 8d ca 02 ca 36 9c 38 4a 74 c0 50 b9 47 .......6.8Jt.P.G 00:20:33.922 000000f0 cf 4c bf 82 18 b0 38 1a 37 44 51 e5 31 63 0b f5 .L....8.7DQ.1c.. 00:20:33.922 00000100 ca 08 a5 49 d6 40 cd c4 58 dc e3 6f dd 92 2a 6b ...I.@..X..o..*k 00:20:33.922 00000110 d3 03 7b 03 6a 7d 43 18 5e ff c5 f6 ee 42 58 c0 ..{.j}C.^....BX. 00:20:33.922 00000120 22 4a 70 9a 43 27 8b 8e 82 d5 7d 87 76 29 2a 5b "Jp.C'....}.v)*[ 00:20:33.922 00000130 95 4d e1 74 c9 02 5d 29 b0 5f 96 7d 2c 53 61 fe .M.t..])._.},Sa. 00:20:33.922 00000140 c7 18 a3 d6 49 1e 5f 3c 3a 89 e2 27 c2 27 98 a3 ....I._<:..'.'.. 00:20:33.922 00000150 9c 61 47 0e ca 5c 6b f7 0f 87 49 d3 46 30 90 38 .aG..\k...I.F0.8 00:20:33.922 00000160 d6 62 f2 b4 91 8d 2b 5f d1 d9 84 17 9a b1 4f 76 .b....+_......Ov 00:20:33.922 00000170 b2 7e 19 58 59 f2 d5 05 55 6a a9 3d e0 53 a1 f5 .~.XY...Uj.=.S.. 00:20:33.922 00000180 2f a9 d0 d6 e1 3c 8a 0c 0a e5 37 2f ba 12 15 14 /....<....7/.... 00:20:33.922 00000190 6d ca 4f ab c8 ed e0 ae 66 12 b9 98 d2 d7 84 44 m.O.....f......D 00:20:33.922 000001a0 21 47 10 d8 d4 ed 2d dc 9c 3f af 74 51 52 d4 74 !G....-..?.tQR.t 00:20:33.922 000001b0 be 79 4f af 51 a9 dc ab db e6 fe 69 ef 9b fe 73 .yO.Q......i...s 00:20:33.922 000001c0 73 d1 d3 40 0b b1 bc 22 c1 fb de 93 8c c9 19 66 s..@...".......f 00:20:33.922 000001d0 40 dd 7e de 43 f6 90 0c c6 ba 8b a1 99 32 98 25 @.~.C........2.% 00:20:33.922 000001e0 ed 53 cc b8 25 b5 2a 61 22 53 77 5b 09 38 7c 64 .S..%.*a"Sw[.8|d 00:20:33.922 000001f0 ae 59 3b 0e 27 0b f0 1c 94 3c 67 e8 3c 4b db 36 .Y;.'.....8.....=. 00:20:33.922 00000220 b6 ef 28 5c 9c 2f 0a 29 99 a0 05 a3 aa 9a a2 5c ..(\./.).......\ 00:20:33.922 00000230 60 b8 0e 42 0f 4f d6 c0 eb 9b fa 49 4d de 55 eb `..B.O.....IM.U. 00:20:33.922 00000240 cc da 76 1a cc 0e d0 b5 20 c0 7b e2 01 a7 d7 a4 ..v..... .{..... 00:20:33.922 00000250 d0 66 9a 02 f9 45 f7 a7 c0 ce 5b 66 ce 59 ba c5 .f...E....[f.Y.. 00:20:33.922 00000260 07 d1 ce f0 3b dd c2 38 3c d9 cd 31 81 5c 5b a6 ....;..8<..1.\[. 00:20:33.922 00000270 53 af f9 c2 28 74 e3 1f 13 02 f8 b1 a1 33 d4 e8 S...(t.......3.. 00:20:33.922 00000280 cf 85 e0 e0 81 31 46 4d 82 94 bf 28 6b 75 93 6a .....1FM...(ku.j 00:20:33.922 00000290 a6 7c e2 c5 0d 48 0c e0 29 6f ef c6 94 bf 93 7b .|...H..)o.....{ 00:20:33.922 000002a0 c9 de 9a e3 8d 63 3d 76 0a 9a 40 46 95 eb ab c0 .....c=v..@F.... 00:20:33.922 000002b0 b8 31 86 86 65 fa 13 9a 29 89 2b 23 85 ba d6 47 .1..e...).+#...G 00:20:33.922 000002c0 d8 b0 17 e2 e9 34 c9 a7 20 64 5d a7 a1 e6 28 33 .....4.. 
d]...(3 00:20:33.922 000002d0 72 34 21 14 5a 97 3e fa 0d 98 ea 73 69 59 b1 4d r4!.Z.>....siY.M 00:20:33.922 000002e0 34 df 34 f5 a5 ff 25 8c e4 e6 e6 a9 30 8f a3 93 4.4...%.....0... 00:20:33.922 000002f0 98 19 a8 17 ac 03 ca 20 37 fd bf ca a6 86 8b 58 ....... 7......X 00:20:33.922 [2024-09-27 15:25:19.202703] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=2, dhgroup=4, seq=3428451775, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.922 [2024-09-27 15:25:19.238582] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.922 [2024-09-27 15:25:19.238627] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.922 [2024-09-27 15:25:19.238644] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.922 [2024-09-27 15:25:19.238664] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.922 [2024-09-27 15:25:19.238678] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.922 [2024-09-27 15:25:19.344899] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.922 [2024-09-27 15:25:19.344918] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.922 [2024-09-27 15:25:19.344925] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.922 [2024-09-27 15:25:19.344935] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.922 [2024-09-27 15:25:19.344992] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.922 ctrlr pubkey: 00:20:33.922 00000000 50 4c 64 05 aa 47 b2 4f 71 9e b7 29 4b d4 f1 17 PLd..G.Oq..)K... 00:20:33.922 00000010 49 c9 2c 67 26 3f 72 7d a7 d1 27 bf c7 cc ed f6 I.,g&?r}..'..... 00:20:33.922 00000020 ce 5c ee 20 c6 00 a1 db 1f 31 d6 a4 5a 7f b3 3f .\. .....1..Z..? 00:20:33.922 00000030 75 be cb c9 fe 13 6e 7c 0e d6 23 62 34 54 c0 6f u.....n|..#b4T.o 00:20:33.922 00000040 07 15 64 6e be 40 75 0e cd d7 d3 23 b0 47 65 93 ..dn.@u....#.Ge. 00:20:33.922 00000050 dc fc 42 a5 b5 1f 1a b3 a1 b6 8d 34 11 42 ec 63 ..B........4.B.c 00:20:33.922 00000060 5a 4e f5 46 31 dc 86 6a a1 37 9c 0c dc bd b8 80 ZN.F1..j.7...... 00:20:33.922 00000070 5f ed f4 9c c4 c7 e3 f6 12 08 c7 0d 84 5a 23 25 _............Z#% 00:20:33.922 00000080 72 c1 c6 48 6d b8 b4 8b ba 38 fe 0f f2 da 8c ec r..Hm....8...... 00:20:33.922 00000090 80 f7 2e 45 71 a1 45 2e 75 73 32 22 8a a6 bc e9 ...Eq.E.us2".... 00:20:33.923 000000a0 36 0e 55 43 c5 8a 61 ab 2b 54 96 40 1f e4 52 ef 6.UC..a.+T.@..R. 00:20:33.923 000000b0 de c5 a4 d8 38 2b bc 51 10 25 aa ba fc 73 31 76 ....8+.Q.%...s1v 00:20:33.923 000000c0 cd 8a ab 9b 04 f5 4e 32 9b e2 79 91 f9 4b 37 01 ......N2..y..K7. 00:20:33.923 000000d0 36 69 ae f0 43 b7 b1 5c 27 ab e0 d4 0d 0e 2a 9e 6i..C..\'.....*. 
00:20:33.923 000000e0 24 85 6e 1d 0b cf 39 08 87 f6 42 58 d0 13 ff d1 $.n...9...BX.... 00:20:33.923 000000f0 26 dd af 0a 60 1a 89 6d eb a4 39 f2 b8 9a 7b 93 &...`..m..9...{. 00:20:33.923 00000100 67 50 9d 18 1f 07 55 8b a8 9f ec 73 f2 6f 84 4b gP....U....s.o.K 00:20:33.923 00000110 b2 4e 43 1e 55 6b b9 96 cf dd ac a9 99 27 da 28 .NC.Uk.......'.( 00:20:33.923 00000120 58 03 e0 67 8b 94 dd 39 6c 22 fb ca 91 ae d2 f9 X..g...9l"...... 00:20:33.923 00000130 e8 32 51 6d 36 6b 29 89 24 9b 51 02 be 61 7c 53 .2Qm6k).$.Q..a|S 00:20:33.923 00000140 1f cc 52 9a 86 8c 6b 74 4d 3d d9 55 a3 a4 8e b7 ..R...ktM=.U.... 00:20:33.923 00000150 f3 0b 0f 96 97 5f cd 39 e2 4c ec b7 30 90 ba 8c ....._.9.L..0... 00:20:33.923 00000160 ad 7d 70 b9 63 d4 37 4b ff 9b d5 82 e1 81 c2 f8 .}p.c.7K........ 00:20:33.923 00000170 76 f6 9c cd 8e e8 f5 5a 37 46 13 83 70 14 fb 08 v......Z7F..p... 00:20:33.923 00000180 ce a3 32 a8 bd 6a 6d e3 8a 6a 55 d4 c5 e2 1d 5f ..2..jm..jU...._ 00:20:33.923 00000190 31 ea 30 e1 d1 92 66 50 b5 60 ab 50 14 33 9b 98 1.0...fP.`.P.3.. 00:20:33.923 000001a0 e1 a6 5b c2 20 79 7d c0 c2 2c 22 1c ef c8 03 a6 ..[. y}..,"..... 00:20:33.923 000001b0 91 98 13 34 22 db 98 30 c1 4c be 8f 78 82 3d d7 ...4"..0.L..x.=. 00:20:33.923 000001c0 ca de 8a 71 ab 64 da eb d9 6f 04 f1 c0 74 ee c3 ...q.d...o...t.. 00:20:33.923 000001d0 e1 9d 11 cd d4 60 e6 5f 08 fe d0 ff 8a 93 b0 49 .....`._.......I 00:20:33.923 000001e0 b9 bf 34 50 b9 eb 5d f9 df 55 88 8e a7 71 98 c5 ..4P..]..U...q.. 00:20:33.923 000001f0 60 6c 1e 0f b9 0b 2a 52 44 92 95 84 5e 7a 7a 52 `l....*RD...^zzR 00:20:33.923 00000200 fa b8 e9 38 97 c6 a6 42 19 d3 87 d6 c9 44 58 f9 ...8...B.....DX. 00:20:33.923 00000210 6f c9 b9 d2 1a ba ad dc bd 45 f5 e9 8c 74 9d db o........E...t.. 00:20:33.923 00000220 8d 6f 82 6a 34 28 ab 45 40 99 84 71 16 29 a8 e0 .o.j4(.E@..q.).. 00:20:33.923 00000230 cc b8 90 84 71 11 1d 5f 4f 67 fa 16 e6 53 6e d7 ....q.._Og...Sn. 00:20:33.923 00000240 2e 1e 4b 98 bd c6 cd 1f ed 2a 9a 54 29 46 15 f8 ..K......*.T)F.. 00:20:33.923 00000250 64 83 c5 80 43 2b 7a e3 4e 02 81 8c 56 d7 13 42 d...C+z.N...V..B 00:20:33.923 00000260 28 c4 f4 a1 25 04 36 39 2c 7b 58 5b 79 0e 3f 7f (...%.69,{X[y.?. 00:20:33.923 00000270 df 98 4c 5f c8 3d 78 b2 ae e4 75 02 66 30 bc b7 ..L_.=x...u.f0.. 00:20:33.923 00000280 da df 75 13 de f9 c2 d1 06 c3 cc b0 49 17 bb 17 ..u.........I... 00:20:33.923 00000290 61 0a 86 f6 c0 ac b6 aa af 09 63 75 6d 25 b5 b9 a.........cum%.. 00:20:33.923 000002a0 87 fd e1 05 25 bc c4 2e fd f6 d6 f5 a5 84 ba e0 ....%........... 00:20:33.923 000002b0 54 14 74 e6 b3 ba 67 00 9e 5c 46 06 a1 bc 6b bb T.t...g..\F...k. 00:20:33.923 000002c0 dd dd 50 c9 2c ef 82 cf be 64 cf c3 66 37 d4 db ..P.,....d..f7.. 00:20:33.923 000002d0 c8 8d 46 4c a6 5b d1 08 06 78 d7 11 15 4b 74 f1 ..FL.[...x...Kt. 00:20:33.923 000002e0 a4 57 43 48 61 84 ea d4 7f 54 01 94 da a4 83 e2 .WCHa....T...... 00:20:33.923 000002f0 fc 85 4f 46 45 1e 7a 78 c4 93 61 44 a9 60 f8 ba ..OFE.zx..aD.`.. 00:20:33.923 host pubkey: 00:20:33.923 00000000 a9 45 1d 79 b0 1f 07 f0 53 b5 56 5f c7 64 50 51 .E.y....S.V_.dPQ 00:20:33.923 00000010 a4 c3 50 4d fd 74 39 5a ee 14 33 f2 00 2b 45 08 ..PM.t9Z..3..+E. 00:20:33.923 00000020 12 b3 86 d7 27 a5 ea 1d f8 2f 36 ed aa 24 a0 0e ....'..../6..$.. 00:20:33.923 00000030 fc 11 05 83 0b 97 73 e6 9c d0 4c 85 4f ba 55 c8 ......s...L.O.U. 
00:20:33.923 00000040 5b 3d 14 c0 76 dd 77 d8 8c cb ec 1b 68 00 3e 36 [=..v.w.....h.>6 00:20:33.923 00000050 8a 9c be a7 db e9 d9 be 60 e7 32 2c 9f bb d6 54 ........`.2,...T 00:20:33.923 00000060 be 95 d1 96 d5 d8 ab bc b3 68 ed e9 59 a2 5d 47 .........h..Y.]G 00:20:33.923 00000070 2b ab 69 db 52 18 42 3b ed 30 e3 d3 d0 71 c3 af +.i.R.B;.0...q.. 00:20:33.923 00000080 13 3c 5e d9 64 48 82 f3 20 5d 73 2d 53 d0 4f 8b .<^.dH.. ]s-S.O. 00:20:33.923 00000090 9b dc 1c 5a 7b 9f 4e 0a f7 4e a8 0a 4c 57 56 9f ...Z{.N..N..LWV. 00:20:33.923 000000a0 ab 4f da 85 47 26 c1 51 e8 49 0b b8 07 7d 97 8a .O..G&.Q.I...}.. 00:20:33.923 000000b0 0b 82 29 b8 39 e3 67 b4 d5 24 69 03 37 67 d7 da ..).9.g..$i.7g.. 00:20:33.923 000000c0 ff 50 3b 03 57 c0 78 f1 7c 9b 6e a1 fb 83 8f af .P;.W.x.|.n..... 00:20:33.923 000000d0 09 b1 c7 3a 37 02 20 0e c4 92 ec 84 dd b2 b8 ff ...:7. ......... 00:20:33.923 000000e0 f7 9d e6 7d a7 b8 b7 94 6e ea 7f fe 13 e6 5b 17 ...}....n.....[. 00:20:33.923 000000f0 97 15 b6 2c 0e fe f3 35 94 13 aa 19 e4 7e 30 0a ...,...5.....~0. 00:20:33.923 00000100 4c 45 5e 42 83 67 bd 9c 80 c2 10 8e 36 a6 34 7e LE^B.g......6.4~ 00:20:33.923 00000110 24 0d be 87 a0 ce 1a e7 dd e3 d6 0d 4b 4b 0c 16 $...........KK.. 00:20:33.923 00000120 fb 3f 06 79 0b ba 06 04 5e 5c 3f c0 f1 6f ad d7 .?.y....^\?..o.. 00:20:33.923 00000130 d7 d7 f5 d5 1e 8c 99 50 c3 7e 0d 66 23 63 cf d3 .......P.~.f#c.. 00:20:33.923 00000140 98 8b e6 f7 90 39 63 d6 ab a9 0e 72 d7 6c 3b bc .....9c....r.l;. 00:20:33.923 00000150 11 28 3f af c0 90 6e a0 d6 b3 a1 fc d3 28 c2 b5 .(?...n......(.. 00:20:33.923 00000160 58 39 e6 17 46 6f e2 05 81 b2 63 69 d1 96 77 1b X9..Fo....ci..w. 00:20:33.923 00000170 e7 bf 4b 71 26 ad 5e 8f ba 51 09 69 c5 22 01 fa ..Kq&.^..Q.i.".. 00:20:33.923 00000180 ec 5f 05 62 13 b8 47 d8 96 9c 27 33 e2 7b 30 35 ._.b..G...'3.{05 00:20:33.923 00000190 bf 3e 13 5a 7d 6e ea b3 c8 64 5e b2 d4 f0 ba 31 .>.Z}n...d^....1 00:20:33.923 000001a0 84 62 5d dd 59 79 ac 40 42 1f 95 22 ee c5 75 c0 .b].Yy.@B.."..u. 00:20:33.923 000001b0 ad 4e 90 e4 8c aa 66 62 96 3f 36 81 89 ce b0 4c .N....fb.?6....L 00:20:33.923 000001c0 65 71 2c f0 31 0e e4 3a 0d ad b3 7b 1c 3f bc 97 eq,.1..:...{.?.. 00:20:33.923 000001d0 c4 31 80 11 09 4a f9 97 dd 2f 73 49 26 db 56 4e .1...J.../sI&.VN 00:20:33.923 000001e0 61 54 08 6e 44 38 a2 ac 76 01 b3 b4 b2 b9 6d a8 aT.nD8..v.....m. 00:20:33.923 000001f0 07 05 23 4f 16 99 47 43 67 9b a4 2a f5 3b 73 40 ..#O..GCg..*.;s@ 00:20:33.923 00000200 87 06 d6 ea 7e 7c 2f 41 09 99 f3 4a 27 42 89 93 ....~|/A...J'B.. 00:20:33.923 00000210 60 bb 60 e7 96 18 c7 9c b1 96 00 e5 12 95 71 d7 `.`...........q. 00:20:33.923 00000220 a7 7d 7f ee 68 d9 ef ac ee 13 20 d0 a6 93 01 18 .}..h..... ..... 00:20:33.923 00000230 7c 87 55 94 00 28 fc 08 1a e5 4c 1b 33 9b 11 8f |.U..(....L.3... 00:20:33.923 00000240 b1 c6 b2 bf 47 99 16 23 1e a8 ec 23 25 f0 03 27 ....G..#...#%..' 00:20:33.923 00000250 6f f6 f4 75 97 43 be bb 7e a1 a0 fb 85 e6 2a a3 o..u.C..~.....*. 00:20:33.923 00000260 cf b0 13 04 09 6a 8b 6d 33 b3 73 38 70 c7 56 71 .....j.m3.s8p.Vq 00:20:33.923 00000270 c7 f0 ea 9b cf f2 37 1e d2 58 b6 be 7b 36 12 a8 ......7..X..{6.. 00:20:33.923 00000280 fa 92 69 d6 8d 4a b3 61 a7 69 48 35 e3 f1 55 26 ..i..J.a.iH5..U& 00:20:33.923 00000290 22 a9 03 8e a7 24 87 87 e9 23 43 78 1c 76 08 ff "....$...#Cx.v.. 00:20:33.923 000002a0 7f 4e ed 2e 13 42 8d 6c 9e 39 eb ec 1a 62 f9 d2 .N...B.l.9...b.. 
00:20:33.923 000002b0 84 68 d8 69 99 a8 f6 92 ab 5c 4b 64 74 16 de 2c .h.i.....\Kdt.., 00:20:33.923 000002c0 fa e6 98 24 ad 37 25 02 63 43 76 d3 fa b3 2b 94 ...$.7%.cCv...+. 00:20:33.923 000002d0 9d 89 36 ea f7 73 fd ad 57 9c bb a0 80 fe 21 fb ..6..s..W.....!. 00:20:33.923 000002e0 c8 54 bd 46 25 7b bf ca b1 83 dc bb ce 78 d6 91 .T.F%{.......x.. 00:20:33.923 000002f0 ea bb 9c 58 e2 1a b2 c4 3b 38 41 83 3c 1f 60 d9 ...X....;8A.<.`. 00:20:33.923 dh secret: 00:20:33.923 00000000 03 bc f2 5e 97 9c fe 6e e4 ce 78 e8 00 7a be 2a ...^...n..x..z.* 00:20:33.923 00000010 7e 4c be 07 48 2c 13 34 52 87 09 44 9c 9c f7 63 ~L..H,.4R..D...c 00:20:33.923 00000020 bd d4 c0 a8 73 91 81 b7 9b c0 b8 44 86 b5 97 1f ....s......D.... 00:20:33.923 00000030 cb 8d cc c1 24 96 4c d3 e9 6c da a6 ca db f4 35 ....$.L..l.....5 00:20:33.923 00000040 9a 24 c3 7b 99 26 38 a9 d8 ab d2 41 27 71 37 07 .$.{.&8....A'q7. 00:20:33.923 00000050 32 5f a0 23 77 d0 d1 75 0c a4 84 4e 31 93 ee b6 2_.#w..u...N1... 00:20:33.923 00000060 a8 29 78 83 a3 c0 85 af 3c 25 50 72 f6 df b8 08 .)x.....<%Pr.... 00:20:33.923 00000070 44 2c df bc 55 11 d9 c0 b3 79 53 f7 f9 d6 54 01 D,..U....yS...T. 00:20:33.923 00000080 ad 36 14 1a 60 1e e9 6a e3 99 26 d2 da b4 ee 3e .6..`..j..&....> 00:20:33.923 00000090 55 21 61 22 3f 22 70 9b a7 bd 8a 81 d6 01 82 1c U!a"?"p......... 00:20:33.923 000000a0 d9 ce 3e 19 f0 ec d8 84 ea c9 30 61 32 42 d4 cf ..>.......0a2B.. 00:20:33.923 000000b0 3b d3 cb cb 12 4a 6b eb e5 f5 05 ea 1c bd 8d 9c ;....Jk......... 00:20:33.923 000000c0 04 d2 97 00 b3 8b 43 44 47 bb da fa 7a 78 c9 b1 ......CDG...zx.. 00:20:33.923 000000d0 ca f8 52 e8 60 0c 73 58 b7 9c 19 14 7a 24 98 06 ..R.`.sX....z$.. 00:20:33.923 000000e0 13 86 58 dd a2 fe 4f 13 13 18 1f 1c 81 de f7 25 ..X...O........% 00:20:33.923 000000f0 85 a4 d5 e0 7b 42 87 67 f9 8f 91 f9 63 3a 19 53 ....{B.g....c:.S 00:20:33.923 00000100 59 29 3c e1 80 8f 72 f7 8d 51 63 b5 26 4c 4c cc Y)<...r..Qc.&LL. 00:20:33.923 00000110 e2 bd 51 d4 f5 0e ac 2d ad 91 44 16 ba cc 72 22 ..Q....-..D...r" 00:20:33.923 00000120 85 54 7f b2 10 e8 ab 81 d1 db 7f cc e5 9f c0 a8 .T.............. 00:20:33.923 00000130 1e 38 58 b1 11 9c 7b ce a9 ff 35 5f 3f 5c 44 c2 .8X...{...5_?\D. 00:20:33.923 00000140 ca 9a 9d 11 ee e6 f0 d2 be 1e 9d ed bf 0b 17 25 ...............% 00:20:33.923 00000150 3a f7 cf 6f 39 b6 e6 ec fa 21 87 4c 2c 0c ba cd :..o9....!.L,... 00:20:33.923 00000160 79 94 71 17 3b fe a5 34 c1 33 1e 45 7d 2a 74 96 y.q.;..4.3.E}*t. 00:20:33.923 00000170 c2 9e 20 ed 89 a6 a1 6e 75 a0 a6 fe dd 17 3d 5f .. ....nu.....=_ 00:20:33.923 00000180 38 ad c0 5f f3 0a fb 39 d1 6f 94 7d 96 31 1d 8b 8.._...9.o.}.1.. 00:20:33.923 00000190 97 56 09 96 80 50 99 9b 5e 5f 0f 8c 48 6e dd e7 .V...P..^_..Hn.. 00:20:33.923 000001a0 a4 27 3f ab dc 45 59 1c b7 30 85 e0 c7 2c b5 43 .'?..EY..0...,.C 00:20:33.923 000001b0 4a ad db 3e be 49 23 11 02 67 76 59 2a 1d 13 20 J..>.I#..gvY*.. 00:20:33.923 000001c0 01 d2 19 c9 0f ec 21 3f 42 16 4b 49 4b 3c ba 23 ......!?B.KIK<.# 00:20:33.923 000001d0 6a ff 32 e4 60 5e 2a e3 0f f5 39 75 76 1b 64 e1 j.2.`^*...9uv.d. 00:20:33.923 000001e0 ae 15 54 66 2a 9b bd a3 f9 0d 8d 9d 58 e4 75 54 ..Tf*.......X.uT 00:20:33.923 000001f0 44 7f f8 2d 8b 25 3e a8 10 69 aa 1c 98 ab f0 40 D..-.%>..i.....@ 00:20:33.923 00000200 f5 f3 de 6e 7d 2d 50 9c 26 d4 82 6d 89 31 f2 0c ...n}-P.&..m.1.. 00:20:33.923 00000210 c5 24 a1 f4 2e 5f 5f d6 a5 a5 2b 86 07 91 88 9d .$...__...+..... 
00:20:33.923 00000220 9f 5f a6 3c 2e e4 a3 cf 59 d8 21 90 6a 1c 7c 7c ._.<....Y.!.j.|| 00:20:33.923 00000230 ee c3 62 7f 2b 6b bf 11 9e 15 f4 44 fa 6c 4b f8 ..b.+k.....D.lK. 00:20:33.923 00000240 84 08 87 84 ae f8 ff 62 72 8c 54 78 e4 09 24 73 .......br.Tx..$s 00:20:33.923 00000250 25 a1 4e 67 18 c6 f5 f6 92 be 5b ba ce 8d 1c 7d %.Ng......[....} 00:20:33.923 00000260 be ff 37 2f 4c 66 41 c3 0c 65 c8 79 65 ae 8d 13 ..7/LfA..e.ye... 00:20:33.923 00000270 54 67 33 59 e5 72 99 3b ce 0c be b5 f3 a0 7b 63 Tg3Y.r.;......{c 00:20:33.923 00000280 f0 f7 bf 8d 72 c7 43 f5 6c 64 e0 e2 02 0a 1b ba ....r.C.ld...... 00:20:33.923 00000290 61 f8 ec 09 db b9 5d a5 37 6f f0 f3 5d b9 64 56 a.....].7o..].dV 00:20:33.923 000002a0 35 d9 25 e6 60 ff cc c7 91 94 2a b6 7d 5f 47 dc 5.%.`.....*.}_G. 00:20:33.923 000002b0 c1 34 43 89 ad 9c 9a 8b 87 db 6f 4c 17 36 92 ea .4C.......oL.6.. 00:20:33.923 000002c0 0d 21 e9 89 03 58 1e 97 5d 10 10 b3 46 4f 37 32 .!...X..]...FO72 00:20:33.924 000002d0 64 85 0e d6 69 52 d0 96 80 c1 e5 12 3d 7f 1d 57 d...iR......=..W 00:20:33.924 000002e0 fb 8c bc e2 46 92 d3 19 6f 2a 09 2e 8b 49 b1 1d ....F...o*...I.. 00:20:33.924 000002f0 2a a1 45 eb b9 02 63 fb f2 65 c9 ce 7a d7 78 62 *.E...c..e..z.xb 00:20:33.924 [2024-09-27 15:25:19.392796] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=2, dhgroup=4, seq=3428451776, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.924 [2024-09-27 15:25:19.392898] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.924 [2024-09-27 15:25:19.451024] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.924 [2024-09-27 15:25:19.451071] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.924 [2024-09-27 15:25:19.451081] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.924 [2024-09-27 15:25:19.451107] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.924 [2024-09-27 15:25:19.620891] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.924 [2024-09-27 15:25:19.620913] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.924 [2024-09-27 15:25:19.620920] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.924 [2024-09-27 15:25:19.620965] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.924 [2024-09-27 15:25:19.620987] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.924 ctrlr pubkey: 00:20:33.924 00000000 57 aa db c3 1e 90 d7 dc ec fc 4f dc 36 67 87 4c W.........O.6g.L 00:20:33.924 00000010 87 6d 41 50 59 8b 2a eb 7b 2d 32 ba c7 43 fa 0e .mAPY.*.{-2..C.. 00:20:33.924 00000020 f1 d5 a7 13 1f 20 66 32 4b 71 f4 b7 00 3b 20 58 ..... 
f2Kq...; X 00:20:33.924 00000030 15 fd 83 0c 22 b0 9c 8e a7 de d6 7f db df a1 7f ...."........... 00:20:33.924 00000040 b7 af cd a4 44 8d 2f c1 45 4f 67 d8 c1 f1 d6 2e ....D./.EOg..... 00:20:33.924 00000050 3e d8 8d a7 81 94 75 76 00 c3 f0 b3 cc 01 50 1b >.....uv......P. 00:20:33.924 00000060 42 e3 cb 43 e9 37 4a 7d b7 d2 b1 f0 e5 6c 8e 7b B..C.7J}.....l.{ 00:20:33.924 00000070 f9 6e e4 8f d0 dc e9 70 0a c1 4f 4c 00 3c a8 c2 .n.....p..OL.<.. 00:20:33.924 00000080 53 27 87 82 ec 12 57 13 32 84 42 ff de 80 8c fd S'....W.2.B..... 00:20:33.924 00000090 47 34 d5 24 ce b3 de 4f 42 40 a3 4b c2 ce 4a ac G4.$...OB@.K..J. 00:20:33.924 000000a0 95 ca 8c 20 2d fb 04 bd d4 78 6a 9b 2b f4 3a a9 ... -....xj.+.:. 00:20:33.924 000000b0 d2 14 09 93 93 2b 56 bc ba cf 4e 1d dc c7 bd b8 .....+V...N..... 00:20:33.924 000000c0 c3 d7 65 69 9a 85 40 b8 c1 a3 3c 7b 7f 67 34 60 ..ei..@...<{.g4` 00:20:33.924 000000d0 1f da 55 a4 8a be e0 13 87 49 15 9d 2c ae b5 cd ..U......I..,... 00:20:33.924 000000e0 18 2a 6a b3 9a 79 2b 39 52 cf 28 7c e5 37 46 a7 .*j..y+9R.(|.7F. 00:20:33.924 000000f0 56 35 e5 33 2b f8 4e db b4 1e fc 76 b9 68 de 10 V5.3+.N....v.h.. 00:20:33.924 00000100 ac 13 97 8e e9 bb e9 9f db f5 78 b4 55 9c 70 07 ..........x.U.p. 00:20:33.924 00000110 45 93 46 99 43 ac fe c0 08 69 ff 9e 14 a0 6e 0c E.F.C....i....n. 00:20:33.924 00000120 ed b2 18 4c 52 18 2b e1 df 72 65 17 3c 61 89 aa ...LR.+..re.y.^9K...4M 00:20:33.924 00000160 bf 31 06 a7 41 41 c3 e1 35 62 44 e8 01 ef b6 6b .1..AA..5bD....k 00:20:33.924 00000170 1f a2 0e 59 0b 92 02 47 d6 2d 90 62 8e 8f e6 a0 ...Y...G.-.b.... 00:20:33.924 00000180 ea 5e f5 bc 45 ad 60 29 6c 90 47 93 89 57 f0 17 .^..E.`)l.G..W.. 00:20:33.924 00000190 76 b6 12 00 ea f6 90 42 d3 c9 55 9f 45 60 00 cf v......B..U.E`.. 00:20:33.924 000001a0 26 d3 44 c8 b6 b4 41 b5 ee ab 36 40 c3 d8 4b 70 &.D...A...6@..Kp 00:20:33.924 000001b0 59 61 e2 02 28 7a 8c 19 0d cf 0b 0e 11 68 be 8d Ya..(z.......h.. 00:20:33.924 000001c0 70 6f bf 27 90 86 c2 29 30 28 0b 39 96 23 ab 08 po.'...)0(.9.#.. 00:20:33.924 000001d0 67 e2 b7 4d b2 cd 05 41 2d 2e 8d bc ed 01 69 c6 g..M...A-.....i. 00:20:33.924 000001e0 30 47 4f 28 26 6e ce e0 07 76 a4 b7 af f6 93 50 0GO(&n...v.....P 00:20:33.924 000001f0 45 a6 90 df 23 0b 85 46 d1 29 4f 27 ab 00 26 cb E...#..F.)O'..&. 00:20:33.924 00000200 5f a0 ca 88 9a b3 90 bf a8 1e 54 e8 9b 0a 2a ef _.........T...*. 00:20:33.924 00000210 93 f6 e4 fb 8a 62 dc fb c0 f7 aa 46 8a df fa 2e .....b.....F.... 00:20:33.924 00000220 88 32 ac 47 d5 13 b9 06 a8 d9 a2 57 ab 0c 05 4a .2.G.......W...J 00:20:33.924 00000230 e8 1d f3 61 27 92 2e 76 c5 b9 67 69 80 ad ae 72 ...a'..v..gi...r 00:20:33.924 00000240 6d b8 30 d7 7a 69 c3 8f c4 9a 35 c1 00 8b 65 29 m.0.zi....5...e) 00:20:33.924 00000250 e1 bb aa 4e 9d 0c 45 a5 38 ab 7e 5e a6 ff 77 1d ...N..E.8.~^..w. 00:20:33.924 00000260 32 11 14 88 ae 8e cb e4 77 5e b5 f2 21 c7 1e fa 2.......w^..!... 00:20:33.924 00000270 89 e3 47 ca 1f 8a b1 b7 e4 e6 b6 29 62 6d f5 60 ..G........)bm.` 00:20:33.924 00000280 59 19 6f 10 66 08 75 b7 37 7d d2 eb b2 26 bb 60 Y.o.f.u.7}...&.` 00:20:33.924 00000290 78 94 93 06 64 22 29 fe 7b 42 2d 7f 65 dd 3d 46 x...d").{B-.e.=F 00:20:33.924 000002a0 e6 03 0b cc 4a 0f 5f 71 5b c8 ab 94 6c fd 6b d3 ....J._q[...l.k. 00:20:33.924 000002b0 86 42 f0 c2 9b 15 ff 28 ab 8b 24 9d f3 2c b5 1a .B.....(..$..,.. 00:20:33.924 000002c0 b4 03 4c 91 ad 92 e9 6a 26 54 e0 d9 04 78 a6 0c ..L....j&T...x.. 00:20:33.924 000002d0 fe 23 db 6a 42 e3 90 df 74 6a 5d 5d 95 d0 9c 85 .#.jB...tj]].... 
00:20:33.924 000002e0 8c 6e cc 31 58 e2 d9 39 ce c3 49 ee 81 44 9d 11 .n.1X..9..I..D.. 00:20:33.924 000002f0 45 db 10 cb 2e 84 91 fb 4f 5b 8c a1 7c ca ef c9 E.......O[..|... 00:20:33.924 host pubkey: 00:20:33.924 00000000 8d 2c 94 1a fe 1d c3 b8 ce 56 85 96 f8 59 6c 92 .,.......V...Yl. 00:20:33.924 00000010 67 1d 16 57 06 4e 1c 88 ff c5 19 f4 22 e1 ce 3c g..W.N......"..< 00:20:33.924 00000020 da 0a 08 a9 1f 36 6b bf 01 3e 7b d5 80 d3 e7 5a .....6k..>{....Z 00:20:33.924 00000030 34 99 8f 8e 2c 50 45 f1 f9 2b 53 58 13 2c 43 c6 4...,PE..+SX.,C. 00:20:33.924 00000040 6d af 7f 61 cc 88 36 7b a4 3c 28 39 fe 4b 20 b5 m..a..6{.<(9.K . 00:20:33.924 00000050 d7 af 21 db a7 04 58 5b 33 3a 76 be 9e f0 42 a5 ..!...X[3:v...B. 00:20:33.924 00000060 97 1e c5 ef bc 7b dd 47 e1 c7 25 0e d7 3e ea 77 .....{.G..%..>.w 00:20:33.924 00000070 48 f7 c1 e2 88 36 42 68 15 84 3c d9 92 05 d4 79 H....6Bh..<....y 00:20:33.924 00000080 0f 48 e3 91 ee c7 fa 6d 30 2d 7a 64 71 ab 9a f0 .H.....m0-zdq... 00:20:33.924 00000090 73 9e b3 1c 0f 1e 34 fb 61 7f da cd fb 16 fc a8 s.....4.a....... 00:20:33.924 000000a0 2f 0c a0 c4 c4 d8 e3 a2 7b 0f 4a fb 79 d9 64 0f /.......{.J.y.d. 00:20:33.924 000000b0 07 29 c5 ac 5b 5e bb 84 4b 53 18 74 c7 d5 a1 8c .)..[^..KS.t.... 00:20:33.924 000000c0 19 26 40 ca 21 8c a1 29 5d 50 7b aa e3 bc a8 93 .&@.!..)]P{..... 00:20:33.924 000000d0 50 f6 be a6 2c a0 cb 3e e8 fa 5f 65 da d7 f7 97 P...,..>.._e.... 00:20:33.924 000000e0 de 50 be ec f4 89 8f c8 6d 44 1f f2 ba 05 f0 1a .P......mD...... 00:20:33.924 000000f0 42 4c 65 52 5b 4d 8e 22 43 8f 45 6c 2d 93 e3 a4 BLeR[M."C.El-... 00:20:33.924 00000100 d8 66 0f 04 7f 5e 4b ad 75 d3 3a ad 71 80 7d 4a .f...^K.u.:.q.}J 00:20:33.924 00000110 16 48 f1 1d cb 1c fa 82 fc da b8 73 14 2f aa d6 .H.........s./.. 00:20:33.924 00000120 ee 60 9a c8 83 92 1a 72 d1 c8 57 01 2c 3b 85 44 .`.....r..W.,;.D 00:20:33.924 00000130 73 5c 99 ae c1 83 b8 1e 12 a8 62 75 ef 51 e6 14 s\........bu.Q.. 00:20:33.924 00000140 e3 97 49 0b fc 06 89 a1 a2 6c 1d 20 bc 54 a8 40 ..I......l. .T.@ 00:20:33.924 00000150 b7 78 61 ef b2 b5 58 3b 2e cc 48 86 79 cb 44 50 .xa...X;..H.y.DP 00:20:33.924 00000160 d8 e5 75 8a b2 7b 32 02 de 74 88 5e e1 12 23 81 ..u..{2..t.^..#. 00:20:33.924 00000170 d5 8e cb cb 2e 4a 15 44 f3 48 10 66 f4 0e eb 0d .....J.D.H.f.... 00:20:33.924 00000180 06 79 ea d5 03 a7 78 55 26 d9 7c 49 31 f4 cf be .y....xU&.|I1... 00:20:33.924 00000190 a5 3e 56 9a d7 34 9d a3 87 9e e1 44 38 ca 50 c0 .>V..4.....D8.P. 00:20:33.924 000001a0 dd 88 21 19 ef 53 4d 51 05 fb fa d6 fc 68 50 2f ..!..SMQ.....hP/ 00:20:33.924 000001b0 a7 9b b4 62 96 03 a0 27 5b 35 24 24 d2 2a 3f 50 ...b...'[5$$.*?P 00:20:33.924 000001c0 5c d6 6a 92 0d cf 1f cf 54 2d 02 18 e2 0a 1b 47 \.j.....T-.....G 00:20:33.924 000001d0 d5 0c ec 5f e0 67 1c a3 4c d4 51 93 6a 6e c8 1e ..._.g..L.Q.jn.. 00:20:33.924 000001e0 cf 09 b0 38 c2 37 b1 a5 26 22 c3 92 41 8f 96 f8 ...8.7..&"..A... 00:20:33.924 000001f0 99 47 f8 b6 b5 a0 2a 7e 22 d6 60 34 3c 59 08 6e .G....*~".`4.E 00:20:33.924 00000210 24 6f b3 31 6a c3 b6 30 84 dd 46 bc d8 f7 1a a8 $o.1j..0..F..... 00:20:33.924 00000220 dc 86 b7 5b dc bd e2 42 f7 1e 62 a9 04 d5 ad e9 ...[...B..b..... 00:20:33.924 00000230 bf 2b 33 7f b0 e8 21 18 1f 66 6a 39 48 0f 31 15 .+3...!..fj9H.1. 00:20:33.924 00000240 72 db 56 f9 e5 d5 72 a6 f1 01 67 19 b8 1e bb 91 r.V...r...g..... 00:20:33.924 00000250 01 88 94 14 fd 89 7f aa 4f 84 d6 19 fc a1 44 b7 ........O.....D. 
00:20:33.924 00000260 74 99 98 ad a6 94 c4 4b 5b 9b 0b 87 a9 34 79 5b t......K[....4y[ 00:20:33.924 00000270 68 e3 63 34 64 e7 58 2a 26 49 be 6d 67 20 4f 1f h.c4d.X*&I.mg O. 00:20:33.924 00000280 57 f3 c6 c7 cc 08 d8 bb e2 09 2e 5b 5a ea 24 83 W..........[Z.$. 00:20:33.924 00000290 dc ed 4a 07 5a 06 19 a6 02 e3 5a 96 92 d1 8e 08 ..J.Z.....Z..... 00:20:33.925 000002a0 06 e9 82 47 4b 34 3d b0 89 78 83 7c 26 e2 c9 df ...GK4=..x.|&... 00:20:33.925 000002b0 9c 32 64 5d dc 9c b4 90 07 50 e9 a8 8c be e7 60 .2d].....P.....` 00:20:33.925 000002c0 b3 7d 0b 3c fc e8 c4 76 a6 3f ac 96 a3 91 7b b4 .}.<...v.?....{. 00:20:33.925 000002d0 f6 97 f2 61 4f 14 99 ff 71 03 f9 2a e8 e4 d9 0e ...aO...q..*.... 00:20:33.925 000002e0 e4 c2 14 bd cb 0f fb 15 5f 21 b3 10 15 7d 12 c1 ........_!...}.. 00:20:33.925 000002f0 b5 87 61 90 83 30 8c a9 8e 01 9a 4e a8 52 59 14 ..a..0.....N.RY. 00:20:33.925 dh secret: 00:20:33.925 00000000 68 40 54 9e a5 ea 2a 6b dc 1a 34 75 e4 be 78 11 h@T...*k..4u..x. 00:20:33.925 00000010 ed c5 60 6c 5a 3d 38 83 1a 16 07 16 43 57 41 32 ..`lZ=8.....CWA2 00:20:33.925 00000020 ac 2d 9c a4 2c 73 76 ed 57 b8 8b 92 0a 26 40 12 .-..,sv.W....&@. 00:20:33.925 00000030 d6 9c 7b c9 e3 1f df 73 2f 24 88 4f a0 44 27 0f ..{....s/$.O.D'. 00:20:33.925 00000040 ca 36 9a cd c5 46 33 3c 6f c9 b4 cd 83 15 ff 7a .6...F3L..gj../."H 00:20:33.925 000000c0 2d d3 f5 a1 52 8f 3a cd 55 e7 92 4b 30 5b 5f 39 -...R.:.U..K0[_9 00:20:33.925 000000d0 c6 d6 78 61 04 96 97 e2 fa 38 e0 bc 43 7f ad 99 ..xa.....8..C... 00:20:33.925 000000e0 2e fd 78 a0 d4 73 1b cb 0e 7d 4f 9e a5 71 7f e3 ..x..s...}O..q.. 00:20:33.925 000000f0 3e e6 80 f8 f5 07 ae 7b 13 5d 43 b8 a7 20 99 1b >......{.]C.. .. 00:20:33.925 00000100 bb 53 e2 e9 60 5b 76 2d 47 63 bb f7 43 47 1d 57 .S..`[v-Gc..CG.W 00:20:33.925 00000110 bb 19 fc 6f 6a 5c 8e 90 ed 39 e6 ee 09 3d df 5c ...oj\...9...=.\ 00:20:33.925 00000120 a4 fd c2 ca 4b 0d d0 21 1b 26 b9 38 10 ee 36 d4 ....K..!.&.8..6. 00:20:33.925 00000130 90 84 70 f0 83 bd d1 75 f5 94 06 ee 8b 4b 36 0b ..p....u.....K6. 00:20:33.925 00000140 00 8f a6 c6 8c dd 3e 83 d4 84 e2 eb 5b 86 68 1b ......>.....[.h. 00:20:33.925 00000150 30 f8 07 0f 15 78 68 25 3c ab a2 c8 fe 33 84 01 0....xh%<....3.. 00:20:33.925 00000160 e4 91 5d 9b f4 9c ad 55 3d 51 f9 be 81 d9 95 8e ..]....U=Q...... 00:20:33.925 00000170 82 1b bf 69 07 83 be 72 11 23 df 29 e3 bc e1 e6 ...i...r.#.).... 00:20:33.925 00000180 1e 88 ed 46 a1 b4 f5 5b a2 46 d0 1b cf d8 c7 ea ...F...[.F...... 00:20:33.925 00000190 a7 79 43 f1 7d 1f 7a f6 8a 36 83 48 b3 7f ac 9b .yC.}.z..6.H.... 00:20:33.925 000001a0 83 47 2e b0 9d 6b 1c ea de 41 d4 bd 75 9a 1a ed .G...k...A..u... 00:20:33.925 000001b0 d0 b8 7e 71 d2 34 89 1f e8 ab a6 6c e9 f0 5e 52 ..~q.4.....l..^R 00:20:33.925 000001c0 12 21 d4 94 e2 12 f5 c8 fa 28 e7 c2 c7 b2 5c c7 .!.......(....\. 00:20:33.925 000001d0 aa 9b 4b 2d 93 79 cd ff f5 4c 14 94 0a 5a 25 08 ..K-.y...L...Z%. 00:20:33.925 000001e0 13 eb c7 33 36 80 e7 22 4b f2 79 c6 96 ea 3e 5b ...36.."K.y...>[ 00:20:33.925 000001f0 6b 40 24 80 a4 e8 08 5c d4 f3 8b b3 dd 5e f8 58 k@$....\.....^.X 00:20:33.925 00000200 43 31 54 a2 c2 93 2b be 1e ad e3 c0 a0 8f a0 65 C1T...+........e 00:20:33.925 00000210 5a 49 e6 a7 ed 7b 0c fb 87 c0 3b e6 b6 42 ff 1c ZI...{....;..B.. 00:20:33.925 00000220 d6 52 8c ab 22 0e 9c 05 b1 85 67 b0 64 43 82 f4 .R..".....g.dC.. 00:20:33.925 00000230 fb bb 12 39 f3 44 74 b4 49 4f 63 b0 01 8c c0 1f ...9.Dt.IOc..... 00:20:33.925 00000240 5a ce 36 3c 51 20 3f 89 c7 07 0e e3 e1 b3 67 3c Z.6.....uv......P. 
00:20:33.925 00000060 42 e3 cb 43 e9 37 4a 7d b7 d2 b1 f0 e5 6c 8e 7b B..C.7J}.....l.{ 00:20:33.925 00000070 f9 6e e4 8f d0 dc e9 70 0a c1 4f 4c 00 3c a8 c2 .n.....p..OL.<.. 00:20:33.925 00000080 53 27 87 82 ec 12 57 13 32 84 42 ff de 80 8c fd S'....W.2.B..... 00:20:33.925 00000090 47 34 d5 24 ce b3 de 4f 42 40 a3 4b c2 ce 4a ac G4.$...OB@.K..J. 00:20:33.925 000000a0 95 ca 8c 20 2d fb 04 bd d4 78 6a 9b 2b f4 3a a9 ... -....xj.+.:. 00:20:33.925 000000b0 d2 14 09 93 93 2b 56 bc ba cf 4e 1d dc c7 bd b8 .....+V...N..... 00:20:33.925 000000c0 c3 d7 65 69 9a 85 40 b8 c1 a3 3c 7b 7f 67 34 60 ..ei..@...<{.g4` 00:20:33.925 000000d0 1f da 55 a4 8a be e0 13 87 49 15 9d 2c ae b5 cd ..U......I..,... 00:20:33.925 000000e0 18 2a 6a b3 9a 79 2b 39 52 cf 28 7c e5 37 46 a7 .*j..y+9R.(|.7F. 00:20:33.925 000000f0 56 35 e5 33 2b f8 4e db b4 1e fc 76 b9 68 de 10 V5.3+.N....v.h.. 00:20:33.925 00000100 ac 13 97 8e e9 bb e9 9f db f5 78 b4 55 9c 70 07 ..........x.U.p. 00:20:33.925 00000110 45 93 46 99 43 ac fe c0 08 69 ff 9e 14 a0 6e 0c E.F.C....i....n. 00:20:33.925 00000120 ed b2 18 4c 52 18 2b e1 df 72 65 17 3c 61 89 aa ...LR.+..re.y.^9K...4M 00:20:33.925 00000160 bf 31 06 a7 41 41 c3 e1 35 62 44 e8 01 ef b6 6b .1..AA..5bD....k 00:20:33.925 00000170 1f a2 0e 59 0b 92 02 47 d6 2d 90 62 8e 8f e6 a0 ...Y...G.-.b.... 00:20:33.925 00000180 ea 5e f5 bc 45 ad 60 29 6c 90 47 93 89 57 f0 17 .^..E.`)l.G..W.. 00:20:33.925 00000190 76 b6 12 00 ea f6 90 42 d3 c9 55 9f 45 60 00 cf v......B..U.E`.. 00:20:33.925 000001a0 26 d3 44 c8 b6 b4 41 b5 ee ab 36 40 c3 d8 4b 70 &.D...A...6@..Kp 00:20:33.925 000001b0 59 61 e2 02 28 7a 8c 19 0d cf 0b 0e 11 68 be 8d Ya..(z.......h.. 00:20:33.925 000001c0 70 6f bf 27 90 86 c2 29 30 28 0b 39 96 23 ab 08 po.'...)0(.9.#.. 00:20:33.925 000001d0 67 e2 b7 4d b2 cd 05 41 2d 2e 8d bc ed 01 69 c6 g..M...A-.....i. 00:20:33.925 000001e0 30 47 4f 28 26 6e ce e0 07 76 a4 b7 af f6 93 50 0GO(&n...v.....P 00:20:33.925 000001f0 45 a6 90 df 23 0b 85 46 d1 29 4f 27 ab 00 26 cb E...#..F.)O'..&. 00:20:33.925 00000200 5f a0 ca 88 9a b3 90 bf a8 1e 54 e8 9b 0a 2a ef _.........T...*. 00:20:33.925 00000210 93 f6 e4 fb 8a 62 dc fb c0 f7 aa 46 8a df fa 2e .....b.....F.... 00:20:33.925 00000220 88 32 ac 47 d5 13 b9 06 a8 d9 a2 57 ab 0c 05 4a .2.G.......W...J 00:20:33.925 00000230 e8 1d f3 61 27 92 2e 76 c5 b9 67 69 80 ad ae 72 ...a'..v..gi...r 00:20:33.925 00000240 6d b8 30 d7 7a 69 c3 8f c4 9a 35 c1 00 8b 65 29 m.0.zi....5...e) 00:20:33.925 00000250 e1 bb aa 4e 9d 0c 45 a5 38 ab 7e 5e a6 ff 77 1d ...N..E.8.~^..w. 00:20:33.925 00000260 32 11 14 88 ae 8e cb e4 77 5e b5 f2 21 c7 1e fa 2.......w^..!... 00:20:33.925 00000270 89 e3 47 ca 1f 8a b1 b7 e4 e6 b6 29 62 6d f5 60 ..G........)bm.` 00:20:33.925 00000280 59 19 6f 10 66 08 75 b7 37 7d d2 eb b2 26 bb 60 Y.o.f.u.7}...&.` 00:20:33.925 00000290 78 94 93 06 64 22 29 fe 7b 42 2d 7f 65 dd 3d 46 x...d").{B-.e.=F 00:20:33.925 000002a0 e6 03 0b cc 4a 0f 5f 71 5b c8 ab 94 6c fd 6b d3 ....J._q[...l.k. 00:20:33.925 000002b0 86 42 f0 c2 9b 15 ff 28 ab 8b 24 9d f3 2c b5 1a .B.....(..$..,.. 00:20:33.925 000002c0 b4 03 4c 91 ad 92 e9 6a 26 54 e0 d9 04 78 a6 0c ..L....j&T...x.. 00:20:33.925 000002d0 fe 23 db 6a 42 e3 90 df 74 6a 5d 5d 95 d0 9c 85 .#.jB...tj]].... 00:20:33.925 000002e0 8c 6e cc 31 58 e2 d9 39 ce c3 49 ee 81 44 9d 11 .n.1X..9..I..D.. 00:20:33.925 000002f0 45 db 10 cb 2e 84 91 fb 4f 5b 8c a1 7c ca ef c9 E.......O[..|... 00:20:33.925 host pubkey: 00:20:33.925 00000000 ca 9b f1 88 0c 88 3c c2 74 06 01 c5 08 c4 1f f9 ......<.t....... 
00:20:33.925 00000010 59 d9 4c f1 6f af f2 70 65 a2 8e 44 05 3d 09 4b Y.L.o..pe..D.=.K 00:20:33.925 00000020 78 8c 91 b0 11 c8 50 17 51 c4 c1 81 d0 41 01 eb x.....P.Q....A.. 00:20:33.925 00000030 9e ef 52 d7 94 6b b5 f4 1b 5d eb 97 55 5e 64 be ..R..k...]..U^d. 00:20:33.925 00000040 4c f6 44 d4 ec 8f a2 05 86 af 1e 38 04 3d 5b 18 L.D........8.=[. 00:20:33.925 00000050 db b6 75 46 ae a7 58 83 d3 20 4c 30 49 2d 64 3a ..uF..X.. L0I-d: 00:20:33.925 00000060 08 99 49 5c a9 29 df 7f d9 62 11 c8 ea b8 95 8e ..I\.)...b...... 00:20:33.925 00000070 0d 1b eb b3 5c 01 13 4c c6 eb 69 82 cd f0 ef 3d ....\..L..i....= 00:20:33.925 00000080 01 72 e1 2f 6b 0b 1c 13 2e 25 49 04 cb c4 89 6f .r./k....%I....o 00:20:33.926 00000090 5e 8b eb 4d d6 5d 6f e1 a6 05 6d 39 f8 67 48 8f ^..M.]o...m9.gH. 00:20:33.926 000000a0 43 40 00 19 02 27 25 26 f0 07 98 45 34 bc d1 d7 C@...'%&...E4... 00:20:33.926 000000b0 31 95 55 95 76 c0 cc a0 8e d6 2b 8c 29 49 58 5e 1.U.v.....+.)IX^ 00:20:33.926 000000c0 56 82 bd 2a ce ec 94 64 90 72 ce 6b 3d 55 da 50 V..*...d.r.k=U.P 00:20:33.926 000000d0 36 9d d8 8b 1f 66 e8 91 79 5a 5f 34 9d af f9 b0 6....f..yZ_4.... 00:20:33.926 000000e0 dd 66 a1 ab ff 22 fe 61 53 f5 51 ae f3 de d3 07 .f...".aS.Q..... 00:20:33.926 000000f0 f7 b7 d6 17 08 1f a6 73 5f 7e ad ce bf 9f b4 07 .......s_~...... 00:20:33.926 00000100 11 46 9d 9d d3 f4 9c 09 53 5e ea 2a b2 4a 11 a2 .F......S^.*.J.. 00:20:33.926 00000110 44 5a 29 aa bb 29 0c 7e 3e ae 52 fc 98 49 cf 4a DZ)..).~>.R..I.J 00:20:33.926 00000120 3d 25 fc 5c bc 95 eb c7 30 d5 1a 56 e1 23 76 96 =%.\....0..V.#v. 00:20:33.926 00000130 44 81 f4 83 db 21 62 b5 7d 0a 87 fe f8 aa 32 81 D....!b.}.....2. 00:20:33.926 00000140 5b bf d0 f6 f4 62 ae c5 a9 39 2c ce 4f f9 3d e5 [....b...9,.O.=. 00:20:33.926 00000150 d0 8f 6b 60 a7 34 0c d8 72 99 4a e3 03 9c ce 0b ..k`.4..r.J..... 00:20:33.926 00000160 7c cd 46 c6 4d 04 44 db 3b 75 6e 27 df c6 a6 d2 |.F.M.D.;un'.... 00:20:33.926 00000170 e4 be b7 82 69 c8 80 ec 67 45 8f b4 dd a3 04 96 ....i...gE...... 00:20:33.926 00000180 ba e8 8c 1b ca 0e 38 ca 59 eb 21 e3 b3 82 2d d2 ......8.Y.!...-. 00:20:33.926 00000190 6a 9c 83 b0 84 9d 91 81 aa 0a 2b 47 24 86 73 9f j.........+G$.s. 00:20:33.926 000001a0 b2 2b 45 34 ca 75 07 61 73 a5 7e db 27 8d 4b 58 .+E4.u.as.~.'.KX 00:20:33.926 000001b0 f8 0b a4 54 de f7 75 b5 9b eb e5 70 41 6f 6a 7c ...T..u....pAoj| 00:20:33.926 000001c0 51 14 91 38 26 b5 46 7e fb 4f fd 1d 07 36 a7 74 Q..8&.F~.O...6.t 00:20:33.926 000001d0 31 c5 26 67 7a c0 64 8d be 04 c3 18 9a c2 b9 0e 1.&gz.d......... 00:20:33.926 000001e0 9e 1c 4c a6 78 92 cc 2b 8d 70 0a 9e 9f cc c0 b7 ..L.x..+.p...... 00:20:33.926 000001f0 02 3e fe 89 7a e3 5c 49 af 04 07 18 fd 1f 01 15 .>..z.\I........ 00:20:33.926 00000200 bd c8 f0 ea 32 44 b3 45 e6 d8 ba 53 ab 1c 7d 56 ....2D.E...S..}V 00:20:33.926 00000210 07 00 8a b4 be 72 3b 47 bf 12 7f bd 97 d9 58 69 .....r;G......Xi 00:20:33.926 00000220 b2 4e 7f b8 dd 1f c4 7e 25 f8 32 87 94 88 8b 24 .N.....~%.2....$ 00:20:33.926 00000230 c0 f4 61 29 b3 62 70 03 1a 82 b8 7f c1 e1 5b 78 ..a).bp.......[x 00:20:33.926 00000240 2a 2d 89 f3 e1 8a 50 3a 6b 2f 25 ba 7e 8e aa b6 *-....P:k/%.~... 00:20:33.926 00000250 31 88 63 6e a0 32 df 6c 69 76 e5 bf 8d 87 b4 bc 1.cn.2.liv...... 00:20:33.926 00000260 8b c8 48 42 5a 9e 28 7b d9 d2 c7 c7 cf a5 ce 44 ..HBZ.({.......D 00:20:33.926 00000270 eb 5b 05 c6 89 c3 e9 98 bd fd df f7 a9 5b 37 24 .[...........[7$ 00:20:33.926 00000280 68 e4 ba 96 42 4e 5c 4d c3 9e a1 1a 03 fc 5b af h...BN\M......[. 
00:20:33.926 00000290 25 0e cf a0 4a 10 93 73 e2 05 01 6b fb ba 42 57 %...J..s...k..BW 00:20:33.926 000002a0 98 7b ee 75 53 6c 93 a7 81 68 1f 5c c5 d9 7a 92 .{.uSl...h.\..z. 00:20:33.926 000002b0 6d af d0 a3 a8 5f cf 80 5d 56 31 a9 ac d1 98 41 m...._..]V1....A 00:20:33.926 000002c0 d0 fa 2a 29 c9 ad f0 6b e4 71 36 32 10 d6 f4 d2 ..*)...k.q62.... 00:20:33.926 000002d0 62 b2 6e 1e df 1d 3e a6 e3 a2 ec a8 b1 0f 7f 29 b.n...>........) 00:20:33.926 000002e0 92 1a 81 17 73 a8 b3 3c a6 38 7b e3 80 aa 8f c5 ....s..<.8{..... 00:20:33.926 000002f0 92 13 38 8c bf 4d 55 40 ab aa 5e 5d 2d 84 d6 33 ..8..MU@..^]-..3 00:20:33.926 dh secret: 00:20:33.926 00000000 1b ba d0 ba 78 ee da 54 13 b7 85 7d ff 24 a3 34 ....x..T...}.$.4 00:20:33.926 00000010 75 5a 1c b5 59 99 c9 e5 a5 d2 14 22 23 19 25 62 uZ..Y......"#.%b 00:20:33.926 00000020 eb 87 14 de 42 1e 79 55 b5 48 15 8e ce 04 d7 58 ....B.yU.H.....X 00:20:33.926 00000030 11 63 dd 97 6a 5b 07 25 db 5d b5 67 21 cd 3b 39 .c..j[.%.].g!.;9 00:20:33.926 00000040 85 79 6c 5d 9b 8d f3 89 e6 0a 31 34 49 46 5a 75 .yl]......14IFZu 00:20:33.926 00000050 39 0a 8a 80 36 12 f3 82 6d 29 67 64 41 e4 11 63 9...6...m)gdA..c 00:20:33.926 00000060 b5 df 61 d5 22 f6 f6 e1 65 55 79 4d 22 b5 13 4b ..a."...eUyM"..K 00:20:33.926 00000070 fd 18 55 8e c2 00 6e 9c 23 bc e6 f3 e7 4f bd af ..U...n.#....O.. 00:20:33.926 00000080 63 e9 33 93 74 02 2d 53 f1 fd e2 f0 8b 27 98 5b c.3.t.-S.....'.[ 00:20:33.926 00000090 26 da 13 31 19 ad 36 bb 40 38 fe 54 c1 44 b0 dd &..1..6.@8.T.D.. 00:20:33.926 000000a0 90 e5 8d a8 49 6d f0 65 20 94 3d 0e 12 55 75 fb ....Im.e .=..Uu. 00:20:33.926 000000b0 81 46 80 26 6f 58 63 8a b7 a4 9a 2a 7f fd 44 ef .F.&oXc....*..D. 00:20:33.926 000000c0 29 a2 60 74 97 d3 67 51 e2 68 5f 46 c9 cc 5b b8 ).`t..gQ.h_F..[. 00:20:33.926 000000d0 ad 9e 52 21 37 7e 62 8d bb a2 c0 9d af 4e c1 4b ..R!7~b......N.K 00:20:33.926 000000e0 ed f4 38 54 eb 9c 58 d8 be 4e c9 66 dd fa b0 a2 ..8T..X..N.f.... 00:20:33.926 000000f0 44 6b 54 11 53 c2 f6 4e b9 f9 0c 14 8b 39 a3 e5 DkT.S..N.....9.. 00:20:33.926 00000100 95 08 03 1a 5b 20 21 01 ec 8b 5f 76 81 b3 b0 59 ....[ !..._v...Y 00:20:33.926 00000110 02 6f 5b 72 ab 05 71 71 e8 df ff 48 60 b3 54 06 .o[r..qq...H`.T. 00:20:33.926 00000120 ca 49 54 25 ec b7 9e 2d 75 4b 1d 21 7c 73 b3 2b .IT%...-uK.!|s.+ 00:20:33.926 00000130 b3 5e 2a f0 89 ac 52 75 fe b2 54 b0 92 43 fb 76 .^*...Ru..T..C.v 00:20:33.926 00000140 98 54 8d 74 f1 7f 10 fc ad cb c2 f8 4a c8 ec 64 .T.t........J..d 00:20:33.926 00000150 91 07 1c d2 90 94 d0 5b 47 c9 2d c7 c3 f3 4a 4d .......[G.-...JM 00:20:33.926 00000160 c6 1b 07 db 1a d3 8d 0a 96 fb f1 24 b9 b1 c6 be ...........$.... 00:20:33.926 00000170 5d ab c3 86 04 43 43 64 48 a8 7f 49 b9 3b 9c 64 ]....CCdH..I.;.d 00:20:33.926 00000180 a7 f3 b0 5d 46 c1 18 d9 a9 00 d4 2e d5 d6 1e 69 ...]F..........i 00:20:33.926 00000190 69 9d 95 7a 87 53 54 df d7 e6 24 2a 59 cb 7a f9 i..z.ST...$*Y.z. 00:20:33.926 000001a0 ba a5 83 a7 f4 cb 27 a8 0a bc 56 54 9a b2 da 72 ......'...VT...r 00:20:33.926 000001b0 04 74 74 2b f6 5a f9 a6 fd 89 bb f0 35 2c 66 7b .tt+.Z......5,f{ 00:20:33.926 000001c0 5b 5b 81 da a5 22 51 03 f3 c1 95 a7 56 4e 60 c4 [[..."Q.....VN`. 00:20:33.926 000001d0 54 72 6f 2e ee ca 9e 07 67 53 25 86 d3 f3 0b 9f Tro.....gS%..... 00:20:33.926 000001e0 a4 df 3b b9 06 21 1f 2b e2 b6 67 c8 71 fb 5f e9 ..;..!.+..g.q._. 00:20:33.926 000001f0 dd cb b3 b2 af 19 f4 02 f1 10 39 eb ed e3 05 d3 ..........9..... 00:20:33.926 00000200 f3 04 9d 20 3c 5d df 86 53 b9 4d 31 4b 6f d8 92 ... <]..S.M1Ko.. 
00:20:33.926 00000210 27 cd 33 c8 bc 3e e9 6b 98 51 40 7c 08 73 06 92 '.3..>.k.Q@|.s.. 00:20:33.926 00000220 ea 9a 9b 07 f6 8a a7 59 e6 ea 8f 50 38 37 c7 87 .......Y...P87.. 00:20:33.926 00000230 b4 0b bf c9 c5 c7 ef 2a d2 83 fc f5 85 0e ff a1 .......*........ 00:20:33.926 00000240 88 ee 55 28 53 3c d4 45 76 d5 17 5a 42 d5 b0 f2 ..U(S<.Ev..ZB... 00:20:33.926 00000250 bc 14 ca 09 a4 60 8e f3 67 c9 31 e4 d7 a9 0f a2 .....`..g.1..... 00:20:33.926 00000260 f2 34 19 7d de e3 eb 92 36 6b 72 d7 f8 00 c2 ec .4.}....6kr..... 00:20:33.926 00000270 76 76 a4 d8 b0 a4 6b 85 2a 05 73 62 c0 82 97 e0 vv....k.*.sb.... 00:20:33.926 00000280 f7 0c bf 07 5e 10 e4 da 7f 2f 2f 74 22 a4 8e 99 ....^....//t"... 00:20:33.926 00000290 93 40 03 34 be 9d 01 8e c6 10 71 2d ae 09 21 a9 .@.4......q-..!. 00:20:33.926 000002a0 6b 6c a9 4b 8c df 73 92 52 b2 7e f7 01 c9 94 c8 kl.K..s.R.~..... 00:20:33.926 000002b0 9c d4 27 b6 a0 f5 42 1c b8 ec 41 b0 f3 44 83 58 ..'...B...A..D.X 00:20:33.926 000002c0 28 ab be 66 e5 fc df f3 8d 00 24 88 df 3f ce ea (..f......$..?.. 00:20:33.926 000002d0 72 93 1f 85 41 a0 39 b9 b0 97 cf 35 5c 51 d6 34 r...A.9....5\Q.4 00:20:33.926 000002e0 db 66 bd c4 50 f1 e1 fd 60 0a 0b 7c dc 0e 41 e3 .f..P...`..|..A. 00:20:33.926 000002f0 09 68 03 56 52 2b 05 45 16 97 ce af f8 06 cf 34 .h.VR+.E.......4 00:20:33.926 [2024-09-27 15:25:19.856561] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=2, dhgroup=4, seq=3428451778, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.926 [2024-09-27 15:25:19.856668] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.926 [2024-09-27 15:25:19.913147] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.926 [2024-09-27 15:25:19.913189] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.926 [2024-09-27 15:25:19.913198] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.926 [2024-09-27 15:25:19.913225] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.926 [2024-09-27 15:25:20.094150] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.926 [2024-09-27 15:25:20.094179] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.926 [2024-09-27 15:25:20.094187] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.926 [2024-09-27 15:25:20.094239] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.926 [2024-09-27 15:25:20.094265] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.926 ctrlr pubkey: 00:20:33.926 00000000 37 dc e4 d5 30 da fb 08 9c c9 37 42 dc ba a6 42 7...0.....7B...B 00:20:33.926 00000010 a3 f6 3d c6 2f 5e 2d a0 09 ef 70 32 0b 26 98 82 ..=./^-...p2.&.. 
00:20:33.926 00000020 10 a5 c7 a4 9a a5 5b f0 0e 40 5d 11 63 5c e9 c5 ......[..@].c\.. 00:20:33.926 00000030 f8 dd 70 94 be ec fc ea ed b8 d7 a2 bc 2b 7f a0 ..p..........+.. 00:20:33.926 00000040 82 4c b2 b6 f4 68 46 76 57 7f 3a ad 65 58 38 51 .L...hFvW.:.eX8Q 00:20:33.926 00000050 3e 80 98 d5 ac 49 b3 7f 86 c2 50 10 a6 d7 04 a2 >....I....P..... 00:20:33.926 00000060 0c 88 62 78 f8 8c 35 da c6 80 a3 8b 9d 9f 2f 8b ..bx..5......./. 00:20:33.926 00000070 75 c3 bf f6 1b 9b ad 1b 42 70 d8 89 5b 41 a3 83 u.......Bp..[A.. 00:20:33.926 00000080 63 5c 24 ec 44 c0 dd e5 de 6e 9a ee 2d f4 8d 4d c\$.D....n..-..M 00:20:33.926 00000090 4c 13 2f a4 67 9e 09 e0 a7 c2 ff a1 5e 5b 28 ac L./.g.......^[(. 00:20:33.926 000000a0 77 e3 aa e0 8e 3f 28 a1 01 18 ef 39 e1 91 0f f6 w....?(....9.... 00:20:33.926 000000b0 3d b2 99 4d ea c1 9c 16 72 69 27 07 6e 7a 30 45 =..M....ri'.nz0E 00:20:33.926 000000c0 d6 a4 bc 89 c7 50 c1 8a fb c0 cd cc bd cf f7 db .....P.......... 00:20:33.926 000000d0 8e 3c 41 de 50 c8 6a f2 9c fc 7d 35 46 43 84 c0 .....k..;ee. 00:20:33.927 000002a0 37 e8 8f fa bc c8 45 ab 9f f4 94 92 dc ba 6f e8 7.....E.......o. 00:20:33.927 000002b0 57 ab e6 f8 d2 fa 56 36 81 cc a7 68 67 b5 e0 98 W.....V6...hg... 00:20:33.927 000002c0 4d 35 32 cf 04 a6 1e 18 06 82 a6 08 b8 56 eb ad M52..........V.. 00:20:33.927 000002d0 d0 b8 f2 f9 e9 59 21 05 85 97 a8 37 98 f7 e5 36 .....Y!....7...6 00:20:33.927 000002e0 ee dd e1 60 e7 ba b3 aa bc 58 2a 98 6b 37 69 c4 ...`.....X*.k7i. 00:20:33.927 000002f0 b1 2e bd c1 01 f6 f4 40 6a fa b9 51 86 52 e8 31 .......@j..Q.R.1 00:20:33.927 host pubkey: 00:20:33.927 00000000 bc 05 7d 14 aa 77 be f1 9b 72 f2 ab 87 36 11 37 ..}..w...r...6.7 00:20:33.927 00000010 b9 91 b7 e2 63 f0 22 a1 90 1a de 73 1d 0c 93 c1 ....c."....s.... 00:20:33.927 00000020 c5 b6 ee 10 1d 9e 87 ce 20 64 51 0c e4 f5 f1 ac ........ dQ..... 00:20:33.927 00000030 8d 0f 98 c1 f6 e0 32 b2 2a d0 db f0 a5 f2 77 1a ......2.*.....w. 00:20:33.927 00000040 bc 64 6a b7 a7 76 8d c6 69 a8 5c 78 f0 1f 2c 6f .dj..v..i.\x..,o 00:20:33.927 00000050 fb 2e 8a 0e c9 5e cd a9 a0 98 ae 8d c7 3f a6 61 .....^.......?.a 00:20:33.927 00000060 ad de d9 cc b6 fb 9b 32 11 57 4b f5 72 23 fc e5 .......2.WK.r#.. 00:20:33.927 00000070 a0 6a f4 14 be 05 c6 19 43 38 71 ef 28 b8 05 48 .j......C8q.(..H 00:20:33.927 00000080 74 ab 56 b9 63 dc 58 07 32 13 07 50 dd 1d 7a a9 t.V.c.X.2..P..z. 00:20:33.927 00000090 a2 7e 15 71 06 a2 d2 15 82 47 7d 30 70 d6 19 be .~.q.....G}0p... 00:20:33.927 000000a0 2c 4d 1e e1 1e 86 f9 fd 12 7b c0 0c fa ef 89 b1 ,M.......{...... 00:20:33.927 000000b0 81 89 4c 0d c1 71 f6 5a 71 76 f0 a8 aa 0b 90 b9 ..L..q.Zqv...... 00:20:33.927 000000c0 b7 0a d2 7d 42 be 21 d2 78 be 90 b6 4d 21 c2 3c ...}B.!.x...M!.< 00:20:33.927 000000d0 c8 d5 65 93 df 8a aa 49 29 45 d0 eb c8 62 4d 27 ..e....I)E...bM' 00:20:33.927 000000e0 69 01 32 13 f7 d5 83 87 25 70 3a a7 f9 85 90 f5 i.2.....%p:..... 00:20:33.927 000000f0 42 86 f0 78 2b d9 a3 0d f8 9e e9 ca be ac 41 5a B..x+.........AZ 00:20:33.927 00000100 47 e3 82 fe 5e cf 89 dc 47 da 4a 14 bd c6 a5 02 G...^...G.J..... 00:20:33.927 00000110 70 3b 55 ae a7 31 ef eb c5 2b b9 2d 4f 45 5e 91 p;U..1...+.-OE^. 00:20:33.927 00000120 71 bf df 1d 51 f3 40 fb fe 40 d9 8d bb 68 5d eb q...Q.@..@...h]. 00:20:33.927 00000130 74 41 7b 20 ea 5a 12 91 26 a2 a0 a5 6e a2 6e eb tA{ .Z..&...n.n. 00:20:33.927 00000140 c5 c2 2b 26 e6 33 e1 c2 77 d7 f4 f5 b5 e5 0f 93 ..+&.3..w....... 
00:20:33.927 00000150 06 a0 ed a9 39 3e c0 86 da 1a 6b d3 25 0b 88 51 ....9>....k.%..Q 00:20:33.927 00000160 28 03 14 7d a7 05 ca b6 09 bc f5 6e 92 94 53 06 (..}.......n..S. 00:20:33.927 00000170 be ff f1 1e 20 b3 0b 6c b2 99 50 c2 7a e1 bd 4c .... ..l..P.z..L 00:20:33.927 00000180 8c a1 5b 97 0d 76 26 7e c6 9b 95 78 5c cd 30 62 ..[..v&~...x\.0b 00:20:33.927 00000190 52 ae 9c c0 b2 6d bc d6 fa 88 fd 5a 58 28 99 1b R....m.....ZX(.. 00:20:33.927 000001a0 be cb 54 27 70 00 08 24 54 c4 47 e0 0b b6 8e 8a ..T'p..$T.G..... 00:20:33.927 000001b0 4f 0d a5 0b 2b 46 93 f1 81 13 94 2d 4e b5 03 a3 O...+F.....-N... 00:20:33.927 000001c0 89 ce df 73 1e 91 94 b7 c5 ad 58 b0 de ed fe 83 ...s......X..... 00:20:33.927 000001d0 50 25 5e 12 8d e0 d6 6f 77 e3 3c 8a c1 32 73 f3 P%^....ow.<..2s. 00:20:33.927 000001e0 58 dc d4 a3 cb b4 13 ca 85 82 b9 06 cd 7e f6 0e X............~.. 00:20:33.927 000001f0 85 4c 42 9a 5a eb 2e 5f 17 29 54 56 c5 d1 85 7b .LB.Z.._.)TV...{ 00:20:33.927 00000200 f1 e0 19 38 7a f8 b6 e5 97 f9 c2 14 a9 34 19 0d ...8z........4.. 00:20:33.927 00000210 f0 31 35 65 2f 79 89 d4 3e dc 60 38 1d 0b 17 30 .15e/y..>.`8...0 00:20:33.927 00000220 32 c9 05 d2 64 93 80 33 ea 7e cc c4 d7 72 4f d2 2...d..3.~...rO. 00:20:33.927 00000230 93 aa f3 3c f1 f1 55 e4 dc 02 e6 53 f0 c4 1f 21 ...<..U....S...! 00:20:33.927 00000240 fc ff 2b 56 1b 27 52 16 94 02 19 61 46 1f 1b ba ..+V.'R....aF... 00:20:33.927 00000250 93 a5 07 94 f0 05 7b b4 4f 70 ad 1e 4a 73 11 b5 ......{.Op..Js.. 00:20:33.927 00000260 de ca 81 c3 24 60 c4 a5 b6 88 ae 70 6b ff 39 53 ....$`.....pk.9S 00:20:33.927 00000270 e0 82 85 ab ee 23 55 f9 39 2e 2a 08 14 e3 d2 f3 .....#U.9.*..... 00:20:33.927 00000280 35 d5 8c d8 26 de 26 3f 2f 85 6e d6 8b 09 56 82 5...&.&?/.n...V. 00:20:33.927 00000290 38 0c 00 15 b2 ca 82 10 08 2a 09 8f 3f 01 11 a0 8........*..?... 00:20:33.927 000002a0 fb 77 7c 0c 67 d6 34 b0 98 f6 68 bb 8a c1 36 74 .w|.g.4...h...6t 00:20:33.927 000002b0 73 71 a3 35 fe ea 45 e4 69 f7 8c c1 ca 8d 37 f3 sq.5..E.i.....7. 00:20:33.927 000002c0 aa 5e f4 94 f9 92 bd 33 47 4c f8 98 d5 6b 44 bb .^.....3GL...kD. 00:20:33.927 000002d0 04 be 9d c3 6f 5c 77 ee 64 77 7a c7 0e 32 8b 76 ....o\w.dwz..2.v 00:20:33.927 000002e0 20 63 e4 d2 f3 ee da b9 7a 3f 8c 1f 48 16 37 e3 c......z?..H.7. 00:20:33.927 000002f0 59 a5 a1 81 22 24 30 a1 56 a5 1f be ca a3 e9 3f Y..."$0.V......? 00:20:33.927 dh secret: 00:20:33.927 00000000 6d b4 ea 4a 6d 1e e7 9c f5 f5 a4 c0 5d af b6 04 m..Jm.......]... 00:20:33.927 00000010 52 8b cb 15 8b fe 6c 5c d8 c5 d1 4e bd 5c 78 65 R.....l\...N.\xe 00:20:33.927 00000020 58 89 c2 61 fc b2 03 3e 00 1d 31 2d 79 af 97 d2 X..a...>..1-y... 00:20:33.927 00000030 99 9a e8 d2 00 b6 18 43 84 8e e2 25 25 b8 60 db .......C...%%.`. 00:20:33.927 00000040 b0 81 2d c3 1c 31 ba eb ad 9e 25 43 d8 99 d7 db ..-..1....%C.... 00:20:33.927 00000050 59 59 55 1a c4 f8 7d 8b 62 33 2e 8e 54 a0 ac 63 YYU...}.b3..T..c 00:20:33.927 00000060 4a 54 cb e6 a8 ce 8a a0 da 93 da 79 a2 27 8b 38 JT.........y.'.8 00:20:33.927 00000070 9c 49 cb f4 ec b5 6c 3b d2 ad 1a 68 c7 10 ed 74 .I....l;...h...t 00:20:33.927 00000080 50 ee 4b 50 ca 31 3e 84 c9 46 4f 6a 47 ba b5 d0 P.KP.1>..FOjG... 00:20:33.927 00000090 73 56 b4 c8 7b fe 32 d4 4e 97 6d b7 4b 13 b6 ae sV..{.2.N.m.K... 00:20:33.927 000000a0 1c d5 c1 d4 4f 6a 02 d0 85 3e ac 68 c6 76 f7 67 ....Oj...>.h.v.g 00:20:33.927 000000b0 0d a7 b5 87 d7 3b 29 32 00 12 97 5f 39 cd 4c e6 .....;)2..._9.L. 00:20:33.927 000000c0 4c 36 32 0d b0 ef e6 09 c3 ef d3 eb 2c e7 2c d2 L62.........,.,. 
00:20:33.927 000000d0 34 c6 16 16 1d f2 77 ae a4 8c aa 81 3c 21 b2 b9 4.....w.....1.. 00:20:33.927 00000150 d0 71 28 4d 93 8f 77 31 1b f3 c5 2f 64 2e 85 df .q(M..w1.../d... 00:20:33.927 00000160 e6 32 dd bc cc c3 86 44 30 42 f7 d6 be 07 16 47 .2.....D0B.....G 00:20:33.927 00000170 81 46 88 5f 00 c7 b6 f0 3e 17 13 f5 52 6a 50 73 .F._....>...RjPs 00:20:33.927 00000180 f0 9e 0e 21 76 ec 02 bb 9c 1e 40 13 58 73 39 91 ...!v.....@.Xs9. 00:20:33.927 00000190 68 d9 dd 8d 3d 77 4a 6b 97 eb 51 6b fd c9 76 a9 h...=wJk..Qk..v. 00:20:33.927 000001a0 d2 5d d1 9c 66 fd 4d 2b 35 77 e7 e4 bd 5f 60 2b .]..f.M+5w..._`+ 00:20:33.927 000001b0 45 50 a6 b2 0d 7d e3 97 56 ac 7b ad 1e f6 f2 50 EP...}..V.{....P 00:20:33.927 000001c0 dd 15 6d a7 c4 d9 57 b4 0c 71 d2 6d b8 89 0a 7b ..m...W..q.m...{ 00:20:33.927 000001d0 b4 27 19 6f a5 43 e2 60 3e 0a e8 f2 8a 0d 9d 39 .'.o.C.`>......9 00:20:33.927 000001e0 10 51 67 d8 c9 21 6b a0 a7 b3 96 c4 8a ec 90 7e .Qg..!k........~ 00:20:33.927 000001f0 d6 3f 83 a0 65 49 b6 50 c0 86 45 cd 53 96 c0 a8 .?..eI.P..E.S... 00:20:33.927 00000200 86 cb cc fb e2 52 ce 79 4f 97 82 f3 1c ff 93 31 .....R.yO......1 00:20:33.927 00000210 a8 78 aa 64 28 d5 df ac b2 e7 bc c6 57 e4 2c 64 .x.d(.......W.,d 00:20:33.927 00000220 ea 46 b5 47 2e 2d b1 a3 ae 43 0c ae a0 0d 6a 59 .F.G.-...C....jY 00:20:33.927 00000230 b5 0b 99 23 33 f8 20 07 3e a2 31 c0 ee 14 69 e8 ...#3. .>.1...i. 00:20:33.927 00000240 2b 1c 9e d7 4d aa c1 0e fb 2c 50 d0 04 15 86 8f +...M....,P..... 00:20:33.927 00000250 f8 e7 f8 19 98 59 f6 9c fd b2 e2 24 2c c7 61 fd .....Y.....$,.a. 00:20:33.927 00000260 29 50 eb fb 22 ba ad 45 8c aa 9c 47 e2 e2 61 f9 )P.."..E...G..a. 00:20:33.927 00000270 2d e9 c7 e7 b6 c2 5f bf 10 dd 87 5b ab 37 63 c4 -....._....[.7c. 00:20:33.927 00000280 e1 be 42 81 27 d3 23 97 b2 86 4d 39 ec ed b5 d1 ..B.'.#...M9.... 00:20:33.927 00000290 4c 4d 8d d7 70 32 31 82 c4 e8 73 8d c9 88 3b ad LM..p21...s...;. 00:20:33.927 000002a0 a0 5e dc d3 25 b8 5c 8f b4 ff b7 2e 41 3e ab 78 .^..%.\.....A>.x 00:20:33.927 000002b0 25 8c 5a bd 1e cd 13 3c f8 f7 2c f7 45 da d4 18 %.Z....<..,.E... 00:20:33.927 000002c0 ad 10 6b f2 7c 69 f4 21 75 93 b2 7a 2f f4 20 fc ..k.|i.!u..z/. . 
00:20:33.927 000002d0 a5 8a 9f d0 9b f6 a3 16 a0 0c bb e8 fa 76 a7 55 .............v.U 00:20:33.927 000002e0 7b f3 fc bf 8f b5 17 67 c8 09 53 22 b5 11 21 47 {......g..S"..!G 00:20:33.927 000002f0 ba c9 ff 69 c8 f9 0f d0 9a 1a 9b 23 a6 d4 e1 39 ...i.......#...9 00:20:33.927 [2024-09-27 15:25:20.146323] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=2, dhgroup=4, seq=3428451779, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.927 [2024-09-27 15:25:20.184899] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.927 [2024-09-27 15:25:20.184952] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.927 [2024-09-27 15:25:20.184972] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.927 [2024-09-27 15:25:20.184998] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.928 [2024-09-27 15:25:20.185010] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.928 [2024-09-27 15:25:20.291922] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.928 [2024-09-27 15:25:20.291944] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.928 [2024-09-27 15:25:20.291951] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.928 [2024-09-27 15:25:20.291961] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.928 [2024-09-27 15:25:20.292019] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.928 ctrlr pubkey: 00:20:33.928 00000000 37 dc e4 d5 30 da fb 08 9c c9 37 42 dc ba a6 42 7...0.....7B...B 00:20:33.928 00000010 a3 f6 3d c6 2f 5e 2d a0 09 ef 70 32 0b 26 98 82 ..=./^-...p2.&.. 00:20:33.928 00000020 10 a5 c7 a4 9a a5 5b f0 0e 40 5d 11 63 5c e9 c5 ......[..@].c\.. 00:20:33.928 00000030 f8 dd 70 94 be ec fc ea ed b8 d7 a2 bc 2b 7f a0 ..p..........+.. 00:20:33.928 00000040 82 4c b2 b6 f4 68 46 76 57 7f 3a ad 65 58 38 51 .L...hFvW.:.eX8Q 00:20:33.928 00000050 3e 80 98 d5 ac 49 b3 7f 86 c2 50 10 a6 d7 04 a2 >....I....P..... 00:20:33.928 00000060 0c 88 62 78 f8 8c 35 da c6 80 a3 8b 9d 9f 2f 8b ..bx..5......./. 00:20:33.928 00000070 75 c3 bf f6 1b 9b ad 1b 42 70 d8 89 5b 41 a3 83 u.......Bp..[A.. 00:20:33.928 00000080 63 5c 24 ec 44 c0 dd e5 de 6e 9a ee 2d f4 8d 4d c\$.D....n..-..M 00:20:33.928 00000090 4c 13 2f a4 67 9e 09 e0 a7 c2 ff a1 5e 5b 28 ac L./.g.......^[(. 00:20:33.928 000000a0 77 e3 aa e0 8e 3f 28 a1 01 18 ef 39 e1 91 0f f6 w....?(....9.... 00:20:33.928 000000b0 3d b2 99 4d ea c1 9c 16 72 69 27 07 6e 7a 30 45 =..M....ri'.nz0E 00:20:33.928 000000c0 d6 a4 bc 89 c7 50 c1 8a fb c0 cd cc bd cf f7 db .....P.......... 00:20:33.928 000000d0 8e 3c 41 de 50 c8 6a f2 9c fc 7d 35 46 43 84 c0 .....k..;ee. 
00:20:33.928 000002a0 37 e8 8f fa bc c8 45 ab 9f f4 94 92 dc ba 6f e8 7.....E.......o. 00:20:33.928 000002b0 57 ab e6 f8 d2 fa 56 36 81 cc a7 68 67 b5 e0 98 W.....V6...hg... 00:20:33.928 000002c0 4d 35 32 cf 04 a6 1e 18 06 82 a6 08 b8 56 eb ad M52..........V.. 00:20:33.928 000002d0 d0 b8 f2 f9 e9 59 21 05 85 97 a8 37 98 f7 e5 36 .....Y!....7...6 00:20:33.928 000002e0 ee dd e1 60 e7 ba b3 aa bc 58 2a 98 6b 37 69 c4 ...`.....X*.k7i. 00:20:33.928 000002f0 b1 2e bd c1 01 f6 f4 40 6a fa b9 51 86 52 e8 31 .......@j..Q.R.1 00:20:33.928 host pubkey: 00:20:33.928 00000000 71 b6 0e e0 a2 be 48 1b 90 2c af 41 f1 c2 99 94 q.....H..,.A.... 00:20:33.928 00000010 b7 62 2c e0 b1 b5 3c d2 ad 02 9c 4c fa e9 03 c0 .b,...<....L.... 00:20:33.928 00000020 23 bd cf dd 2b 2e b4 6d 1f 64 62 df 08 74 4d 23 #...+..m.db..tM# 00:20:33.928 00000030 66 e4 79 02 64 95 cb b1 88 b1 b8 e3 0c 1a bb 09 f.y.d........... 00:20:33.928 00000040 d2 48 ff 72 dc 18 9a 2c 69 00 80 f3 ab 53 6a 21 .H.r...,i....Sj! 00:20:33.928 00000050 c8 dc 35 11 ac a4 ca 4a fd 91 e5 15 fc b6 e0 ed ..5....J........ 00:20:33.928 00000060 8f 92 fc 09 39 2b c5 7c 22 f7 4c 63 49 37 db 91 ....9+.|".LcI7.. 00:20:33.928 00000070 3d 23 40 c6 4f 47 a3 80 ac a0 76 e1 5d 44 36 bb =#@.OG....v.]D6. 00:20:33.928 00000080 32 a8 de e7 e5 a0 47 2a 8f 49 08 23 4f 78 f3 8f 2.....G*.I.#Ox.. 00:20:33.928 00000090 f3 3e 10 ad e9 66 03 fe 9f ba 39 60 94 ce 34 3e .>...f....9`..4> 00:20:33.928 000000a0 49 9c e9 56 71 5c e0 85 34 b5 8c 70 2f 62 29 9c I..Vq\..4..p/b). 00:20:33.928 000000b0 df 2f 05 91 9d 61 62 0a 20 6a 51 2a e2 e5 1a 43 ./...ab. jQ*...C 00:20:33.928 000000c0 fb f2 58 0f 1f b4 af fd c3 9c ee 71 6f 31 7d a1 ..X........qo1}. 00:20:33.928 000000d0 d2 74 e1 ba db 9d 37 b7 9d 20 7c 1e f1 64 02 f6 .t....7.. |..d.. 00:20:33.928 000000e0 d1 24 60 e4 94 22 dc 1e af 38 05 a5 c0 0e 1d d4 .$`.."...8...... 00:20:33.928 000000f0 f6 9d 11 f8 76 42 ad eb 55 b5 43 dc d4 8e ce ea ....vB..U.C..... 00:20:33.928 00000100 5a b7 5f fe 9b 6c 76 4a e2 47 a8 81 99 cc 78 0a Z._..lvJ.G....x. 00:20:33.928 00000110 37 af 1b 50 e6 9f 89 62 ea e1 55 70 ad 73 6c 1b 7..P...b..Up.sl. 00:20:33.928 00000120 cf 9f f5 7c bd d6 0f 77 a4 37 7d 84 3f 9d 75 e3 ...|...w.7}.?.u. 00:20:33.928 00000130 ea 09 f4 88 36 c3 1b 69 5a 65 4e fe cb 8c 02 7c ....6..iZeN....| 00:20:33.928 00000140 b9 d5 a8 08 3f c6 8c 16 15 e2 85 f0 e6 1a 9e df ....?........... 00:20:33.928 00000150 56 7f 48 79 02 be fd 6e 02 dd c8 a8 8f c3 fc 7b V.Hy...n.......{ 00:20:33.928 00000160 0e 36 ac 87 e4 7d d6 79 1c 87 da 5d f2 73 07 00 .6...}.y...].s.. 00:20:33.928 00000170 1c 04 ac 1b 8e 89 af ff e1 c4 8c ff 39 8f 31 94 ............9.1. 00:20:33.928 00000180 91 f7 18 9d 3a 69 fc 6b d2 74 9e 1c 04 45 da 63 ....:i.k.t...E.c 00:20:33.928 00000190 df a5 03 14 d3 cb 45 49 92 51 9b bb b2 26 26 a5 ......EI.Q...&&. 00:20:33.928 000001a0 bd 8c 61 9c fe 5d dd cb e4 4d 7c 71 d2 d4 05 03 ..a..]...M|q.... 00:20:33.928 000001b0 c2 9d 56 92 29 3f ec df 88 c3 a5 ca 8b b0 64 68 ..V.)?........dh 00:20:33.928 000001c0 44 e1 91 ca 87 fc de db f1 94 b7 00 98 cf 70 ed D.............p. 00:20:33.928 000001d0 f6 74 9f 76 0e bf e0 07 79 77 80 fa 36 d3 57 8e .t.v....yw..6.W. 00:20:33.928 000001e0 09 ef 63 20 f9 dc 02 8c 91 4f 23 f2 a8 53 0b d5 ..c .....O#..S.. 00:20:33.928 000001f0 e0 1c 2c 1a 62 b5 32 e4 81 53 ad 7d c0 15 1b 84 ..,.b.2..S.}.... 00:20:33.928 00000200 57 a5 b3 8a c7 c4 b7 21 23 ba 0a d1 87 f8 e6 cd W......!#....... 00:20:33.928 00000210 2e 1a 53 51 c8 37 ef 6a 20 b6 e4 bf 8b 6f 34 a8 ..SQ.7.j ....o4. 
00:20:33.928 00000220 6e 0a 20 d3 e3 24 70 49 e8 40 52 e7 6d 85 cb 05 n. ..$pI.@R.m... 00:20:33.928 00000230 ed dd 7e d1 d8 ce be cb f7 cb a5 fa fb a6 a3 79 ..~............y 00:20:33.928 00000240 e7 09 a1 8f 38 ca 89 e8 99 f9 84 84 36 57 fc 0a ....8.......6W.. 00:20:33.928 00000250 b9 63 4e 2e 2a 49 1b 88 33 e0 58 9d 69 fc 69 ca .cN.*I..3.X.i.i. 00:20:33.928 00000260 61 ff 6f 8a 81 51 e2 1d 50 76 20 ad cd ca e6 d6 a.o..Q..Pv ..... 00:20:33.928 00000270 3f 15 82 b8 47 24 2f 43 76 56 66 00 cd 35 fa 9d ?...G$/CvVf..5.. 00:20:33.928 00000280 92 06 95 19 dc 1e ef 7a ff 9f bb a1 ab 20 cd 8a .......z..... .. 00:20:33.928 00000290 a2 94 91 a1 b1 55 91 04 90 ad 84 87 83 d3 e0 6c .....U.........l 00:20:33.928 000002a0 f9 90 65 d0 6e 95 6f 96 e5 7a 80 21 12 83 21 28 ..e.n.o..z.!..!( 00:20:33.928 000002b0 d0 20 61 d8 69 9b 5b 83 fe b5 52 b2 2d b4 e7 82 . a.i.[...R.-... 00:20:33.928 000002c0 a1 df e6 f8 7a 91 28 c1 10 d1 c3 e5 f8 23 dd bc ....z.(......#.. 00:20:33.928 000002d0 7a 6d e1 cc 4f 6f fe 24 da 42 3d 8a 62 10 64 b8 zm..Oo.$.B=.b.d. 00:20:33.928 000002e0 79 4a bf 66 09 a3 54 50 3a e9 c5 03 1a f0 16 47 yJ.f..TP:......G 00:20:33.928 000002f0 07 cc 61 11 bb fe ed 9f 7a 99 cb 0c 2a 06 56 c7 ..a.....z...*.V. 00:20:33.928 dh secret: 00:20:33.928 00000000 3e d4 8d ff b6 25 cc c2 7e 77 b3 d2 62 f3 1e cd >....%..~w..b... 00:20:33.928 00000010 04 9a 57 70 a3 58 a3 ab c1 0d 0c 81 fd 86 0a 71 ..Wp.X.........q 00:20:33.928 00000020 96 0a df 75 f4 e0 69 e2 1c 80 b2 11 21 82 51 97 ...u..i.....!.Q. 00:20:33.928 00000030 13 8a 64 93 49 fc 8a d4 13 55 35 0c bf 6a 32 75 ..d.I....U5..j2u 00:20:33.928 00000040 ed 37 7b 28 79 d3 e3 17 68 db 61 c0 f4 55 31 57 .7{(y...h.a..U1W 00:20:33.928 00000050 6e 43 2a bf ad d4 b6 04 a6 43 09 99 66 dc d2 ce nC*......C..f... 00:20:33.928 00000060 56 a0 d4 ba 5a 0c 98 8a ff 67 16 68 f3 5f f1 c4 V...Z....g.h._.. 00:20:33.928 00000070 63 7e 80 58 0f 27 aa 97 79 a3 25 07 78 12 7f 28 c~.X.'..y.%.x..( 00:20:33.928 00000080 b3 bb db 31 de ec 3f bc 12 3e d9 a3 b6 d9 24 5b ...1..?..>....$[ 00:20:33.928 00000090 41 af ab 3f 8f 09 9f 94 56 44 ef 61 96 af 5f 19 A..?....VD.a.._. 00:20:33.928 000000a0 bb 15 35 64 43 e6 11 91 4b d4 a9 81 74 cc c2 3d ..5dC...K...t..= 00:20:33.928 000000b0 e7 29 69 50 29 dd 32 ba 7c b5 1f 52 18 0f d7 b1 .)iP).2.|..R.... 00:20:33.928 000000c0 b1 f5 7c c1 23 9b 26 8b cc af ab 53 a4 31 41 70 ..|.#.&....S.1Ap 00:20:33.928 000000d0 e3 a7 f4 f0 eb e7 a8 38 75 b8 09 93 55 71 98 74 .......8u...Uq.t 00:20:33.928 000000e0 91 a2 60 76 56 96 d0 07 d0 21 86 bb cc dc 2f c0 ..`vV....!..../. 00:20:33.928 000000f0 ee 11 a1 54 95 50 8b 32 54 8f f1 2a 7c 37 dd 77 ...T.P.2T..*|7.w 00:20:33.928 00000100 bf f7 9c f8 a1 cd ab 5c 9a e0 c5 f8 16 b9 56 b7 .......\......V. 00:20:33.928 00000110 d2 ae 4f 76 5f d9 94 78 47 80 9b 1b 82 ea c8 7b ..Ov_..xG......{ 00:20:33.928 00000120 f2 62 0e df 1f 94 fa 6c e0 05 68 3d ed 91 04 5d .b.....l..h=...] 00:20:33.928 00000130 69 47 4d e5 74 37 5f 59 f4 a8 65 08 44 4f 2c 3b iGM.t7_Y..e.DO,; 00:20:33.928 00000140 28 47 96 46 9a b6 ce 15 91 0b 28 20 e4 58 fe fb (G.F......( .X.. 00:20:33.928 00000150 32 c3 33 09 7d 22 8f d4 22 d9 ab 9e 10 1a aa 40 2.3.}".."......@ 00:20:33.928 00000160 bc 2b fe 67 18 22 5b 1c 7f 76 51 b8 ab 68 60 72 .+.g."[..vQ..h`r 00:20:33.928 00000170 74 d1 ec 2d 69 01 1b 2e 00 63 6b 45 5b 48 47 a7 t..-i....ckE[HG. 00:20:33.928 00000180 31 fa 28 ca 7f 43 f3 f6 c6 ea e6 33 68 e3 8d 7d 1.(..C.....3h..} 00:20:33.929 00000190 8a b8 b4 eb 6a 9a bd 78 65 ad ba 45 21 fc 4a 0f ....j..xe..E!.J. 
00:20:33.929 000001a0 c6 1d 82 c4 ff 96 86 ef 6c 40 88 3b 88 7b 17 27 ........l@.;.{.' 00:20:33.929 000001b0 1d c9 02 4a 77 04 8a 2d b7 b8 75 e1 39 a3 62 f1 ...Jw..-..u.9.b. 00:20:33.929 000001c0 22 62 cf 13 52 81 51 ec c2 fc 4b f3 57 f9 60 5f "b..R.Q...K.W.`_ 00:20:33.929 000001d0 0b d8 e3 9b d3 e7 2a 85 a2 d6 38 09 26 da 00 cd ......*...8.&... 00:20:33.929 000001e0 7e 77 17 50 e4 0c e6 52 b4 c5 99 54 74 a5 96 68 ~w.P...R...Tt..h 00:20:33.929 000001f0 ae 16 03 b5 fb 37 2d 6a b5 0d 2a 33 b9 52 4c db .....7-j..*3.RL. 00:20:33.929 00000200 68 76 a3 9f a0 70 1d e6 08 41 d1 c2 40 54 86 d1 hv...p...A..@T.. 00:20:33.929 00000210 74 3b 35 db 1e d2 34 42 ba 42 2f 0c e1 6d 33 bc t;5...4B.B/..m3. 00:20:33.929 00000220 4e 0b bb 67 69 af 93 e0 5c a0 5a 1c 74 8e 14 b6 N..gi...\.Z.t... 00:20:33.929 00000230 88 83 10 d5 83 01 61 fb 67 ac c4 8c 07 fb c3 3e ......a.g......> 00:20:33.929 00000240 9e af 0d 9e 11 f3 92 50 52 a2 f7 21 da 00 4d d3 .......PR..!..M. 00:20:33.929 00000250 41 f1 48 a0 36 61 22 bd 08 d8 d9 0b df 9c d2 f5 A.H.6a"......... 00:20:33.929 00000260 f1 e5 14 46 ab b2 0b 04 65 3c 12 3c 4a bf 0f 3a ...F....e<.. 00:20:33.929 000000a0 e1 c3 22 a8 63 e0 74 62 23 3f 35 fc b1 ab 4a 04 ..".c.tb#?5...J. 00:20:33.929 000000b0 d3 19 8b 27 b8 ac d2 93 23 c7 8a 44 8d 59 2c 01 ...'....#..D.Y,. 00:20:33.929 000000c0 75 51 93 13 e3 ed 4d 2f 3b 2c 34 3a 36 ec 21 95 uQ....M/;,4:6.!. 00:20:33.929 000000d0 c4 a5 ae 07 0a a6 b1 e5 fb 85 1d bf 99 04 f7 0c ................ 00:20:33.929 000000e0 44 6b fe de b2 69 63 cc 42 d2 9c d2 d9 c7 3c 6b Dk...ic.B......G..:K 00:20:33.929 00000110 c1 9a 11 e4 26 6e 74 71 75 f9 09 09 06 59 72 06 ....&ntqu....Yr. 00:20:33.929 00000120 2b f1 c3 23 1e e2 01 ff 3d 48 28 e8 f5 12 a4 7f +..#....=H(..... 00:20:33.929 00000130 09 75 b3 4f 06 a9 f4 cb 92 2d 60 ad 2b 5a db a3 .u.O.....-`.+Z.. 00:20:33.929 00000140 ad ef 1b 77 6a f5 90 d3 10 bd d8 42 68 98 b3 d7 ...wj......Bh... 00:20:33.929 00000150 65 68 e3 f9 05 ba d3 72 0a 41 89 c4 7a cb 17 80 eh.....r.A..z... 00:20:33.929 00000160 d6 4c 99 80 50 f6 08 11 9d f1 fa f8 26 9f 21 87 .L..P.......&.!. 00:20:33.929 00000170 54 74 e1 ae 46 28 bc 99 d6 02 aa 63 e2 8c 9b 00 Tt..F(.....c.... 00:20:33.929 00000180 19 a3 43 8f a5 df ee f1 cb 22 c1 f6 83 1b 34 55 ..C......"....4U 00:20:33.929 00000190 52 cc aa 71 eb 72 d1 67 b1 de a3 47 a2 56 a9 15 R..q.r.g...G.V.. 00:20:33.930 000001a0 be cf d2 d9 a0 f5 7c 7e e5 a3 8a ef 07 16 57 20 ......|~......W 00:20:33.930 000001b0 4d e8 71 9e 96 ae 6e ce 68 ef f6 74 19 e8 1d 58 M.q...n.h..t...X 00:20:33.930 000001c0 90 3c e6 c1 8b ff 76 5d 05 fb ee 97 1c 2d 25 57 .<....v].....-%W 00:20:33.930 000001d0 97 2e 92 5b ad 56 49 dd ac 93 c7 81 3a 9e 41 c9 ...[.VI.....:.A. 00:20:33.930 000001e0 2e 60 bf 6a a2 2e fc d7 ed 1d ab 3a 08 78 a5 7f .`.j.......:.x.. 00:20:33.930 000001f0 21 6d a2 ed fc 3c 9d 9f 11 55 1e 9c 49 4a 6b 12 !m...<...U..IJk. 00:20:33.930 00000200 d5 f1 42 41 79 36 af f2 43 8f 91 3b 36 af 4c 33 ..BAy6..C..;6.L3 00:20:33.930 00000210 e3 1b bc 20 35 24 42 6d 7d 59 c1 7b 5a 11 2e 0b ... 5$Bm}Y.{Z... 00:20:33.930 00000220 82 97 52 af 74 e0 bc 60 4c 57 d3 94 de 4f 96 08 ..R.t..`LW...O.. 00:20:33.930 00000230 0e 9f 4b 83 3f e2 23 ef 1b dd f9 47 2a 03 53 65 ..K.?.#....G*.Se 00:20:33.930 00000240 38 94 dc 55 a4 7d 29 fa 94 e7 52 dd 6c 4f 1a 7d 8..U.})...R.lO.} 00:20:33.930 00000250 72 c1 2e df a2 a8 70 2d 7c b2 89 2e 1c bd f6 e1 r.....p-|....... 00:20:33.930 00000260 1e 9a ad 3c d3 32 60 bc 36 52 5a 4d ea ec 80 bb ...<.2`.6RZM.... 
00:20:33.930 00000270 96 d9 60 18 07 44 fa df 48 9a 05 e9 2f ed 97 dd ..`..D..H.../... 00:20:33.930 00000280 51 44 24 06 44 ff 89 ec ef 7c 49 7c 9b ec 8b a6 QD$.D....|I|.... 00:20:33.930 00000290 3f 90 c3 22 ae 64 59 16 19 dd e6 78 c9 4c 83 4b ?..".dY....x.L.K 00:20:33.930 000002a0 ba 76 84 69 26 a0 de d1 80 e9 38 47 a5 7b 1a 76 .v.i&.....8G.{.v 00:20:33.930 000002b0 d8 30 f1 20 cc 0e 64 3a 47 fc e0 09 9a da d1 91 .0. ..d:G....... 00:20:33.930 000002c0 d7 0b 31 15 76 85 ab f4 28 7e c4 79 73 2b a6 4d ..1.v...(~.ys+.M 00:20:33.930 000002d0 ce 1d 1b 4a 79 bc bb bb f6 47 5a 12 a9 55 ec d3 ...Jy....GZ..U.. 00:20:33.930 000002e0 ff 3d ad c4 3a a6 af b0 e8 0e 4d 23 a4 83 15 72 .=..:.....M#...r 00:20:33.930 000002f0 2f ce 27 8c 88 7c 11 ec 05 3f a0 28 f3 31 a9 3e /.'..|...?.(.1.> 00:20:33.930 dh secret: 00:20:33.930 00000000 73 33 b7 0d 68 f8 f7 4b 6d e1 6f ed 44 4b d6 82 s3..h..Km.o.DK.. 00:20:33.930 00000010 3d 29 d5 3f a4 0b 93 59 c7 45 2f 77 d4 0d 78 5f =).?...Y.E/w..x_ 00:20:33.930 00000020 d9 9c 8a 2d 44 7c c5 68 1d 6a 37 15 2c b8 e6 d7 ...-D|.h.j7.,... 00:20:33.930 00000030 9c 7f af 13 4f 5c e1 1b 57 2a df 51 8a 77 92 3b ....O\..W*.Q.w.; 00:20:33.930 00000040 46 3f d5 63 9c 6f 94 2f 0e ce a2 61 d7 d2 bc a7 F?.c.o./...a.... 00:20:33.930 00000050 c8 24 1a 45 a4 e9 56 d9 62 66 7a 31 92 c5 64 51 .$.E..V.bfz1..dQ 00:20:33.930 00000060 fc 66 a2 8b 32 2d 58 cc 23 bc 29 cd 44 00 71 3d .f..2-X.#.).D.q= 00:20:33.930 00000070 0d ba 0b ed 72 27 9b 30 11 83 86 38 fe 86 32 5f ....r'.0...8..2_ 00:20:33.930 00000080 4a 82 2d 81 3a 61 09 65 73 c6 8e 15 58 50 93 49 J.-.:a.es...XP.I 00:20:33.930 00000090 6c aa be 99 cc 26 74 cf 39 e4 55 0c af 4f 6b 70 l....&t.9.U..Okp 00:20:33.930 000000a0 cc 59 70 70 8f 9a 6a 9e 29 ec 4c f9 36 b8 70 bd .Ypp..j.).L.6.p. 00:20:33.930 000000b0 0b a7 ec 17 29 28 d3 be c0 16 eb 4a 77 3d fc 1c ....)(.....Jw=.. 00:20:33.930 000000c0 79 66 40 be 21 5a 88 26 ee aa da 3e 33 7a fa e5 yf@.!Z.&...>3z.. 00:20:33.930 000000d0 f1 06 e9 12 b7 a6 3e 51 6b 65 44 38 26 71 0f 3e ......>QkeD8&q.> 00:20:33.930 000000e0 56 53 00 b0 6e b5 bb 5f f3 86 f7 28 46 db 43 65 VS..n.._...(F.Ce 00:20:33.930 000000f0 4e a4 7b 19 85 ed 2b 4e 20 4e f3 8e 07 3c 83 0a N.{...+N N...<.. 00:20:33.930 00000100 3a 33 98 c8 f7 a6 2a ee 77 e1 e3 0f c8 92 21 79 :3....*.w.....!y 00:20:33.930 00000110 4d 04 c9 e4 76 2c 62 cb 5a 6d e9 30 30 cf a1 b0 M...v,b.Zm.00... 00:20:33.930 00000120 d2 7f 44 2d af 46 7d 23 e8 a2 28 72 19 1b 98 a6 ..D-.F}#..(r.... 00:20:33.930 00000130 2d 84 a1 b8 aa 3c 6b 14 e9 3c 6c a1 91 94 b6 43 -....L. 00:20:33.930 000001e0 e3 c0 98 8b 8a a4 6f d3 10 ba 3a 52 d8 5a d1 ee ......o...:R.Z.. 00:20:33.930 000001f0 9d a6 c1 64 79 72 10 e7 3e 3c 6b ca de 30 fc 45 ...dyr..>Kf.D... 00:20:33.930 00000290 e4 5a 4e b7 a9 90 02 8a 0e 6f 94 a7 d0 75 44 f7 .ZN......o...uD. 00:20:33.930 000002a0 cd ab 73 1a 3a 55 79 a2 ea 51 68 db c0 d8 99 df ..s.:Uy..Qh..... 00:20:33.930 000002b0 01 db e9 d8 05 5f 99 f9 df 15 03 6e 30 fa cf c0 ....._.....n0... 00:20:33.930 000002c0 56 3e 63 8f 0b 11 2d 03 7a c5 a7 26 86 8f a5 17 V>c...-.z..&.... 00:20:33.930 000002d0 f8 d8 4e 4d 75 3b f0 0f 9e 02 f5 a1 b8 89 99 e9 ..NMu;.......... 00:20:33.930 000002e0 8f 5f df 38 9b fe cd 9e 6c 90 7d f4 85 d6 01 c6 ._.8....l.}..... 
00:20:33.930 000002f0 e5 d8 a3 c2 fb 48 ff 3f 27 ef 3d bc 87 87 fc 4b .....H.?'.=....K 00:20:33.930 [2024-09-27 15:25:20.632659] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=2, dhgroup=4, seq=3428451781, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.930 [2024-09-27 15:25:20.668210] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.930 [2024-09-27 15:25:20.668250] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.930 [2024-09-27 15:25:20.668266] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.930 [2024-09-27 15:25:20.668286] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.930 [2024-09-27 15:25:20.668301] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.930 [2024-09-27 15:25:20.773212] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.930 [2024-09-27 15:25:20.773229] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.930 [2024-09-27 15:25:20.773237] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.930 [2024-09-27 15:25:20.773246] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.930 [2024-09-27 15:25:20.773300] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.930 ctrlr pubkey: 00:20:33.930 00000000 35 53 a1 94 55 a2 64 f6 c3 88 ec ca cd 74 ea a0 5S..U.d......t.. 00:20:33.930 00000010 26 2b 6b fe 07 5a f8 4c 17 e7 9e 32 49 e3 75 fb &+k..Z.L...2I.u. 00:20:33.930 00000020 e2 78 6e 84 77 ac eb b9 37 1a 9f b8 b7 bb b3 2d .xn.w...7......- 00:20:33.930 00000030 6d c1 ca d7 8e 31 7a 96 1a 58 43 38 83 06 63 85 m....1z..XC8..c. 00:20:33.930 00000040 c0 77 d5 53 a4 fc 0c 52 b9 a4 19 0b 59 76 e3 1f .w.S...R....Yv.. 00:20:33.930 00000050 0f 8e 4a bb f2 58 99 e1 b1 ae 79 e3 9f 91 51 c6 ..J..X....y...Q. 00:20:33.930 00000060 79 b6 c0 54 1b 35 ee 75 f6 d3 3a 72 e5 d5 d9 40 y..T.5.u..:r...@ 00:20:33.930 00000070 6b 39 cc 4a e6 cb ac 45 b3 33 35 1a 1f f2 d6 1e k9.J...E.35..... 00:20:33.930 00000080 f9 3a 4e 6f 90 2f 8d 40 f5 0e 6f d8 69 8f 5a c7 .:No./.@..o.i.Z. 00:20:33.930 00000090 72 5d fd 87 86 86 1c b9 36 75 fa 50 e8 bc 3e 0d r]......6u.P..>. 00:20:33.930 000000a0 e1 c3 22 a8 63 e0 74 62 23 3f 35 fc b1 ab 4a 04 ..".c.tb#?5...J. 00:20:33.930 000000b0 d3 19 8b 27 b8 ac d2 93 23 c7 8a 44 8d 59 2c 01 ...'....#..D.Y,. 00:20:33.930 000000c0 75 51 93 13 e3 ed 4d 2f 3b 2c 34 3a 36 ec 21 95 uQ....M/;,4:6.!. 00:20:33.930 000000d0 c4 a5 ae 07 0a a6 b1 e5 fb 85 1d bf 99 04 f7 0c ................ 
00:20:33.930 000000e0 44 6b fe de b2 69 63 cc 42 d2 9c d2 d9 c7 3c 6b Dk...ic.B......G..\!.ah1N~7 00:20:33.931 00000010 af 82 bd d1 d1 16 ac 2d bc b1 e9 18 fa 24 5b 5b .......-.....$[[ 00:20:33.931 00000020 2a 9f 38 bb b1 6c e1 7e 8d 5b d2 b3 c9 47 ff 56 *.8..l.~.[...G.V 00:20:33.931 00000030 75 b8 3c 6e 18 20 8d be bb dc e1 ef 8a a5 e8 fc u.`(.RL0..p...^.. 00:20:33.931 00000110 51 44 a5 25 26 71 4a 29 dc 75 4b 25 dc 00 e5 d3 QD.%&qJ).uK%.... 00:20:33.931 00000120 bc 7e e5 16 e4 c3 85 cf 59 90 07 da db 0b 45 94 .~......Y.....E. 00:20:33.931 00000130 19 e2 32 41 1c 06 a0 26 ac 8e 27 2f e1 90 4a cb ..2A...&..'/..J. 00:20:33.931 00000140 19 12 bd a7 71 70 74 8d 75 4b b1 c6 0b c3 4e 1e ....qpt.uK....N. 00:20:33.931 00000150 c6 1b c3 e3 b9 ea 65 ee 3c 1f 7a 2f 1f 30 01 0d ......e.<.z/.0.. 00:20:33.931 00000160 37 f5 61 13 fe 0c df 42 13 27 38 01 14 c6 8a ce 7.a....B.'8..... 00:20:33.931 00000170 df bf aa a2 ad 5d 33 a2 9c 97 b8 2b 18 43 c9 31 .....]3....+.C.1 00:20:33.931 00000180 41 09 c8 05 c2 20 83 cf e2 74 6a af 53 71 58 44 A.... ...tj.SqXD 00:20:33.931 00000190 2e c4 b3 5f 00 21 f8 77 db e9 2e d2 49 d0 e7 e8 ..._.!.w....I... 00:20:33.931 000001a0 2c 38 84 8e 2d 90 ee 07 5a 9c 42 c3 4e 78 34 e5 ,8..-...Z.B.Nx4. 00:20:33.931 000001b0 c7 8c 6c e1 de 43 7d 1a f2 f7 75 5f b4 c0 05 9c ..l..C}...u_.... 00:20:33.931 000001c0 53 70 cf ee 88 1d 0d f6 cd 18 e0 05 16 14 db c0 Sp.............. 00:20:33.931 000001d0 b5 8b ef ff ae 4b 2a 2b be 79 66 36 5e f6 f5 3d .....K*+.yf6^..= 00:20:33.931 000001e0 6a 7e fa 1e 6a e3 b4 89 b1 3c b9 0b 60 57 4d 8a j~..j....<..`WM. 00:20:33.931 000001f0 bf 59 0c e3 b6 52 08 72 5f 2e fc 37 6a 14 bf 14 .Y...R.r_..7j... 00:20:33.931 00000200 37 fb 87 0f 70 89 b7 f6 fa c9 06 8f a2 fd 22 c8 7...p.........". 00:20:33.931 00000210 ec 73 a1 6c ee 7e 30 83 ff 17 d8 84 75 21 35 f9 .s.l.~0.....u!5. 00:20:33.931 00000220 74 4d 35 bc 13 2b f3 e6 b2 1b f8 a9 66 20 49 ee tM5..+......f I. 00:20:33.931 00000230 35 8d 32 70 eb be 36 74 c9 f6 1d ae f6 59 3a a4 5.2p..6t.....Y:. 00:20:33.931 00000240 07 17 a9 73 a8 c7 dc 54 42 5a e7 47 f6 52 46 19 ...s...TBZ.G.RF. 00:20:33.931 00000250 e4 5a ae 12 d9 e1 f2 39 9c 00 ee d6 87 14 a0 5a .Z.....9.......Z 00:20:33.931 00000260 16 24 e1 74 5c fc 03 5c c1 71 17 46 67 6a f1 2a .$.t\..\.q.Fgj.* 00:20:33.931 00000270 b0 ab e1 8d 65 9e 94 e4 b0 88 d7 79 c9 8f 6a 68 ....e......y..jh 00:20:33.931 00000280 83 0f 02 60 9a bf c9 78 36 99 50 df 7a 19 a3 b3 ...`...x6.P.z... 00:20:33.931 00000290 cb 9c 42 fc 22 30 5e b2 90 ed a8 ef c6 a4 57 03 ..B."0^.......W. 00:20:33.931 000002a0 be 1e b0 94 88 2e 29 3b 0f b9 88 ba a3 3b ce 70 ......);.....;.p 00:20:33.931 000002b0 23 ba a4 86 64 64 dc 17 6a 82 99 0b 57 60 0e dc #...dd..j...W`.. 00:20:33.931 000002c0 31 30 34 5d c2 b0 52 0c 79 a4 b8 3e ca ca 17 7a 104]..R.y..>...z 00:20:33.931 000002d0 90 8d b7 00 92 a3 e4 cb 6b a5 8e 2f 1b 34 99 a9 ........k../.4.. 00:20:33.931 000002e0 d6 e7 62 c6 bd e1 b5 a7 96 f0 3c b9 cc 58 95 5f ..b.......<..X._ 00:20:33.931 000002f0 7b 12 36 2e 00 ba be 08 7d 3d 4f e0 d2 53 d0 3d {.6.....}=O..S.= 00:20:33.931 dh secret: 00:20:33.931 00000000 bb 4c 18 79 a3 21 51 e4 75 06 91 83 ac 7a 64 4b .L.y.!Q.u....zdK 00:20:33.931 00000010 1a db 29 60 63 f0 61 4b c8 3d f9 08 da d0 60 e0 ..)`c.aK.=....`. 00:20:33.931 00000020 b5 e7 c1 a9 aa b2 f1 3a cf 58 01 02 f7 6d 07 1d .......:.X...m.. 00:20:33.931 00000030 93 04 c5 37 b0 2f 71 18 28 af 72 27 ef 58 23 33 ...7./q.(.r'.X#3 00:20:33.931 00000040 5e 21 c0 0e 5b a7 c1 56 85 c5 0c 11 72 9e 5c d9 ^!..[..V....r.\. 
00:20:33.931 00000050 4e 7b 4a b3 f1 b1 a1 2e 58 1c 4c a8 05 e7 cd 35 N{J.....X.L....5 00:20:33.931 00000060 ad ba 17 9d 5b b0 92 4e 33 ee a2 29 e6 c9 b5 aa ....[..N3..).... 00:20:33.931 00000070 f3 29 52 b5 e5 76 e7 5f 67 77 49 3e a0 45 a0 68 .)R..v._gwI>.E.h 00:20:33.931 00000080 e9 71 c0 98 22 ce 0a 1e 60 f6 ab 42 62 ac a6 be .q.."...`..Bb... 00:20:33.931 00000090 e7 ac ce 3d cc a2 1a 38 ba fb 86 ad e2 91 2f d5 ...=...8....../. 00:20:33.931 000000a0 04 5d 95 9c 67 8a 7f 46 f9 03 49 88 fb 95 d5 ad .]..g..F..I..... 00:20:33.931 000000b0 47 b8 a6 b2 f6 8b ed 3b 4e 2e 70 fc 03 57 34 4e G......;N.p..W4N 00:20:33.931 000000c0 4c 5d 86 30 be fc d9 10 1b 2b e0 50 7e f9 1d c7 L].0.....+.P~... 00:20:33.931 000000d0 93 38 0c 77 73 b9 ea 32 8e b5 e5 c4 1b fb f7 d3 .8.ws..2........ 00:20:33.931 000000e0 c4 94 17 40 56 f2 c8 be ac 62 ed 5f 3a 03 30 07 ...@V....b._:.0. 00:20:33.931 000000f0 32 d9 90 b5 55 86 88 59 78 34 6c 64 fc a2 de 96 2...U..Yx4ld.... 00:20:33.931 00000100 b1 40 87 44 fe e6 94 22 7e 92 e7 64 5d 47 01 d1 .@.D..."~..d]G.. 00:20:33.931 00000110 4f 16 8e 5f 45 b0 83 f7 2a a8 2a 60 87 32 e0 45 O.._E...*.*`.2.E 00:20:33.931 00000120 59 c5 c2 5f a0 2a 51 1c ab c0 d3 fa 53 1d 7a 0c Y.._.*Q.....S.z. 00:20:33.931 00000130 50 54 d7 06 a9 b7 07 d9 0a 12 dd e8 ef 54 b8 5d PT...........T.] 00:20:33.931 00000140 a9 f0 d8 30 b1 2e 8b e2 aa 75 3e b1 91 49 51 21 ...0.....u>..IQ! 00:20:33.931 00000150 69 7e 09 50 14 2a d3 7d a9 e8 21 72 dc 6b 5c f2 i~.P.*.}..!r.k\. 00:20:33.931 00000160 85 9b 94 93 fa f3 0a 78 ad ad a9 62 02 13 f9 cb .......x...b.... 00:20:33.931 00000170 7e a0 2d ec 0f 62 57 34 e6 53 7e 70 8b f9 15 17 ~.-..bW4.S~p.... 00:20:33.931 00000180 f2 46 86 ff be 84 fb a8 e1 a3 c4 b1 84 f6 59 08 .F............Y. 00:20:33.931 00000190 e3 32 20 00 a5 29 39 e9 8c ef 3e 84 21 a0 d9 df .2 ..)9...>.!... 00:20:33.931 000001a0 dc 33 9d b6 b3 4a b8 37 b8 73 d8 e9 28 db 9a 48 .3...J.7.s..(..H 00:20:33.931 000001b0 ed 25 9c 65 3e 61 68 70 ce da 46 65 e2 30 00 79 .%.e>ahp..Fe.0.y 00:20:33.931 000001c0 d5 4d 12 32 fb aa 15 dd c4 e7 2e a7 42 bb 17 29 .M.2........B..) 00:20:33.931 000001d0 51 78 a5 b1 21 22 4d 0a fe ba 27 4d 7e 6d a3 81 Qx..!"M...'M~m.. 00:20:33.931 000001e0 15 f5 82 8a be b6 ce a9 c0 e8 04 61 9b c4 65 9a ...........a..e. 00:20:33.931 000001f0 84 f8 22 67 c6 56 55 00 4f 7c e8 89 1b 1d 4c f1 .."g.VU.O|....L. 00:20:33.931 00000200 76 36 16 3c 79 57 eb b9 0e e1 10 65 6c e6 eb c6 v6..F].. 00:20:33.932 00000170 e2 c6 cd 00 6d 1a 5c 36 9a 33 67 d6 76 1f c6 15 ....m.\6.3g.v... 00:20:33.932 00000180 f2 e2 57 50 b7 04 a1 14 9a a3 f1 1a 53 8e 5c 36 ..WP........S.\6 00:20:33.932 00000190 b0 d7 1e 54 3d fc f2 3f 6c e7 65 b2 2e 99 d9 28 ...T=..?l.e....( 00:20:33.932 000001a0 e8 0a be 47 c9 6b e2 b7 74 1d 57 43 8a ee 7c 3f ...G.k..t.WC..|? 00:20:33.932 000001b0 df d0 20 42 b8 9c 14 5a 4c f1 7b 3f 94 94 b0 f4 .. B...ZL.{?.... 00:20:33.932 000001c0 77 75 39 49 40 4b 36 c0 9c 8d f5 16 fe 05 36 b6 wu9I@K6.......6. 00:20:33.932 000001d0 35 53 7d 10 37 99 0b 5e 65 a5 6c 4c 0d e2 be 5a 5S}.7..^e.lL...Z 00:20:33.932 000001e0 22 71 b0 e1 6a 69 bf 2f c7 37 2d 6f 3e 34 9e 92 "q..ji./.7-o>4.. 00:20:33.932 000001f0 c2 42 61 d4 a5 2c 27 ab 2c 54 4f fe 69 d7 30 63 .Ba..,'.,TO.i.0c 00:20:33.932 00000200 96 0e 40 c2 f7 c4 20 19 65 59 fe 46 32 c4 3e 36 ..@... .eY.F2.>6 00:20:33.932 00000210 41 c8 0b 61 d3 48 92 f1 19 26 a8 e2 49 da ca c8 A..a.H...&..I... 
00:20:33.932 00000220 06 93 8f 5e 8f 3f 84 1b 3a aa e9 73 1c 55 19 31 ...^.?..:..s.U.1 00:20:33.932 00000230 e3 0b 1b 32 68 20 d3 b8 b4 7d 60 e4 0f 47 a7 82 ...2h ...}`..G.. 00:20:33.932 00000240 53 1d 04 34 8d 07 f5 27 26 79 05 0b 68 8c bd 98 S..4...'&y..h... 00:20:33.932 00000250 03 3d c1 27 f0 2c ef cb 0c e2 5d 5c 59 5c 84 2b .=.'.,....]\Y\.+ 00:20:33.932 00000260 ab ec 6b 05 e2 df 58 18 53 39 41 f9 a1 86 1e e1 ..k...X.S9A..... 00:20:33.932 00000270 29 48 a6 33 93 09 81 7a 9b 51 e6 8b 54 73 49 b7 )H.3...z.Q..TsI. 00:20:33.932 00000280 dc 3b 6a b0 fc bc 54 8a 30 a2 c7 00 c5 01 c0 d7 .;j...T.0....... 00:20:33.932 00000290 cf 19 28 f4 78 89 2e 3b 0c 8e 53 67 cb 0b b8 c9 ..(.x..;..Sg.... 00:20:33.932 000002a0 c2 84 75 8e f7 f1 a0 9e 43 fb 12 aa db 37 4d ba ..u.....C....7M. 00:20:33.932 000002b0 73 7c 50 2b fa 3a 6c 64 30 c9 99 8d 66 b3 87 42 s|P+.:ld0...f..B 00:20:33.932 000002c0 c1 d2 cd e5 a8 bd 47 d6 7e 6b 5a 6e 6b b6 cb b3 ......G.~kZnk... 00:20:33.932 000002d0 0b 70 e0 5c 60 ab 8b 0f e8 a6 cf b4 35 4e d8 96 .p.\`.......5N.. 00:20:33.932 000002e0 49 74 72 41 e8 5b a4 2d 79 ce 18 a9 ec a5 e5 fd ItrA.[.-y....... 00:20:33.932 000002f0 e5 6c 5f 9f 66 99 4a 7c 71 0f 9e 59 e7 7f 06 0e .l_.f.J|q..Y.... 00:20:33.932 host pubkey: 00:20:33.932 00000000 6b de 62 ed b0 08 9a 08 cc 02 47 79 6b 35 9f 26 k.b.......Gyk5.& 00:20:33.932 00000010 8b a9 52 c1 3e d6 d9 c2 9d 0e 11 f3 4f 66 b7 18 ..R.>.......Of.. 00:20:33.932 00000020 42 90 38 8b 51 1d 8c 84 04 bc 44 23 61 4d 5e 8e B.8.Q.....D#aM^. 00:20:33.932 00000030 34 0f 5c a9 07 39 d8 d8 57 ca 54 c1 63 bc be dc 4.\..9..W.T.c... 00:20:33.932 00000040 61 72 16 45 4f 87 66 69 b4 7f 43 65 21 87 9c c1 ar.EO.fi..Ce!... 00:20:33.932 00000050 26 bc 9d 14 df c1 a1 c7 d8 3b e0 47 a5 e6 fe d4 &........;.G.... 00:20:33.932 00000060 6e 31 1c 4d 30 c7 95 b9 4e c6 e5 04 32 8f 04 ba n1.M0...N...2... 00:20:33.932 00000070 60 28 27 81 bd fa 09 f7 6f aa 9c b7 4c 5b 76 43 `('.....o...L[vC 00:20:33.932 00000080 1b ee c3 74 e7 a5 2f cb 3d 74 9d 67 9e 6b 08 21 ...t../.=t.g.k.! 00:20:33.932 00000090 4e 06 3c b7 34 9a f5 a2 21 25 06 b2 12 55 01 be N.<.4...!%...U.. 00:20:33.932 000000a0 3f 98 97 63 bf 52 76 57 85 03 71 5e 47 42 bb 18 ?..c.RvW..q^GB.. 00:20:33.932 000000b0 43 dd 32 38 28 5e 38 0f e3 f3 0b 5f a6 c4 03 6e C.28(^8...._...n 00:20:33.932 000000c0 c4 38 b2 04 63 70 4f bf 8e bc 63 3f c5 34 18 9d .8..cpO...c?.4.. 00:20:33.932 000000d0 f0 77 83 c3 f0 a5 c9 3a 62 53 ad 03 6a 64 84 bd .w.....:bS..jd.. 00:20:33.932 000000e0 bb ff 51 d4 c8 13 a8 c0 a4 be 42 b6 dd 58 72 b8 ..Q.......B..Xr. 00:20:33.932 000000f0 35 43 da d2 29 b3 61 12 74 50 2b a1 1b ce 73 f9 5C..).a.tP+...s. 00:20:33.932 00000100 e7 66 ef 4b e2 a7 7e c9 f6 0f 8d a5 9a 71 bb 85 .f.K..~......q.. 00:20:33.932 00000110 a0 e7 42 20 7c 9d 7c 42 51 a2 20 59 c5 26 97 38 ..B |.|BQ. Y.&.8 00:20:33.932 00000120 e2 6a 8f 34 98 08 59 e9 82 c6 08 23 4b 84 da 02 .j.4..Y....#K... 00:20:33.932 00000130 06 10 9a b9 8b aa 42 36 3e 14 0f 91 b3 ae c9 46 ......B6>......F 00:20:33.932 00000140 97 2c ad b5 40 02 c7 5d ab 35 b7 f1 44 e3 65 a0 .,..@..].5..D.e. 00:20:33.932 00000150 b1 77 97 2f cc 1f a1 61 12 61 60 c4 d8 c9 9e 7a .w./...a.a`....z 00:20:33.932 00000160 a4 33 52 6c 02 7b 4a 66 fc ee 4e e7 b5 0f 96 ed .3Rl.{Jf..N..... 00:20:33.932 00000170 d2 37 f6 b9 62 8f 0d df 99 04 b7 e9 70 ef f1 02 .7..b.......p... 00:20:33.932 00000180 90 0e a9 63 ec 71 83 10 76 24 39 9a dc d2 fc a2 ...c.q..v$9..... 00:20:33.932 00000190 98 40 a9 fa cd 6c 42 54 9e ca 6a 5c 17 7f 5b 9a .@...lBT..j\..[. 
00:20:33.932 000001a0 4e fc 1a d5 69 64 1c ca 81 5a db 0e d7 7f aa 84 N...id...Z...... 00:20:33.932 000001b0 a3 99 33 f3 fc 67 54 34 42 b7 aa db 44 cd a9 6d ..3..gT4B...D..m 00:20:33.932 000001c0 32 59 70 8d fd e9 b4 c5 a6 92 b9 40 59 7a 7f f2 2Yp........@Yz.. 00:20:33.932 000001d0 ce 8c da 8a 68 65 dc f2 9e f2 52 71 b2 c2 5f c3 ....he....Rq.._. 00:20:33.932 000001e0 b0 b0 9d de be 5f f8 72 3c 99 db f5 db 49 fa b0 ....._.r<....I.. 00:20:33.932 000001f0 24 10 a3 ed 89 ba 1d 61 6a e9 8d 7a 4e d1 13 81 $......aj..zN... 00:20:33.932 00000200 bd 49 d9 42 1c f6 d4 a5 6f 79 2a 03 76 62 3a 94 .I.B....oy*.vb:. 00:20:33.932 00000210 84 35 c5 8a dc 2d 07 93 7c 1d 1a 73 53 9c 64 95 .5...-..|..sS.d. 00:20:33.932 00000220 b3 53 bd db e9 7b 72 fa 0b ee 41 68 e2 28 ae 29 .S...{r...Ah.(.) 00:20:33.932 00000230 4b 09 cf 60 e5 6f 00 82 fc 69 0d f9 09 49 77 a9 K..`.o...i...Iw. 00:20:33.932 00000240 a0 0a 1a e0 3a b5 56 5e 35 50 eb 4b eb be e1 57 ....:.V^5P.K...W 00:20:33.932 00000250 6a 69 fe 55 70 08 38 12 01 6c 64 15 a7 de 7a cb ji.Up.8..ld...z. 00:20:33.932 00000260 cd 3c 9e e5 73 82 c2 c3 09 cd 96 17 16 ca 13 8c .<..s........... 00:20:33.932 00000270 45 08 44 8f b1 8f e5 1b fd c7 b0 5a 15 e6 a5 26 E.D........Z...& 00:20:33.932 00000280 a7 ef 99 d1 52 77 a6 1d b9 73 57 ff a8 66 31 a2 ....Rw...sW..f1. 00:20:33.932 00000290 6b cc 11 37 c3 f0 34 ac d5 4a 82 ab 53 b7 b4 79 k..7..4..J..S..y 00:20:33.932 000002a0 e9 80 7b f0 d3 0f 7c 39 7d 7a 2c 9e d1 ce f8 ed ..{...|9}z,..... 00:20:33.932 000002b0 89 6a 08 b0 b8 ac 4e b6 f1 ab 17 e3 28 81 d8 45 .j....N.....(..E 00:20:33.932 000002c0 bb 57 9d 5f b4 0d 82 45 99 1a a9 4c b0 43 23 36 .W._...E...L.C#6 00:20:33.932 000002d0 67 f4 c9 00 d9 46 f3 1e 25 09 a3 82 e6 e1 b1 60 g....F..%......` 00:20:33.932 000002e0 14 53 ed 79 98 a3 3d 0e d0 b2 51 f3 e0 a0 be 5e .S.y..=...Q....^ 00:20:33.932 000002f0 c3 9d 40 f5 44 8b 3f a4 c6 f6 91 a6 06 ec 19 e5 ..@.D.?......... 00:20:33.932 dh secret: 00:20:33.932 00000000 c5 d6 d3 02 7b 52 60 a0 d6 0d 79 c1 b2 14 92 79 ....{R`...y....y 00:20:33.932 00000010 47 97 12 47 58 a7 e6 0a 38 e1 11 f1 b5 c8 e7 d1 G..GX...8....... 00:20:33.932 00000020 6f 60 24 3e 54 22 75 09 3d d4 ce 66 eb 45 12 6c o`$>T"u.=..f.E.l 00:20:33.932 00000030 a0 da 22 c7 43 5e 08 2d 30 ba c4 3f 97 0e 84 b0 ..".C^.-0..?.... 00:20:33.932 00000040 86 83 45 a1 7b 82 19 78 b3 c8 cb e6 7b 59 8c 06 ..E.{..x....{Y.. 00:20:33.932 00000050 96 91 21 f6 6f 9d 4f eb a9 b8 ef 3c 8a 0f 24 b1 ..!.o.O....<..$. 00:20:33.932 00000060 f0 75 f9 1f e1 55 3c b3 c1 82 24 1b 9f 3e 8d cc .u...U<...$..>.. 00:20:33.932 00000070 09 09 fc 03 8f 5c 68 5b 32 50 ce 61 11 6d ee 93 .....\h[2P.a.m.. 00:20:33.932 00000080 17 81 40 ea e0 d5 66 c4 2f 98 a1 7d 0d 12 5a 9b ..@...f./..}..Z. 00:20:33.932 00000090 30 08 07 23 d9 0f bb 0e 2d a3 9b f4 80 c0 7f c0 0..#....-....... 00:20:33.932 000000a0 d6 da a9 b5 a9 ac 39 25 75 ed 15 01 82 f1 ef 20 ......9%u...... 00:20:33.932 000000b0 d2 b8 08 0d c6 42 3d 4c 65 2a 40 6a c8 7c 58 ae .....B=Le*@j.|X. 00:20:33.932 000000c0 15 ae 40 b3 bd c3 9f db 22 7e c1 83 4a 73 d7 fd ..@....."~..Js.. 00:20:33.932 000000d0 11 e2 44 51 20 8a 6d 9a 6e 28 af 5a 4f 34 b4 25 ..DQ .m.n(.ZO4.% 00:20:33.932 000000e0 c4 78 4d 4b ec 49 27 4c ab dd 3f 49 1f d6 c7 8c .xMK.I'L..?I.... 
00:20:33.932 000000f0 2d 59 90 e3 ed 0d 93 82 d8 e4 04 42 c8 bb c7 76 -Y.........B...v 00:20:33.932 00000100 36 05 34 d9 c7 98 36 05 e2 0f 9a a3 b5 25 c0 79 6.4...6......%.y 00:20:33.932 00000110 62 20 e0 dc 38 13 94 e2 17 84 d4 74 40 c2 b7 24 b ..8......t@..$ 00:20:33.932 00000120 e9 5b 1b b5 b2 23 dc f6 2b e8 ea 70 55 8b 21 f6 .[...#..+..pU.!. 00:20:33.932 00000130 87 b5 d4 f4 a6 30 91 57 43 bf 84 15 f6 eb 1c b2 .....0.WC....... 00:20:33.932 00000140 ab f8 44 30 3e 1b 52 74 f7 6c a1 3f 55 8e fc 34 ..D0>.Rt.l.?U..4 00:20:33.932 00000150 93 1c 4f 91 e8 36 07 d8 6b a7 76 fd 30 c4 12 c9 ..O..6..k.v.0... 00:20:33.932 00000160 9e ab 8a 4e 7f 8c e3 df ee 9a ff eb 6f 74 60 08 ...N........ot`. 00:20:33.932 00000170 29 5a 8d a7 24 b5 ed 06 fe e8 d2 94 1d eb c8 15 )Z..$........... 00:20:33.932 00000180 90 c4 f2 ad ff 8d 08 2c 99 85 b1 d3 f8 79 79 8b .......,.....yy. 00:20:33.932 00000190 ab fc 84 17 c5 ca 47 aa 8f aa 83 e9 3f 48 09 8c ......G.....?H.. 00:20:33.932 000001a0 bc b0 08 cc 24 77 d2 7f c1 22 94 bc 80 88 dc 24 ....$w...".....$ 00:20:33.932 000001b0 f3 d6 1f cf 75 32 37 32 40 bf 00 a8 98 94 fa ad ....u272@....... 00:20:33.932 000001c0 41 9b f8 cb 7e 5b 3c 8e 47 38 10 c1 8f b2 36 b9 A...~[<.G8....6. 00:20:33.932 000001d0 24 11 c4 c0 91 0f 6c 86 52 9c e0 9b 9c d8 01 09 $.....l.R....... 00:20:33.932 000001e0 c4 9c f6 8c 48 6c 20 74 a9 df 93 ab 82 3d f6 07 ....Hl t.....=.. 00:20:33.932 000001f0 a8 a7 16 6d 58 da 53 d7 a2 06 00 e0 97 76 90 e6 ...mX.S......v.. 00:20:33.932 00000200 20 3b bb 14 65 14 fa da b9 99 9f ef ac f9 92 66 ;..e..........f 00:20:33.932 00000210 6a b8 89 bf a3 c5 f8 2c ee 96 10 37 f4 7d 48 a1 j......,...7.}H. 00:20:33.932 00000220 58 d7 d9 d7 31 f1 e8 49 47 14 a6 9c 7e 07 5a dd X...1..IG...~.Z. 00:20:33.932 00000230 61 90 ad c0 26 b8 5f 37 de ce 0f 8e 04 55 34 b9 a...&._7.....U4. 00:20:33.932 00000240 fe ec 6b dd 2d db ea a1 d4 6f 19 5c 4d 8c cc 04 ..k.-....o.\M... 00:20:33.932 00000250 5b a2 8e 96 ff d6 33 9c 9f 1c f3 02 de 1a cd c8 [.....3......... 00:20:33.932 00000260 e0 aa e7 bc 09 7b 14 07 a2 41 a0 af b1 33 e9 6f .....{...A...3.o 00:20:33.932 00000270 2a c6 c4 84 64 9a 92 87 db bd 9b d4 0d 73 10 00 *...d........s.. 00:20:33.932 00000280 20 23 d3 1c e5 6a c8 d7 9c c5 38 62 d8 d3 9f 1e #...j....8b.... 00:20:33.933 00000290 e9 90 31 9f ca 9f 0d d1 de fc 11 78 ba 07 91 93 ..1........x.... 00:20:33.933 000002a0 73 c1 e6 e4 83 c3 fd dd c6 e0 f3 1f 66 68 de 7c s...........fh.| 00:20:33.933 000002b0 36 15 f0 a8 29 8b 39 07 66 8c 53 a5 cf 55 f1 12 6...).9.f.S..U.. 00:20:33.933 000002c0 03 5c cc ad e0 af 6b b4 4a a0 dc 27 f2 2d b0 cf .\....k.J..'.-.. 00:20:33.933 000002d0 a0 bf 51 bb 2b 6d 1d f2 77 10 21 77 84 c9 77 15 ..Q.+m..w.!w..w. 00:20:33.933 000002e0 7b d0 e2 8f 8e 75 27 3b 2a ff fa 8a a3 c8 3d bc {....u';*.....=. 00:20:33.933 000002f0 64 d2 57 22 77 f2 ca f6 00 a1 81 a3 02 c9 7c e2 d.W"w.........|. 
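[editor's note] The debug lines in this trace walk one complete DH-HMAC-CHAP exchange through SPDK's nvme_auth state machine (negotiate → await-negotiate → await-challenge → await-reply → await-success1 → await-success2 → done). The "ctrlr pubkey", "host pubkey", and "dh secret" dumps are the finite-field Diffie-Hellman values for the negotiated group (ffdhe6144 gives the 768-byte values seen here; ffdhe8192 gives 1024-byte values later in the log). The sketch below is a minimal, illustrative Python rendering of that key-agreement step only; it uses a toy modulus, hypothetical helper names, and does not reproduce SPDK's nvme_auth.c implementation or the NVMe in-band authentication message formats.

```python
import secrets

# Toy modulus and generator for illustration only (NOT cryptographically sound).
# The real exchange in the log uses the RFC 7919 ffdhe6144 / ffdhe8192 groups.
P = 4294967291  # a small prime stand-in
G = 2

def dh_keypair():
    """Generate a private exponent and the matching public value g^x mod p."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def dh_shared_secret(priv, peer_pub):
    """The 'dh secret' each side derives: peer_pub^priv mod p."""
    return pow(peer_pub, priv, P)

# One exchange, as between controller and host in the trace above.
ctrlr_priv, ctrlr_pub = dh_keypair()
host_priv, host_pub = dh_keypair()
assert dh_shared_secret(ctrlr_priv, host_pub) == dh_shared_secret(host_priv, ctrlr_pub)
```

In the actual protocol the shared secret is then mixed into the SHA-384 HMAC challenge/response computation that the "await-reply" and "await-success1/await-success2" states above verify; that construction is defined by the NVMe in-band authentication specification and is deliberately not shown in this sketch.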
00:20:33.933 [2024-09-27 15:25:21.111002] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=2, dhgroup=4, seq=3428451783, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.933 [2024-09-27 15:25:21.148822] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.933 [2024-09-27 15:25:21.148860] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.933 [2024-09-27 15:25:21.148876] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.933 [2024-09-27 15:25:21.148882] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.933 [2024-09-27 15:25:21.255307] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.933 [2024-09-27 15:25:21.255326] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.933 [2024-09-27 15:25:21.255333] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.933 [2024-09-27 15:25:21.255349] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.933 [2024-09-27 15:25:21.255403] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.933 ctrlr pubkey: 00:20:33.933 00000000 5d 0c 0a 83 2b bb 26 0d b0 00 36 e7 19 5d 42 30 ]...+.&...6..]B0 00:20:33.933 00000010 2b 0e d4 ec af 41 bd a5 2a ac 65 9a 82 aa a1 f9 +....A..*.e..... 00:20:33.933 00000020 03 9d ee fc 76 30 db 83 a8 82 33 3f 07 4f e5 45 ....v0....3?.O.E 00:20:33.933 00000030 1a 88 6f 0d e5 e0 a9 46 48 b0 5c 3c 5c 1f 7f fd ..o....FH.\<\... 00:20:33.933 00000040 f5 a8 b2 3f 71 05 7a 66 72 23 a7 d2 05 48 73 f0 ...?q.zfr#...Hs. 00:20:33.933 00000050 8a fb b0 c6 81 8c 03 6a b4 d4 bb 4e c9 42 4b 40 .......j...N.BK@ 00:20:33.933 00000060 d4 84 94 4c ef 8e 65 6d 43 81 aa c0 3a e7 99 f6 ...L..emC...:... 00:20:33.933 00000070 57 56 c5 30 a3 69 0e 53 44 ff b0 ad 1a ba 8b d9 WV.0.i.SD....... 00:20:33.933 00000080 2e 34 f0 3a 38 82 c0 11 3a e7 b7 4f 30 65 11 9a .4.:8...:..O0e.. 00:20:33.933 00000090 dd 58 21 a9 c0 cb de 97 85 dc df 82 a1 11 f6 25 .X!............% 00:20:33.933 000000a0 fc ff f3 1f c4 d7 f0 94 42 78 cc 84 cc 42 5f e1 ........Bx...B_. 00:20:33.933 000000b0 32 91 6c 89 7d 2f 0a cf 45 0d e9 cf e7 66 a5 82 2.l.}/..E....f.. 00:20:33.933 000000c0 35 00 b2 a8 d6 9d a6 5e fe 62 b5 9b 5d b7 16 f1 5......^.b..]... 00:20:33.933 000000d0 09 f0 17 14 70 f1 4d 34 8d 8b fc cc 99 60 de 16 ....p.M4.....`.. 00:20:33.933 000000e0 b5 4f 16 e8 ab f0 32 6e fa 27 32 d2 c4 24 70 e6 .O....2n.'2..$p. 00:20:33.933 000000f0 14 cb 10 5c de ed 83 fd 84 d8 2d 4c 64 31 3f c0 ...\......-Ld1?. 00:20:33.933 00000100 a9 15 be e5 2c 93 dc 67 68 21 10 94 8f 34 42 44 ....,..gh!...4BD 00:20:33.933 00000110 58 22 da 92 a5 52 5f cc 9b 9c 17 ce c1 64 c5 dd X"...R_......d.. 00:20:33.933 00000120 95 21 a7 31 b8 09 a1 19 6f c8 6a 00 f3 59 6c ed .!.1....o.j..Yl. 
00:20:33.933 00000130 57 ff 0c e7 18 6a 4d 1f 65 6a 72 bd 0f 6f 0d d1 W....jM.ejr..o.. 00:20:33.933 00000140 3a d9 73 56 00 51 36 bf 03 5a 2e 43 c6 7a 46 f9 :.sV.Q6..Z.C.zF. 00:20:33.933 00000150 7b 26 bf a8 62 53 66 05 8b db 72 7d d8 b6 7d eb {&..bSf...r}..}. 00:20:33.933 00000160 6b 62 c6 c7 36 d8 49 b1 3f 52 3e 9a 46 5d 87 d8 kb..6.I.?R>.F].. 00:20:33.933 00000170 e2 c6 cd 00 6d 1a 5c 36 9a 33 67 d6 76 1f c6 15 ....m.\6.3g.v... 00:20:33.933 00000180 f2 e2 57 50 b7 04 a1 14 9a a3 f1 1a 53 8e 5c 36 ..WP........S.\6 00:20:33.933 00000190 b0 d7 1e 54 3d fc f2 3f 6c e7 65 b2 2e 99 d9 28 ...T=..?l.e....( 00:20:33.933 000001a0 e8 0a be 47 c9 6b e2 b7 74 1d 57 43 8a ee 7c 3f ...G.k..t.WC..|? 00:20:33.933 000001b0 df d0 20 42 b8 9c 14 5a 4c f1 7b 3f 94 94 b0 f4 .. B...ZL.{?.... 00:20:33.933 000001c0 77 75 39 49 40 4b 36 c0 9c 8d f5 16 fe 05 36 b6 wu9I@K6.......6. 00:20:33.933 000001d0 35 53 7d 10 37 99 0b 5e 65 a5 6c 4c 0d e2 be 5a 5S}.7..^e.lL...Z 00:20:33.933 000001e0 22 71 b0 e1 6a 69 bf 2f c7 37 2d 6f 3e 34 9e 92 "q..ji./.7-o>4.. 00:20:33.933 000001f0 c2 42 61 d4 a5 2c 27 ab 2c 54 4f fe 69 d7 30 63 .Ba..,'.,TO.i.0c 00:20:33.933 00000200 96 0e 40 c2 f7 c4 20 19 65 59 fe 46 32 c4 3e 36 ..@... .eY.F2.>6 00:20:33.933 00000210 41 c8 0b 61 d3 48 92 f1 19 26 a8 e2 49 da ca c8 A..a.H...&..I... 00:20:33.933 00000220 06 93 8f 5e 8f 3f 84 1b 3a aa e9 73 1c 55 19 31 ...^.?..:..s.U.1 00:20:33.933 00000230 e3 0b 1b 32 68 20 d3 b8 b4 7d 60 e4 0f 47 a7 82 ...2h ...}`..G.. 00:20:33.933 00000240 53 1d 04 34 8d 07 f5 27 26 79 05 0b 68 8c bd 98 S..4...'&y..h... 00:20:33.933 00000250 03 3d c1 27 f0 2c ef cb 0c e2 5d 5c 59 5c 84 2b .=.'.,....]\Y\.+ 00:20:33.933 00000260 ab ec 6b 05 e2 df 58 18 53 39 41 f9 a1 86 1e e1 ..k...X.S9A..... 00:20:33.933 00000270 29 48 a6 33 93 09 81 7a 9b 51 e6 8b 54 73 49 b7 )H.3...z.Q..TsI. 00:20:33.933 00000280 dc 3b 6a b0 fc bc 54 8a 30 a2 c7 00 c5 01 c0 d7 .;j...T.0....... 00:20:33.933 00000290 cf 19 28 f4 78 89 2e 3b 0c 8e 53 67 cb 0b b8 c9 ..(.x..;..Sg.... 00:20:33.933 000002a0 c2 84 75 8e f7 f1 a0 9e 43 fb 12 aa db 37 4d ba ..u.....C....7M. 00:20:33.933 000002b0 73 7c 50 2b fa 3a 6c 64 30 c9 99 8d 66 b3 87 42 s|P+.:ld0...f..B 00:20:33.933 000002c0 c1 d2 cd e5 a8 bd 47 d6 7e 6b 5a 6e 6b b6 cb b3 ......G.~kZnk... 00:20:33.933 000002d0 0b 70 e0 5c 60 ab 8b 0f e8 a6 cf b4 35 4e d8 96 .p.\`.......5N.. 00:20:33.933 000002e0 49 74 72 41 e8 5b a4 2d 79 ce 18 a9 ec a5 e5 fd ItrA.[.-y....... 00:20:33.933 000002f0 e5 6c 5f 9f 66 99 4a 7c 71 0f 9e 59 e7 7f 06 0e .l_.f.J|q..Y.... 00:20:33.933 host pubkey: 00:20:33.933 00000000 a0 91 26 5e ec 2a ec 56 5c e3 53 58 6c 8f 64 81 ..&^.*.V\.SXl.d. 00:20:33.933 00000010 b3 d4 96 c2 b2 64 59 28 d0 2c 0a 0d 46 f5 9e a5 .....dY(.,..F... 00:20:33.933 00000020 72 93 f7 a8 43 28 d6 c1 32 73 ad 7c 67 fa 15 0d r...C(..2s.|g... 00:20:33.933 00000030 43 5e 7f 79 97 80 6c f8 58 be 0e e1 96 b7 a8 79 C^.y..l.X......y 00:20:33.933 00000040 55 cf e8 52 f7 16 31 16 2d 18 4a 30 7d 09 19 44 U..R..1.-.J0}..D 00:20:33.933 00000050 de 61 46 9a b5 40 c7 46 0a a8 ed 32 22 7d 68 d5 .aF..@.F...2"}h. 00:20:33.933 00000060 2a a7 1f b0 63 77 88 b5 4a cf 7f 04 6d c5 3d b0 *...cw..J...m.=. 00:20:33.933 00000070 0f b2 22 36 f4 5f 28 33 87 79 ac 70 9f 79 ae 11 .."6._(3.y.p.y.. 00:20:33.933 00000080 22 94 f6 5f bc cd de 8a 94 c3 53 df a6 a3 71 05 ".._......S...q. 00:20:33.933 00000090 2a 21 b3 95 98 4c ce 32 aa 18 c0 64 b7 19 31 6d *!...L.2...d..1m 00:20:33.933 000000a0 b3 5d a8 98 e4 b9 62 07 dc fd 1b ab e0 46 81 aa .]....b......F.. 
00:20:33.933 000000b0 28 29 81 6c 27 2f 83 3d 8c 2a a0 bb 11 6c b7 5d ().l'/.=.*...l.] 00:20:33.933 000000c0 d8 78 0e e8 1f 17 4a 2c 61 f4 6b a1 fb bf f2 84 .x....J,a.k..... 00:20:33.933 000000d0 47 9e 29 dd 18 b0 5d ff 9b 1f 4b c6 b6 df a1 73 G.)...]...K....s 00:20:33.933 000000e0 6f 49 5a 6e e3 66 2f 1c 78 ae 53 e5 f5 71 83 83 oIZn.f/.x.S..q.. 00:20:33.933 000000f0 85 aa 29 8d 7a d3 68 1d 3f 28 ec 0b e6 fe 37 18 ..).z.h.?(....7. 00:20:33.933 00000100 ed c9 62 27 6f 47 68 a8 9d 49 75 4d 79 03 48 60 ..b'oGh..IuMy.H` 00:20:33.933 00000110 ec 08 b0 fd c5 9f 61 47 19 49 ad e0 c2 b1 36 00 ......aG.I....6. 00:20:33.933 00000120 69 54 12 00 7c 2b b2 d6 05 6f 4c 7f 3f 35 8d 4e iT..|+...oL.?5.N 00:20:33.933 00000130 06 db 3a ec 3d 08 b7 cb 6b b4 49 a6 3c 4a b0 7b ..:.=...k.I.I.{ 00:20:33.933 00000210 9f e7 1b 78 89 e9 bd cd 2b 11 85 20 46 f7 8a 15 ...x....+.. F... 00:20:33.933 00000220 bd dc 89 ce f5 da d8 76 59 2e 6b 31 07 ef 6f 15 .......vY.k1..o. 00:20:33.933 00000230 37 61 a3 08 18 dd af 4c 96 04 28 c1 40 86 46 a3 7a.....L..(.@.F. 00:20:33.933 00000240 b2 41 ff 97 44 47 22 05 c0 e3 ee 35 cf 08 be 60 .A..DG"....5...` 00:20:33.933 00000250 c3 ed 29 96 62 f1 83 6c cb 89 0d e6 a7 58 e2 cc ..).b..l.....X.. 00:20:33.933 00000260 f8 f3 54 35 28 bf d6 e7 94 59 d3 96 08 5a 5e 2e ..T5(....Y...Z^. 00:20:33.933 00000270 9b 93 e3 0b 51 d7 53 2a 94 ab 68 35 4e 95 8d e3 ....Q.S*..h5N... 00:20:33.933 00000280 09 fe f6 cf ee f9 14 bb 9d 9d 34 21 cf 82 38 e9 ..........4!..8. 00:20:33.933 00000290 4e f4 83 48 e0 b7 bc 73 7e fe 85 4c c4 26 2e 78 N..H...s~..L.&.x 00:20:33.933 000002a0 3f 60 25 87 a4 71 e5 5f 02 57 b6 a3 23 e2 bc e7 ?`%..q._.W..#... 00:20:33.933 000002b0 7c 1e 13 37 f8 43 1f 02 af 23 96 4d 50 94 e5 bd |..7.C...#.MP... 00:20:33.933 000002c0 c9 1a fe ae 17 b9 db 92 06 a6 fc 2c 74 3b 4d 4d ...........,t;MM 00:20:33.933 000002d0 6d 1e 01 e8 9a 71 fe 68 4a ec d1 c0 bf 7b df bc m....q.hJ....{.. 00:20:33.933 000002e0 be 88 db d4 14 50 dd 4b 26 1e f5 1f b6 0c 29 c0 .....P.K&.....). 00:20:33.933 000002f0 13 45 ff 23 0d af 82 21 b7 27 43 5a e9 70 f0 c6 .E.#...!.'CZ.p.. 00:20:33.933 dh secret: 00:20:33.933 00000000 01 fa 8f 32 2c 66 80 51 99 3c 06 dd 6c df df 7b ...2,f.Q.<..l..{ 00:20:33.934 00000010 55 04 ad f6 3a a2 37 ea e9 6e e0 42 e5 ae 35 fe U...:.7..n.B..5. 00:20:33.934 00000020 b7 3c b1 aa a7 11 cf a1 dc 0b b1 fa 7f 48 33 51 .<...........H3Q 00:20:33.934 00000030 a9 6e 6b 04 86 d6 9f a9 c2 aa 0d 3b 0a be fa 7c .nk........;...| 00:20:33.934 00000040 d5 da 0b 53 2c ed 36 53 11 a2 94 33 43 a4 49 98 ...S,.6S...3C.I. 00:20:33.934 00000050 45 dd 84 a1 de 06 79 c0 0b 34 7a 9c a8 ff 95 c1 E.....y..4z..... 00:20:33.934 00000060 b7 f9 75 9c df f4 27 ff 0b 29 42 f5 24 71 f3 00 ..u...'..)B.$q.. 00:20:33.934 00000070 e3 d3 3b 93 bc 45 33 4b f5 a3 ca d6 bb b7 48 f4 ..;..E3K......H. 00:20:33.934 00000080 1b ff 97 8c fa cf 97 ac a2 c1 dd 87 0b 83 6d c2 ..............m. 00:20:33.934 00000090 83 35 eb d4 42 65 37 5e ac 60 33 0b 8a f8 99 f2 .5..Be7^.`3..... 00:20:33.934 000000a0 d4 a5 db 68 fc f9 22 bc 1f b1 dd 09 37 86 1f ec ...h..".....7... 00:20:33.934 000000b0 82 45 a2 1e e9 0f 04 3f 8a a3 2b 07 cb 2e 2e 8d .E.....?..+..... 00:20:33.934 000000c0 6f a7 e8 3b 92 94 79 91 0e f7 de ae 3b 8d 11 e8 o..;..y.....;... 00:20:33.934 000000d0 8c 25 91 48 bf d5 98 b3 e9 71 19 82 ff 4b 15 82 .%.H.....q...K.. 00:20:33.934 000000e0 69 e2 21 a1 42 f6 0e e0 3f 8d 94 e5 90 ed 47 d1 i.!.B...?.....G. 00:20:33.934 000000f0 d3 30 5d fb e7 1c 86 fd b3 03 fb 3b 14 7d be a1 .0]........;.}.. 
00:20:33.934 00000100 23 bd 7f b7 d5 a3 f3 48 57 c9 a5 a4 9d 3f a1 b5 #......HW....?.. 00:20:33.934 00000110 2c 36 58 e1 67 a7 36 4d bf 47 b3 6c 60 00 71 dc ,6X.g.6M.G.l`.q. 00:20:33.934 00000120 7e 57 e6 e0 37 cb 65 22 d1 30 10 a6 b7 76 e7 50 ~W..7.e".0...v.P 00:20:33.934 00000130 aa c2 2e ed dc af 6f 92 f6 34 13 bc fc 5e ce 7b ......o..4...^.{ 00:20:33.934 00000140 e3 ef 77 34 d7 b0 72 51 4f 6a 1d 53 f1 9e 58 80 ..w4..rQOj.S..X. 00:20:33.934 00000150 07 42 c0 74 fd 17 c0 b1 09 2a df 51 ac e0 17 b8 .B.t.....*.Q.... 00:20:33.934 00000160 22 a3 73 bd 62 9a fc a7 ef c9 41 e6 57 a7 30 a9 ".s.b.....A.W.0. 00:20:33.934 00000170 e6 63 c9 6e 88 7d 54 59 d7 2b f6 f9 c6 db 9e c9 .c.n.}TY.+...... 00:20:33.934 00000180 d8 8e 97 3f 29 30 e3 37 1b 94 32 5d 1b 6f 38 72 ...?)0.7..2].o8r 00:20:33.934 00000190 a2 10 ff 20 e4 00 b0 89 40 78 3f 75 7d f0 a5 91 ... ....@x?u}... 00:20:33.934 000001a0 b4 75 09 4e 97 18 89 bc ce ad bf e9 b8 88 4b 96 .u.N..........K. 00:20:33.934 000001b0 02 0d b2 76 86 7f 04 8f 25 61 d0 9f 3d fa bb bf ...v....%a..=... 00:20:33.934 000001c0 a6 a7 39 76 c7 57 f1 6a 22 57 34 16 27 4e 0f ef ..9v.W.j"W4.'N.. 00:20:33.934 000001d0 e4 eb 34 e3 b4 cb 7f 2b 0f 6c 46 6d 51 5d cf c2 ..4....+.lFmQ].. 00:20:33.934 000001e0 e7 7b 5c 35 bb 1c a2 57 a6 7a 07 16 7c 01 2b 9d .{\5...W.z..|.+. 00:20:33.934 000001f0 1e a2 af 98 11 d4 a4 31 48 61 b6 1c bf 65 3a 25 .......1Ha...e:% 00:20:33.934 00000200 20 68 86 d0 1c b2 d0 14 72 c1 56 8f 76 17 80 6d h......r.V.v..m 00:20:33.934 00000210 51 13 6e f5 1b 02 e3 83 cc 83 91 3c 61 54 f5 91 Q.n........b...^... 00:20:33.935 00000160 42 5b e5 ce 9c f3 52 a4 9b 02 ae 22 1c 70 8a 2a B[....R....".p.* 00:20:33.935 00000170 ce 98 5a 77 7c d2 dc 0d f9 ac ee 37 85 89 c4 8d ..Zw|......7.... 00:20:33.935 00000180 2c 17 85 1a 00 cc 49 36 5c 53 da 72 aa 70 44 0f ,.....I6\S.r.pD. 00:20:33.935 00000190 68 98 5a 2f 04 53 1b 0f 15 ec 87 87 13 8a 6f 27 h.Z/.S........o' 00:20:33.935 000001a0 a0 40 5e e3 9d 24 75 25 02 92 15 d7 3b 35 38 d8 .@^..$u%....;58. 00:20:33.935 000001b0 b6 5c 87 c8 21 9c e8 db 31 3d ae 78 72 2b 34 c0 .\..!...1=.xr+4. 00:20:33.935 000001c0 5a f3 41 ed 98 87 57 d3 cc 4a 97 9b fc 6d 9b f4 Z.A...W..J...m.. 00:20:33.935 000001d0 e6 78 c1 d2 18 ba fa d9 68 f0 f4 83 95 2f 4f 97 .x......h..../O. 00:20:33.935 000001e0 aa 24 68 8a 40 4b cf f1 21 f1 3c 26 f9 3f 86 b8 .$h.@K..!.<&.?.. 00:20:33.935 000001f0 04 b4 ab 4e 75 00 a0 2a 74 ce 16 7b c2 d5 ba 15 ...Nu..*t..{.... 00:20:33.935 00000200 bc 8a e9 3a 6f d8 6b 53 e0 d8 0c 54 f6 c2 14 6b ...:o.kS...T...k 00:20:33.935 00000210 bb c5 a4 31 a9 38 82 a4 0e 39 8a dc 2b f6 79 5f ...1.8...9..+.y_ 00:20:33.935 00000220 45 18 56 30 c4 8a 4d 61 d7 75 dc 1d 33 16 ac 0d E.V0..Ma.u..3... 00:20:33.935 00000230 32 e3 f6 bf 82 b9 13 bd 09 79 b0 fe ce 22 4d 8e 2........y..."M. 00:20:33.935 00000240 8c 47 ff 3f 85 97 fb ba a5 d7 af 79 ce b6 31 0f .G.?.......y..1. 00:20:33.935 00000250 71 99 d5 54 02 bd 70 fa d5 2f 44 31 52 70 58 95 q..T..p../D1RpX. 00:20:33.935 00000260 2b b0 b0 82 09 ea 39 95 1d 0d e0 9b 46 72 be 8f +.....9.....Fr.. 00:20:33.935 00000270 13 5e 31 77 2d 8d 12 2d e5 a4 5e f4 7a e5 fc 76 .^1w-..-..^.z..v 00:20:33.935 00000280 5d 07 dc be 77 c5 bf aa 59 27 6e 02 e8 86 59 ea ]...w...Y'n...Y. 00:20:33.935 00000290 90 8f fb 60 35 3d 79 94 08 f4 77 c6 86 b3 a5 83 ...`5=y...w..... 00:20:33.935 000002a0 da e5 b0 84 15 d6 ce a5 d5 86 da 76 ad 16 11 47 ...........v...G 00:20:33.935 000002b0 2d 70 25 30 5c 78 81 ea 79 89 ce ab 70 85 d7 a6 -p%0\x..y...p... 
00:20:33.935 000002c0 bb ed a1 ff 3f 09 7c 34 37 75 0c ea 7d ac f1 e6 ....?.|47u..}... 00:20:33.935 000002d0 69 34 11 75 d5 40 f0 33 d9 79 de 41 df 92 07 84 i4.u.@.3.y.A.... 00:20:33.935 000002e0 c2 3c 17 42 66 85 a0 23 b9 52 21 6f a9 22 e7 d8 .<.Bf..#.R!o.".. 00:20:33.935 000002f0 a2 8a d4 c9 2f 9c ed e6 55 f4 b3 3b ba e5 ac f4 ..../...U..;.... 00:20:33.935 00000300 77 2d 80 05 e8 bd ed 3c 00 3e 58 77 61 dc ac 0c w-.....<.>Xwa... 00:20:33.935 00000310 b9 06 62 1a 61 2d e7 82 62 4b 54 00 16 e6 59 83 ..b.a-..bKT...Y. 00:20:33.935 00000320 5d c7 be 18 59 21 1f 88 4c d2 36 08 82 67 86 fd ]...Y!..L.6..g.. 00:20:33.935 00000330 40 cc 1e 39 ef 45 26 92 2b 87 85 45 28 e4 37 31 @..9.E&.+..E(.71 00:20:33.935 00000340 ea 8a f1 8d 5b bb 41 ff 92 bb 9a b0 95 61 14 9c ....[.A......a.. 00:20:33.935 00000350 0b 82 96 2b 85 6d 73 48 2d 96 48 16 45 9e 8f e5 ...+.msH-.H.E... 00:20:33.935 00000360 fa 30 75 97 83 03 66 7b 4a 84 ca 34 e1 39 eb 1b .0u...f{J..4.9.. 00:20:33.935 00000370 87 80 23 8b 7f 63 7d 25 38 9f 6d d4 f3 97 31 9e ..#..c}%8.m...1. 00:20:33.935 00000380 a0 07 e3 59 bd 9e c6 d3 78 b6 7d be 64 b3 80 d8 ...Y....x.}.d... 00:20:33.935 00000390 72 35 3f c3 c5 b8 56 b9 da ad 5c 09 0c 31 8b 70 r5?...V...\..1.p 00:20:33.935 000003a0 f7 81 ff 17 34 5f f3 62 4c ae 20 f7 43 96 d6 e5 ....4_.bL. .C... 00:20:33.935 000003b0 14 bc 1f ef 97 9f 77 c7 3d 1c 63 d6 93 a4 b5 c0 ......w.=.c..... 00:20:33.935 000003c0 12 fa 86 d7 bd 73 b1 62 60 8e dc b4 65 66 1a 68 .....s.b`...ef.h 00:20:33.935 000003d0 ae db 4d 39 bc 39 41 b7 51 f3 45 92 26 27 d0 e4 ..M9.9A.Q.E.&'.. 00:20:33.935 000003e0 b2 ee 85 56 99 b6 bc 10 a4 0f 2b a0 80 ff 20 38 ...V......+... 8 00:20:33.935 000003f0 66 df c8 5b 6a 13 c2 cc bd 8f ee 6c f1 9c 3e c7 f..[j......l..>. 00:20:33.935 dh secret: 00:20:33.935 00000000 81 8f 78 c4 dc fa eb cf 85 c9 01 0b 5e d0 50 bb ..x.........^.P. 00:20:33.935 00000010 ff 70 d7 2c 09 37 67 6e d5 14 b7 a9 f7 36 a5 cd .p.,.7gn.....6.. 00:20:33.935 00000020 04 71 94 fd c2 75 49 6f ba 55 51 77 11 de 33 23 .q...uIo.UQw..3# 00:20:33.935 00000030 90 25 e5 22 81 b3 3e 32 7d 8f f0 bc b7 66 68 61 .%."..>2}....fha 00:20:33.935 00000040 e1 42 7d 75 60 07 95 e4 3f 1e 3f a0 86 f7 a2 4a .B}u`...?.?....J 00:20:33.935 00000050 86 55 f6 78 c2 82 c1 e4 ac 45 ca ca e3 f6 1f bc .U.x.....E...... 00:20:33.935 00000060 e7 bd e2 c1 76 c8 d5 b8 d1 dc bd 34 e8 89 84 61 ....v......4...a 00:20:33.935 00000070 95 3a 50 58 95 fd be 54 49 ef 01 b6 4d 04 7c 74 .:PX...TI...M.|t 00:20:33.935 00000080 7e 35 46 94 af cb 86 29 f5 c2 f0 26 80 b6 f8 e3 ~5F....)...&.... 00:20:33.935 00000090 00 22 bd 94 af e8 e5 14 ac 5b 78 4f 97 2b b4 46 .".......[xO.+.F 00:20:33.935 000000a0 36 98 45 ac 39 24 37 95 a8 ce b6 a7 30 5d f8 0f 6.E.9$7.....0].. 00:20:33.935 000000b0 ea 7d a7 a8 da 13 b0 01 ea 85 95 59 30 5f e2 ad .}.........Y0_.. 00:20:33.935 000000c0 d5 98 92 d5 c4 15 db 76 21 0d 36 fc 1b ab 6f 3b .......v!.6...o; 00:20:33.935 000000d0 e4 cb db c7 47 28 22 40 79 83 47 43 35 24 f4 8b ....G("@y.GC5$.. 00:20:33.935 000000e0 1a 34 46 4d d9 cd 90 e3 88 ca 28 03 8b 94 a8 e5 .4FM......(..... 00:20:33.935 000000f0 95 93 18 55 d1 a0 ea cf 4b 0b 18 e3 ec f1 ec 38 ...U....K......8 00:20:33.935 00000100 13 ad 1b c1 8f 0b 6f 7b 5d 99 37 0b 36 2d b0 72 ......o{].7.6-.r 00:20:33.935 00000110 c3 8d 34 6a da 75 92 1a 60 6a c6 ad ab d9 3f 0d ..4j.u..`j....?. 00:20:33.935 00000120 e5 ff f5 a0 c2 ca 3c 5e 8f 40 8d bd de 5a f6 08 ......<^.@...Z.. 00:20:33.935 00000130 b1 a4 1a f8 8f a8 40 23 b3 0e 95 69 8d 96 40 c9 ......@#...i..@. 
00:20:33.935 00000140 3c 4d e1 14 74 e0 5e e3 65 6e 45 92 6d 4c a2 23 $.. 00:20:33.935 00000250 2e 6b 12 41 51 ec e2 a5 a3 c0 18 93 b2 2d 63 94 .k.AQ........-c. 00:20:33.935 00000260 6c 56 21 86 55 8c 6f a3 31 ad c5 b2 7e fb 46 68 lV!.U.o.1...~.Fh 00:20:33.935 00000270 c3 13 6e 03 a8 96 d2 af a5 9f c6 bc 7d ad 23 21 ..n.........}.#! 00:20:33.935 00000280 11 c3 1b e2 07 d1 af 5f a8 70 ae cd 66 e6 a3 f9 ......._.p..f... 00:20:33.935 00000290 ce 19 dd 66 02 31 d6 01 c2 28 fc 74 fc 2f 89 23 ...f.1...(.t./.# 00:20:33.935 000002a0 7d 82 da 0d 77 42 d8 67 1b 21 4a be 65 ef cf ba }...wB.g.!J.e... 00:20:33.935 000002b0 66 67 8d 23 fa f8 64 8c a9 7a 42 b0 9c 37 a9 f0 fg.#..d..zB..7.. 00:20:33.935 000002c0 64 45 1b ca da c3 03 2c 06 31 04 a4 4b a8 f9 e6 dE.....,.1..K... 00:20:33.935 000002d0 a2 21 ab e2 cb 56 70 04 e9 56 dc a0 fe da 69 7c .!...Vp..V....i| 00:20:33.935 000002e0 59 e3 09 b7 93 d7 51 2e e9 a5 f5 75 8b 1b 4f 6a Y.....Q....u..Oj 00:20:33.935 000002f0 1e f4 77 54 a1 99 61 ee 9c 6d 98 25 f3 6b c6 b5 ..wT..a..m.%.k.. 00:20:33.935 00000300 b9 9d a7 77 ed b5 17 6a e3 f8 b7 05 bf 51 d3 4a ...w...j.....Q.J 00:20:33.935 00000310 55 7e b8 ff df 86 d7 8a 24 f2 67 fa 9d 9b 67 39 U~......$.g...g9 00:20:33.935 00000320 4b f8 21 49 13 15 b7 02 bd 16 ce 8b c2 f3 cf c4 K.!I............ 00:20:33.935 00000330 a0 3c 46 0e c2 45 74 32 33 ad 11 09 73 e3 00 4d ..Q]u# 00:20:33.936 00000090 6a 9f 84 f4 77 3e fb 6f cc d5 d1 7c c1 0c e8 70 j...w>.o...|...p 00:20:33.936 000000a0 48 7d dd 0a a0 d5 fe 9b 9a 10 d6 9b bf 68 df 3b H}...........h.; 00:20:33.936 000000b0 19 05 4d a4 51 f3 96 24 6f 3a 3a b2 ee 95 22 98 ..M.Q..$o::...". 00:20:33.936 000000c0 d8 30 ea d8 9f a1 44 5a d2 17 9a d9 02 c8 8d 76 .0....DZ.......v 00:20:33.936 000000d0 1d 0b 88 10 e9 eb cc 36 bc 33 c0 b8 f7 e0 b3 2b .......6.3.....+ 00:20:33.936 000000e0 27 d8 aa aa a3 c5 e6 5c 93 12 02 3e 8e e1 11 8b '......\...>.... 00:20:33.936 000000f0 9f 3b 82 21 33 f7 73 ed fd 4d 52 f3 f9 f4 e9 03 .;.!3.s..MR..... 00:20:33.936 00000100 75 99 a5 48 00 24 e8 6d d4 f4 ec e6 7f 74 62 a8 u..H.$.m.....tb. 00:20:33.936 00000110 84 91 ef ea 55 c7 ba 42 4c 7d 83 88 8b 76 de fa ....U..BL}...v.. 00:20:33.936 00000120 88 bb 27 6d 16 3b 13 29 b5 1c 2e 41 4c f8 76 a9 ..'m.;.)...AL.v. 00:20:33.936 00000130 83 41 e7 d8 27 00 45 f9 91 08 ee 46 67 3c ce 03 .A..'.E....Fg<.. 00:20:33.936 00000140 d2 ff d5 19 72 02 60 df 53 76 c0 04 a2 d7 c9 f8 ....r.`.Sv...... 00:20:33.936 00000150 84 fd 52 08 ff b2 14 2e db 0a b0 50 b2 79 a8 c0 ..R........P.y.. 00:20:33.936 00000160 aa 4e 43 4b 26 35 66 d3 fa 44 f9 23 81 de ae 94 .NCK&5f..D.#.... 00:20:33.936 00000170 d1 c6 63 bf 71 dd ad 6b 2f b5 33 ee e6 3b 79 21 ..c.q..k/.3..;y! 00:20:33.936 00000180 fd 3b 91 89 ad 91 02 c5 f4 02 5c bc 65 7a 96 93 .;........\.ez.. 00:20:33.936 00000190 52 59 73 04 5c 93 b7 7c 80 53 37 ae a7 d2 eb 99 RYs.\..|.S7..... 00:20:33.936 000001a0 3e ef c8 4e 82 b4 ba 41 b0 4d ab df d2 f9 83 36 >..N...A.M.....6 00:20:33.936 000001b0 1c 81 fc a1 25 24 b1 dc d6 ad d5 7e 20 0b 36 61 ....%$.....~ .6a 00:20:33.936 000001c0 7f 32 17 61 0d e2 76 34 bf 4c b8 91 3d e5 cf 4d .2.a..v4.L..=..M 00:20:33.936 000001d0 ab b7 c6 3a ec 2b 33 f4 82 3d a1 d7 e2 5f e2 e9 ...:.+3..=..._.. 00:20:33.936 000001e0 1b 89 71 d8 4d c5 d8 8e b1 fd ed be 4b 44 d9 b2 ..q.M.......KD.. 00:20:33.936 000001f0 40 4b aa 6d 15 59 00 89 1f 1a b5 8a d7 cb 19 25 @K.m.Y.........% 00:20:33.936 00000200 13 21 89 0d ae 1b 23 0d 38 3b 4b b3 86 23 8c 1c .!....#.8;K..#.. 00:20:33.936 00000210 0a 91 ec d4 8f 9a 67 ff 3c 0d 4a 8d 76 2d c4 14 ......g.<.J.v-.. 
00:20:33.936 00000220 97 ca dc 75 4d 6f a3 98 0f fd 63 da 2c 49 bc e7 ...uMo....c.,I.. 00:20:33.936 00000230 2a 64 09 c3 ae 3d 02 55 cd f8 12 3d 94 81 13 1a *d...=.U...=.... 00:20:33.936 00000240 72 d0 bd cd 7d 02 ab 30 ac 74 08 2a e3 9b ac 3c r...}..0.t.*...< 00:20:33.936 00000250 82 9e 27 59 f0 75 5f d2 26 02 30 60 fb 83 ca 52 ..'Y.u_.&.0`...R 00:20:33.936 00000260 f6 e1 ff 6d 3a 94 78 2e bf b0 9e 30 0e 6b b9 63 ...m:.x....0.k.c 00:20:33.936 00000270 e1 4f d4 f4 78 37 ae 49 0d 9e 49 a8 31 67 48 d9 .O..x7.I..I.1gH. 00:20:33.937 00000280 77 5b 81 de 0c 76 a2 16 95 89 59 ac 67 f3 0d 78 w[...v....Y.g..x 00:20:33.937 00000290 1c ad 38 6a b7 f3 16 d3 1d 70 82 69 cd f0 7a c2 ..8j.....p.i..z. 00:20:33.937 000002a0 56 95 33 8d ef d8 93 cf 71 95 1a de 1e 37 61 20 V.3.....q....7a 00:20:33.937 000002b0 61 cd 73 83 91 e2 bb de a2 2f a5 b2 f1 88 1f 69 a.s....../.....i 00:20:33.937 000002c0 e2 8b 0b 15 55 4a 93 3e 83 9e 41 46 ca 17 68 a5 ....UJ.>..AF..h. 00:20:33.937 000002d0 e5 ca 33 a0 13 bf b1 04 99 f8 5a 03 0c 24 eb 86 ..3.......Z..$.. 00:20:33.937 000002e0 cf 2e e6 0e af 6d a9 d4 27 24 d5 11 69 50 54 c1 .....m..'$..iPT. 00:20:33.937 000002f0 5c 26 51 db 8e 10 26 21 3a 26 d5 ae 8e a9 e3 23 \&Q...&!:&.....# 00:20:33.937 00000300 97 94 f1 5a b0 fc fd f3 ed 33 47 28 a6 93 a7 0a ...Z.....3G(.... 00:20:33.937 00000310 b3 50 a0 fd 5e f8 4a ae 80 3e 50 8f 17 07 c8 7a .P..^.J..>P....z 00:20:33.937 00000320 ea f5 44 69 2a e3 bb 7f 66 e4 80 87 5b 23 a4 7f ..Di*...f...[#.. 00:20:33.937 00000330 e4 ed 8e fb fb ac ce 59 95 45 b5 1f de ad c5 d7 .......Y.E...... 00:20:33.937 00000340 8e 25 e6 42 47 22 17 d0 08 de 51 09 4d 55 14 0a .%.BG"....Q.MU.. 00:20:33.937 00000350 d8 ac 9f f6 a5 d5 f4 c9 bb 3d c8 c6 84 f3 d3 c6 .........=...... 00:20:33.937 00000360 b3 8a d7 a1 54 88 17 ea ca ca 59 8d 79 6b ae db ....T.....Y.yk.. 00:20:33.937 00000370 45 b3 d5 fd 16 ff c6 43 db 49 08 c6 2b 01 f7 be E......C.I..+... 00:20:33.937 00000380 58 21 82 6e e5 c1 a3 8e ae 41 4e 18 c0 b3 d8 39 X!.n.....AN....9 00:20:33.937 00000390 02 15 fe 86 11 30 c6 70 d8 07 1c bc cc 56 49 ac .....0.p.....VI. 00:20:33.937 000003a0 38 79 77 61 23 db 4a 93 b0 28 f6 5b bd 3f a9 82 8ywa#.J..(.[.?.. 00:20:33.937 000003b0 7e 1b 20 e6 18 0c d2 ab b9 32 ce 18 87 a5 e7 d0 ~. ......2...... 00:20:33.937 000003c0 a8 f4 bb 6c 33 4e fb 2b bc c6 8b 08 2f 41 65 bb ...l3N.+..../Ae. 00:20:33.937 000003d0 c3 63 bf 20 d2 ac b4 ff 73 ec 00 80 af 38 c3 3b .c. ....s....8.; 00:20:33.937 000003e0 d6 cf 5c 84 0a 3e ef 4e 05 ea 64 cc 26 1f b0 27 ..\..>.N..d.&..' 00:20:33.937 000003f0 a6 1d 2d 4a 84 7a 85 63 e3 a6 78 94 51 dc 26 de ..-J.z.c..x.Q.&. 00:20:33.937 dh secret: 00:20:33.937 00000000 e6 76 7e 2a 78 2b 42 0f e4 08 fb cd e6 a4 d9 fe .v~*x+B......... 00:20:33.937 00000010 57 62 7c e2 70 2f 11 33 49 f1 74 95 6c c7 61 21 Wb|.p/.3I.t.l.a! 00:20:33.937 00000020 99 fe 1f bc 29 87 cb 4c 79 55 c0 56 4b f8 9b 9b ....)..LyU.VK... 00:20:33.937 00000030 2f 98 1b 3e 29 8f 01 70 44 66 ee a3 af 69 80 5a /..>)..pDf...i.Z 00:20:33.937 00000040 90 92 7f c6 0c 82 a9 ef 52 c4 dd 17 82 a4 b0 57 ........R......W 00:20:33.937 00000050 3a b5 a9 08 20 6f 69 58 0b 77 db b5 71 b0 6f 06 :... oiX.w..q.o. 00:20:33.937 00000060 c2 28 43 bc 6d 00 34 bb 91 53 0d 61 7d bd 58 f8 .(C.m.4..S.a}.X. 00:20:33.937 00000070 d2 a5 19 56 7c cf fe ec 42 75 ec 49 10 6e 26 ad ...V|...Bu.I.n&. 00:20:33.937 00000080 1c f0 ba 5f 50 2c 81 08 89 23 08 87 76 e3 21 13 ..._P,...#..v.!. 
00:20:33.937 00000090 0e 76 7e a2 5f 8e 54 ca 35 49 b0 2b c8 0b c0 46 .v~._.T.5I.+...F 00:20:33.937 000000a0 4f 6d 97 d0 36 dc 00 d8 cb 07 cb 1c be 7b 7b 8b Om..6........{{. 00:20:33.937 000000b0 53 82 f9 d5 84 7d 53 7b 39 ba 67 be 7e ef 3b 5e S....}S{9.g.~.;^ 00:20:33.937 000000c0 73 fc 84 ab e0 f4 5a 5e c9 a2 2d 7d 45 4c 9e aa s.....Z^..-}EL.. 00:20:33.937 000000d0 2c dc 29 0d 89 05 eb e5 8d 0b 22 26 43 c1 bf 56 ,.)......."&C..V 00:20:33.937 000000e0 df 46 d1 17 05 ab 87 bd e9 39 5e 01 89 b9 b7 5d .F.......9^....] 00:20:33.937 000000f0 03 78 9d 70 8b d2 c8 9c dc fd 54 dd a5 0d bc df .x.p......T..... 00:20:33.937 00000100 d6 db 23 13 a8 6b 57 da 45 da 19 e8 45 6a 89 8e ..#..kW.E...Ej.. 00:20:33.937 00000110 9c 37 09 ce b3 35 7b 5c 49 82 a0 b0 1d 0d e6 42 .7...5{\I......B 00:20:33.937 00000120 ed 52 7d e7 58 67 51 9b aa 31 4e cd 13 0e 7f d3 .R}.XgQ..1N..... 00:20:33.937 00000130 ec 21 d6 ed 69 1e ca e2 05 fd 12 d5 63 88 2e 39 .!..i.......c..9 00:20:33.937 00000140 f9 5a 64 7c 81 8f b9 55 03 11 56 f3 d7 fa a6 ca .Zd|...U..V..... 00:20:33.937 00000150 02 a3 6a de 8c 5e 88 92 e7 21 54 08 99 2d 14 11 ..j..^...!T..-.. 00:20:33.937 00000160 ea 44 44 81 2f 0c 28 7b 7b 93 b0 3b 57 66 c9 a3 .DD./.({{..;Wf.. 00:20:33.937 00000170 45 64 e6 46 e3 20 02 a2 dc 68 e6 78 4c 44 20 d4 Ed.F. ...h.xLD . 00:20:33.937 00000180 53 01 9b 51 3a 36 46 bb 4a c2 f8 40 25 af 29 00 S..Q:6F.J..@%.). 00:20:33.937 00000190 79 fb 83 55 bf 1e d6 75 4d 70 0e 87 8b d9 cb 32 y..U...uMp.....2 00:20:33.937 000001a0 f2 f5 d3 e7 1a 36 af e6 53 7a 7b 37 3d b0 92 3f .....6..Sz{7=..? 00:20:33.937 000001b0 8f e6 98 32 3f 2c bf e3 80 ab 35 cb 50 29 e6 a8 ...2?,....5.P).. 00:20:33.937 000001c0 db a7 6a ee fa 25 e0 17 c1 b9 a1 c1 09 23 3a 2c ..j..%.......#:, 00:20:33.937 000001d0 2d 93 c3 dd 51 03 9c 63 5b b5 67 a1 5e 9a 1c 5e -...Q..c[.g.^..^ 00:20:33.937 000001e0 73 e4 6f 68 bc d3 eb a1 91 00 74 71 75 f6 d3 54 s.oh......tqu..T 00:20:33.937 000001f0 e6 a4 ef 1c 2f 64 cd b1 cf 95 cf 3a 8c ca 4c 9a ..../d.....:..L. 00:20:33.937 00000200 79 31 04 ad 0d c8 18 8c 63 ee 84 21 7e 93 2e 54 y1......c..!~..T 00:20:33.937 00000210 fb 8c c2 f3 e4 5f 64 4d cb c7 a2 ba aa 12 d2 d2 ....._dM........ 00:20:33.937 00000220 9f fb 99 4c 53 3c 02 67 d2 17 f4 6c 0f ab 57 e5 ...LS<.g...l..W. 00:20:33.937 00000230 28 f0 7a 76 8c e2 cc af 47 2b b0 02 e6 30 7c 88 (.zv....G+...0|. 00:20:33.937 00000240 36 e0 e3 ad dc 40 8b b8 0c 3e 29 1c 30 0e 8e 87 6....@...>).0... 00:20:33.937 00000250 71 c2 05 ae 86 1b a5 9e f3 81 3d 82 49 24 3f 63 q.........=.I$?c 00:20:33.937 00000260 fa 64 8a f8 0d df 39 06 09 f9 b2 8f 50 de 9b 7a .d....9.....P..z 00:20:33.937 00000270 1c d6 fb e1 44 48 82 6f f2 32 51 bc 18 50 e6 1b ....DH.o.2Q..P.. 00:20:33.937 00000280 26 ef c7 0b 7d 0f 7d 01 70 5a 54 ac 78 ea b3 02 &...}.}.pZT.x... 00:20:33.937 00000290 f6 e8 c3 2d d3 67 0c 75 73 34 1a a5 bb 49 34 80 ...-.g.us4...I4. 00:20:33.937 000002a0 48 54 ee a3 2e bf c9 6d 39 ec b4 48 f3 21 31 f9 HT.....m9..H.!1. 00:20:33.937 000002b0 77 68 95 aa 50 67 c4 9f b1 ac 70 43 d1 59 75 54 wh..Pg....pC.YuT 00:20:33.937 000002c0 02 8b 16 7f 26 8e 7c cf 8d b0 a0 c4 35 b6 b1 16 ....&.|.....5... 00:20:33.937 000002d0 22 ef 05 e4 ed 42 c0 3b 8e 45 f2 0d 12 28 8e 4e "....B.;.E...(.N 00:20:33.937 000002e0 75 76 5a 7f df 77 03 cb 1d fa 4e ec 7e e5 ee 3c uvZ..w....N.~..< 00:20:33.937 000002f0 b8 d3 d0 fc c2 5c 4a de 8f 1c 52 4c 9a 89 49 6c .....\J...RL..Il 00:20:33.937 00000300 a9 1f 53 f0 70 83 9c b1 1f 3f 5f 03 2b bf 1b 8b ..S.p....?_.+... 
00:20:33.937 00000310 71 a8 8f 27 ae 45 44 a2 d3 fd 68 23 9e ec 3d 63 q..'.ED...h#..=c 00:20:33.937 00000320 8f c7 59 46 d0 3f dd 69 eb 78 99 6a bc f2 f2 1f ..YF.?.i.x.j.... 00:20:33.937 00000330 f3 c5 bf 62 2b 0c 27 5c 17 c5 26 6c be 14 f2 68 ...b+.'\..&l...h 00:20:33.937 00000340 c3 ed 17 27 a1 9e 2e 1e e6 64 3f 9b 33 ca 34 7f ...'.....d?.3.4. 00:20:33.937 00000350 b3 9a d2 32 8b 26 5b cd 68 67 cd 23 90 50 f4 73 ...2.&[.hg.#.P.s 00:20:33.937 00000360 37 d3 47 c5 0f 81 95 dc a9 a7 45 f0 f4 20 87 e2 7.G.......E.. .. 00:20:33.937 00000370 1c e0 ad 6c a6 72 73 1f a0 45 da 45 e9 c0 6a 07 ...l.rs..E.E..j. 00:20:33.937 00000380 13 8f 77 81 20 44 19 34 7d e2 8a 60 34 40 21 6f ..w. D.4}..`4@!o 00:20:33.937 00000390 31 89 b0 71 3c ed fb be 14 c8 83 95 8b b8 e8 ca 1..q<........... 00:20:33.937 000003a0 a6 c3 22 83 0e a5 55 3a 49 07 12 60 89 ab 11 9d .."...U:I..`.... 00:20:33.937 000003b0 b7 4e 38 6a 05 1c bf a4 89 11 36 ca d9 e4 56 2f .N8j......6...V/ 00:20:33.937 000003c0 bd a0 2c 7a a2 26 26 c8 58 95 4f c0 ca c3 30 f0 ..,z.&&.X.O...0. 00:20:33.937 000003d0 eb f4 07 4f 76 3c dd 74 86 70 1d 29 fa 1b dc a7 ...Ov<.t.p.).... 00:20:33.937 000003e0 62 2f 8b 98 10 70 07 cf e5 5e 13 e2 cb 15 10 f2 b/...p...^...... 00:20:33.937 000003f0 a6 01 6b ae ab 76 70 6b da b5 18 2c 7b 26 de 40 ..k..vpk...,{&.@ 00:20:33.937 [2024-09-27 15:25:21.944228] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=2, dhgroup=5, seq=3428451786, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.937 [2024-09-27 15:25:21.944330] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.937 [2024-09-27 15:25:22.027442] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.937 [2024-09-27 15:25:22.027487] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.937 [2024-09-27 15:25:22.027498] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.937 [2024-09-27 15:25:22.027524] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.937 [2024-09-27 15:25:22.218834] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.937 [2024-09-27 15:25:22.218855] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.937 [2024-09-27 15:25:22.218862] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.937 [2024-09-27 15:25:22.218908] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.937 [2024-09-27 15:25:22.218930] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.937 ctrlr pubkey: 00:20:33.937 00000000 d4 58 3a 3c 0d 91 b3 cb 69 ee cf c0 18 2f 7e 46 .X:<....i..../~F 00:20:33.937 00000010 7a 7a 62 07 53 43 26 0b a8 f0 9f 09 39 da 24 ef zzb.SC&.....9.$. 
00:20:33.937 00000020 18 e7 2b b3 62 d3 f0 27 c1 88 6a dd 91 29 c0 fe ..+.b..'..j..).. 00:20:33.937 00000030 b6 e1 9f a1 e6 d4 fc ac 6f d5 7f 37 99 6e 7a dd ........o..7.nz. 00:20:33.937 00000040 41 9a 6a 2a 95 74 13 8d b8 d9 0b d2 17 0a f0 1c A.j*.t.......... 00:20:33.937 00000050 e8 c7 21 7f 3b 1b e8 9d 2b 01 3e d9 cf d2 5f bf ..!.;...+.>..._. 00:20:33.937 00000060 0e 13 7a e8 0d 7a 8c 25 04 1e 29 4c 8a 7b d9 ef ..z..z.%..)L.{.. 00:20:33.937 00000070 5d 4c fc 8d a3 80 8f 0e cb 2b 5e 12 89 dc 15 56 ]L.......+^....V 00:20:33.937 00000080 a5 3d a6 08 67 5b 61 0f bb 18 58 d5 84 fd 4d b7 .=..g[a...X...M. 00:20:33.937 00000090 ec da a6 50 1a fe 9b 35 f9 c2 9a 33 ab 4c 70 53 ...P...5...3.LpS 00:20:33.937 000000a0 2d 08 92 cd 17 d9 87 74 50 1b a3 34 bf 8c 09 d7 -......tP..4.... 00:20:33.937 000000b0 66 cd 19 84 ac 5e ef 9f d4 dd 50 56 28 de 38 2f f....^....PV(.8/ 00:20:33.937 000000c0 80 78 7d 23 f6 fc ff fc e0 89 35 0c cd 11 ec 37 .x}#......5....7 00:20:33.937 000000d0 27 ef 0f 41 c3 25 5b ae 39 0b d7 6e 90 74 e2 5d '..A.%[.9..n.t.] 00:20:33.937 000000e0 ee ed 94 80 01 64 82 f7 de 87 18 ad a7 f4 18 8d .....d.......... 00:20:33.937 000000f0 c6 42 ae fa c2 2a bf 68 8f 4f a0 50 f3 1a 68 f3 .B...*.h.O.P..h. 00:20:33.937 00000100 00 e1 c8 47 a1 69 45 79 81 d5 e6 c8 8f fb 96 aa ...G.iEy........ 00:20:33.937 00000110 19 1a 14 df ae 9d 88 fc 63 3e bd 93 22 83 83 63 ........c>.."..c 00:20:33.937 00000120 a5 b7 bb fb e4 e2 8b 06 ec 5c 95 33 c0 d0 28 d4 .........\.3..(. 00:20:33.937 00000130 31 8c 47 23 86 51 d4 9f ec 7d d9 43 f6 2b 78 42 1.G#.Q...}.C.+xB 00:20:33.937 00000140 a8 ab 86 43 20 9e 59 ec bf 29 6c ba b9 14 07 06 ...C .Y..)l..... 00:20:33.937 00000150 b8 9a c5 89 79 ef fa c2 b9 9f 11 57 cf a6 90 ce ....y......W.... 00:20:33.937 00000160 db de 7a 70 3b 09 8d 57 97 6e 7b 72 8c a9 bb 8b ..zp;..W.n{r.... 00:20:33.938 00000170 13 cc 77 e0 b7 df 1f b4 02 46 87 5c cb cb b1 9c ..w......F.\.... 00:20:33.938 00000180 53 67 62 49 2c 95 0b cc 13 f3 43 5a f1 f8 1d ce SgbI,.....CZ.... 00:20:33.938 00000190 95 17 dd 8e f5 4b 48 da c8 06 b4 41 14 50 41 24 .....KH....A.PA$ 00:20:33.938 000001a0 b8 b1 fb 16 46 13 7b bd c5 71 4b 35 72 91 9f 87 ....F.{..qK5r... 00:20:33.938 000001b0 51 d1 06 6f 3f 1b 1d 06 57 38 2c 14 94 0f 17 66 Q..o?...W8,....f 00:20:33.938 000001c0 bc 3c 6d af e2 1b 87 e7 31 f7 b7 1e 71 8f 54 3d ..1 00:20:33.938 00000290 89 81 6e d1 78 2b fe ce 0c 33 b8 e4 9c c6 d1 9b ..n.x+...3...... 00:20:33.938 000002a0 06 0a 4c 4c 9e 94 0a 9e 87 4a 65 61 a8 2c 6c d5 ..LL.....Jea.,l. 00:20:33.938 000002b0 6b 9a e6 d3 0c 29 2d f8 7c bb 1f b6 e3 fa b0 15 k....)-.|....... 00:20:33.938 000002c0 76 b1 d1 29 79 78 c3 c4 a8 d0 a3 79 e9 43 09 2a v..)yx.....y.C.* 00:20:33.938 000002d0 a2 fc d6 30 e5 5c f3 11 67 75 cd 04 1d bb 8d a2 ...0.\..gu...... 00:20:33.938 000002e0 47 e8 22 fc 60 9a 0c 3a 91 f2 df 0a 6d ad 4b da G.".`..:....m.K. 00:20:33.938 000002f0 1e f5 19 a5 9b 91 7d 82 83 f1 77 90 7e 3b 2b 96 ......}...w.~;+. 00:20:33.938 00000300 1f c8 26 53 50 bc 25 a3 69 5c 7f 40 7c f5 1c 7e ..&SP.%.i\.@|..~ 00:20:33.938 00000310 a9 ca f4 4a ad db 18 63 e1 13 e8 c8 2d b5 08 58 ...J...c....-..X 00:20:33.938 00000320 52 18 48 e2 9c 49 0d c1 4f 5f ea 66 d9 ea 63 42 R.H..I..O_.f..cB 00:20:33.938 00000330 03 f3 68 8b 48 ec 5f d2 8a 07 b7 d0 ff 71 66 6f ..h.H._......qfo 00:20:33.938 00000340 3f c9 ff 36 a6 bf af 89 50 3b 5b 77 62 be c8 d1 ?..6....P;[wb... 00:20:33.938 00000350 5a 8a 12 2e f8 4c 79 2c 5e 72 46 95 e3 30 9a 34 Z....Ly,^rF..0.4 00:20:33.938 00000360 b8 d3 2f c2 8c f5 2f 47 8d 7b c8 35 5a 44 36 ae ../.../G.{.5ZD6. 
00:20:33.938 00000370 5d 13 a2 f5 8c f5 14 ff e2 95 25 01 7b 0b 59 0a ].........%.{.Y. 00:20:33.938 00000380 73 66 36 b4 04 87 a6 76 15 88 88 2d 6b 7f ae 88 sf6....v...-k... 00:20:33.938 00000390 a7 7a ec 1b 13 85 4f 13 50 fa 18 cb 1c 09 26 bb .z....O.P.....&. 00:20:33.938 000003a0 be d2 3f 98 d5 74 11 cf 2a bd 1f bc 7d 21 59 2e ..?..t..*...}!Y. 00:20:33.938 000003b0 c4 ba 6c 4f 20 47 93 35 4a 87 27 4b 1d a5 cd 3c ..lO G.5J.'K...< 00:20:33.938 000003c0 94 26 1f 47 09 4f e3 53 fa 61 37 45 d0 02 98 a7 .&.G.O.S.a7E.... 00:20:33.938 000003d0 08 af 81 91 b5 31 7e 23 61 e9 c3 e8 bf 2f 22 c3 .....1~#a..../". 00:20:33.938 000003e0 77 12 18 fe 70 d9 0f 30 c4 1f 01 b3 24 e8 4a 9f w...p..0....$.J. 00:20:33.938 000003f0 66 2d db ef 3c 48 92 f8 56 8d ea f3 ad 6a b0 97 f-.....n| 00:20:33.938 00000010 a1 8d 95 96 eb d3 e1 43 8c 24 dc bc 19 5c 70 43 .......C.$...\pC 00:20:33.938 00000020 10 f8 97 80 36 67 83 a9 df 06 ac 6a 0d 4c 45 66 ....6g.....j.LEf 00:20:33.938 00000030 1c 92 de aa 51 19 2c 99 c1 40 ae 60 55 13 c2 4a ....Q.,..@.`U..J 00:20:33.938 00000040 57 77 c3 0e a7 b1 a2 fe 7c 1f 7a 06 52 09 17 e6 Ww......|.z.R... 00:20:33.938 00000050 81 c6 2d 05 10 d3 ac f0 7f b7 92 c9 a9 ba ab 88 ..-............. 00:20:33.938 00000060 93 5a f0 5a 86 91 1f a5 f0 61 dd 06 a5 f8 44 37 .Z.Z.....a....D7 00:20:33.938 00000070 b8 0e e1 98 ca f3 48 1c 3d 2d 85 8d 22 9e 2f a9 ......H.=-.."./. 00:20:33.938 00000080 dd 6c c7 52 7b 9e c6 3a 09 94 9b ba af 42 fe 75 .l.R{..:.....B.u 00:20:33.938 00000090 49 44 6d 1d e6 61 8f 0c a0 7c 07 fc ba df c6 d8 IDm..a...|...... 00:20:33.938 000000a0 1f 8c eb 4a 14 55 52 12 7e f7 30 8b 5d c1 10 de ...J.UR.~.0.]... 00:20:33.938 000000b0 c8 89 7a 12 e2 92 9f 48 50 bf b6 1d b2 67 ce 11 ..z....HP....g.. 00:20:33.938 000000c0 96 75 ee 28 45 d2 ab 74 89 27 ef d7 75 eb 22 f7 .u.(E..t.'..u.". 00:20:33.938 000000d0 cb 48 d2 5c 6a dc cb e3 e6 18 47 ba 89 84 e1 36 .H.\j.....G....6 00:20:33.938 000000e0 69 c9 46 e1 99 cc 8b 48 1e 4a f2 3b 79 47 87 94 i.F....H.J.;yG.. 00:20:33.938 000000f0 4b 6e 2b 72 a5 de 98 ab 75 66 bf b2 57 bb 02 8d Kn+r....uf..W... 00:20:33.938 00000100 1b 90 09 8f db d8 38 10 59 01 e7 c8 8e f0 fc 11 ......8.Y....... 00:20:33.938 00000110 b8 cb 15 a2 51 00 b1 e5 75 d4 b4 ec e9 0e 77 05 ....Q...u.....w. 00:20:33.938 00000120 bf 35 d8 b8 30 cf 3d 40 77 2a c3 ca 65 46 1e 19 .5..0.=@w*..eF.. 00:20:33.938 00000130 31 60 92 92 b3 f6 8e cc a1 cb e5 b9 80 6e 3a 7a 1`...........n:z 00:20:33.938 00000140 10 7b 2c c3 f5 2f 18 a4 de 53 64 05 79 07 fe a2 .{,../...Sd.y... 00:20:33.938 00000150 e9 ed 30 46 cd 83 fb 92 c4 19 7d 27 8e a2 87 94 ..0F......}'.... 00:20:33.938 00000160 63 6b 80 63 52 73 cc 10 a2 80 10 d2 1d 70 ce 6b ck.cRs.......p.k 00:20:33.938 00000170 bf 95 1c 8c 9c 7d 73 fc 71 ed 2f 67 1f a4 65 ba .....}s.q./g..e. 00:20:33.938 00000180 12 32 83 e9 8d 01 75 a8 26 d7 a7 67 8b 7d ca c0 .2....u.&..g.}.. 00:20:33.938 00000190 53 c8 87 5c 7b 6c b5 bd 11 a7 cf 62 36 47 6b 71 S..\{l.....b6Gkq 00:20:33.938 000001a0 23 01 62 cb 5a 6f 7e b7 27 84 74 34 f2 2b a2 f8 #.b.Zo~.'.t4.+.. 00:20:33.938 000001b0 70 a2 3d 60 a6 a2 0b 8c a1 15 fc b4 77 c2 6c da p.=`........w.l. 00:20:33.938 000001c0 4f 00 ca 65 0f df 96 79 cf f6 3a 99 3c 3d 38 b2 O..e...y..:.<=8. 00:20:33.938 000001d0 7c 4d 32 11 63 39 ef 9e 02 dc 33 0f cb 90 fa 76 |M2.c9....3....v 00:20:33.938 000001e0 b5 68 c9 92 f9 4c 35 ed cc d5 d4 ea 5b 0c 76 f8 .h...L5.....[.v. 00:20:33.938 000001f0 2d 60 04 97 80 46 3d c9 37 89 44 83 d9 13 83 b0 -`...F=.7.D..... 
00:20:33.938 00000200 32 10 6b e2 b8 37 50 d6 8b 58 78 50 2a 61 04 35 2.k..7P..XxP*a.5 00:20:33.938 00000210 a9 3a e3 23 37 de 67 5e e7 1c 70 a7 e0 b3 e3 cb .:.#7.g^..p..... 00:20:33.938 00000220 20 c0 1f 58 5c 15 7d 6d 1f 99 d1 e0 04 33 21 07 ..X\.}m.....3!. 00:20:33.938 00000230 a6 96 c3 78 aa a6 8d 6e 23 9b be e6 38 81 88 1d ...x...n#...8... 00:20:33.938 00000240 45 69 90 c2 20 4c 88 d7 06 dc 87 7f a9 2e 03 77 Ei.. L.........w 00:20:33.938 00000250 16 d7 e8 9b 59 05 7f 21 bc f6 af cb f2 1c 17 3c ....Y..!.......< 00:20:33.938 00000260 78 7e 28 43 09 15 ca a7 8f 6d c1 44 86 e0 e3 9d x~(C.....m.D.... 00:20:33.938 00000270 14 f2 b0 5c 0f d5 a1 3f 69 98 11 79 47 61 7a cf ...\...?i..yGaz. 00:20:33.938 00000280 a8 2b 9d 72 71 d5 1d ae c2 c1 d0 b0 fc 4d 71 f4 .+.rq........Mq. 00:20:33.938 00000290 05 40 e6 02 4e 19 92 1e d7 03 e8 60 37 cb a4 f8 .@..N......`7... 00:20:33.938 000002a0 05 30 6b e3 b5 d1 04 81 37 20 6b 2b 3f f4 27 e3 .0k.....7 k+?.'. 00:20:33.938 000002b0 0f 93 d1 9d d8 17 61 2a 0c 61 39 e3 77 a7 df 71 ......a*.a9.w..q 00:20:33.938 000002c0 ef f3 96 98 a1 20 2b c7 34 cb 86 33 94 06 c8 ac ..... +.4..3.... 00:20:33.938 000002d0 a1 db d0 c0 ed 3b de 55 c1 8c 0d 5a fd c1 d5 9b .....;.U...Z.... 00:20:33.938 000002e0 a3 68 06 b3 36 08 20 63 c0 08 7b cc 14 5e aa 1d .h..6. c..{..^.. 00:20:33.938 000002f0 a7 7d 7a 6e 97 a4 37 2f 7b 27 0d f5 52 92 d1 c9 .}zn..7/{'..R... 00:20:33.938 00000300 ee d3 49 87 e5 33 82 40 a6 25 3e 24 16 d7 7b 34 ..I..3.@.%>$..{4 00:20:33.938 00000310 43 dd 4d 11 06 ed 6d 27 90 6b c4 25 90 94 4b 84 C.M...m'.k.%..K. 00:20:33.938 00000320 30 f4 f6 d4 85 21 55 a5 36 c3 5c fd dc 5c d4 3b 0....!U.6.\..\.; 00:20:33.938 00000330 b0 fe 90 4c 5e 86 16 f1 70 a3 23 c3 6a 6c 57 e0 ...L^...p.#.jlW. 00:20:33.938 00000340 75 b1 52 72 3a 01 be 80 81 bb b6 74 fa ec fc 7d u.Rr:......t...} 00:20:33.938 00000350 69 f1 de c9 3d d1 34 af e8 8a 8e 48 e1 04 32 26 i...=.4....H..2& 00:20:33.938 00000360 69 ee f1 13 7f 21 5a 69 98 f7 8b 52 83 64 5a 47 i....!Zi...R.dZG 00:20:33.938 00000370 2c 85 d7 31 e1 f8 77 07 aa 0e a0 e4 5e 7d c3 11 ,..1..w.....^}.. 00:20:33.938 00000380 c2 fb c2 cd 36 71 82 fc 73 60 76 4e 88 54 6c 52 ....6q..s`vN.TlR 00:20:33.938 00000390 d1 6f 92 45 16 a1 09 c0 82 f6 71 e6 54 1d e7 6b .o.E......q.T..k 00:20:33.938 000003a0 91 c2 61 b6 fc d6 a8 c7 2a c2 c2 a1 88 21 6f 5f ..a.....*....!o_ 00:20:33.938 000003b0 3a f4 d8 02 d2 38 67 83 de 98 81 ed b1 57 4c 64 :....8g......WLd 00:20:33.938 000003c0 1f 38 81 9d 81 d1 cb d2 33 7b e8 b5 a1 4c 27 64 .8......3{...L'd 00:20:33.938 000003d0 06 86 27 b9 c8 7d 29 4c ed e7 68 7c d0 f7 1a f8 ..'..})L..h|.... 00:20:33.938 000003e0 f2 a4 b0 8e 71 28 1d bc 6c 0c fb 4b 86 75 f3 0c ....q(..l..K.u.. 00:20:33.938 000003f0 f5 bb 5b e6 52 ea fd dd db 05 ad 53 79 d3 4e 7f ..[.R......Sy.N. 00:20:33.938 dh secret: 00:20:33.938 00000000 fb 4f b2 c9 2a c3 36 85 54 68 15 d0 3e 81 c8 47 .O..*.6.Th..>..G 00:20:33.938 00000010 95 99 a7 b0 47 9a 39 2e f8 54 06 9c e3 66 5b e7 ....G.9..T...f[. 00:20:33.938 00000020 84 43 3e 3f db 90 a7 e7 0c 34 c2 1a 38 73 3d 6b .C>?.....4..8s=k 00:20:33.938 00000030 e5 b5 d5 fd ee 23 93 b6 98 ad 59 27 b9 2b b7 98 .....#....Y'.+.. 00:20:33.938 00000040 07 29 b2 45 4d e6 f2 f4 7b a4 3f a6 1f b4 cd f4 .).EM...{.?..... 00:20:33.938 00000050 70 e5 57 b2 b9 69 bc 5e 5e 34 9f f6 19 28 90 76 p.W..i.^^4...(.v 00:20:33.938 00000060 7e 02 95 18 57 63 38 b2 c9 28 e5 9c 12 79 34 e6 ~...Wc8..(...y4. 00:20:33.938 00000070 b5 7a 86 16 00 11 c3 5c 97 15 36 46 f1 da 9c 81 .z.....\..6F.... 
00:20:33.938 00000080 2f 58 f4 44 a8 31 e2 63 10 d4 a2 03 8c 39 14 0d /X.D.1.c.....9.. 00:20:33.938 00000090 53 f9 38 a7 96 24 02 26 a7 22 d0 84 08 a5 ff 22 S.8..$.&."....." 00:20:33.938 000000a0 87 df e4 43 18 f6 cb 08 df fe 09 b5 d6 6a a4 4a ...C.........j.J 00:20:33.938 000000b0 01 d8 fb 13 56 5e 3b dc 8a 28 27 b1 8f f8 e0 1f ....V^;..('..... 00:20:33.939 000000c0 66 b0 ed 85 f3 fa d9 61 8b 1a 07 5e 9b e8 b5 0e f......a...^.... 00:20:33.939 000000d0 f2 ff e0 04 6f e2 dc 1a 60 9d 0c 46 9c 0a a9 e1 ....o...`..F.... 00:20:33.939 000000e0 f2 f0 d5 bb 4e 74 29 8d d6 55 66 be de 8e a6 13 ....Nt)..Uf..... 00:20:33.939 000000f0 8a cf 85 99 20 20 58 5b 5d 9a d9 68 a3 f7 e1 c0 .... X[]..h.... 00:20:33.939 00000100 de 2e 5d 7d 28 14 64 f5 da df 37 5d f6 39 2d 72 ..]}(.d...7].9-r 00:20:33.939 00000110 88 94 53 25 a7 1f 1e 12 14 e3 d2 a2 45 13 b5 2f ..S%........E../ 00:20:33.939 00000120 3e c8 d8 e7 15 42 35 47 7c ab cf 51 91 dc 34 0b >....B5G|..Q..4. 00:20:33.939 00000130 83 b9 21 1f b3 06 72 69 92 df 3c aa 0b 35 fd b7 ..!...ri..<..5.. 00:20:33.939 00000140 c3 dd 0d 0f 99 7e f6 05 62 7f 52 a3 76 a4 40 23 .....~..b.R.v.@# 00:20:33.939 00000150 d2 94 7b 1a fb d9 0d 8c 31 57 0d 59 74 e4 44 e9 ..{.....1W.Yt.D. 00:20:33.939 00000160 11 d7 0e 6e 8c c6 77 91 4b 6d ef 74 c4 69 4d 7c ...n..w.Km.t.iM| 00:20:33.939 00000170 e1 86 b0 1c 0a 98 5b f0 a3 e8 58 d3 39 a9 2f 12 ......[...X.9./. 00:20:33.939 00000180 58 2f 4f 54 24 5d 48 fc 60 c0 31 27 35 4a 65 58 X/OT$]H.`.1'5JeX 00:20:33.939 00000190 f7 de 3d 33 fc 06 39 e3 13 19 87 48 0e ce a3 77 ..=3..9....H...w 00:20:33.939 000001a0 3f 2b 70 91 91 c4 f3 94 54 5e 8d bf f8 d8 de 8c ?+p.....T^...... 00:20:33.939 000001b0 7a 34 60 ef e2 21 8a ce 01 93 35 6f ca 8a 6e 8e z4`..!....5o..n. 00:20:33.939 000001c0 bb 3f 85 b3 5d 8d cb 16 fb 38 52 4a 25 6d 33 56 .?..]....8RJ%m3V 00:20:33.939 000001d0 eb 54 57 05 4c c0 f0 74 c1 24 ee 69 ff 76 cc d3 .TW.L..t.$.i.v.. 00:20:33.939 000001e0 32 c8 ca f1 01 ec da 16 fd 97 e8 6c 37 83 fa fe 2..........l7... 00:20:33.939 000001f0 78 d6 64 2a 29 22 44 fe 58 5f 4f a7 38 bb a6 23 x.d*)"D.X_O.8..# 00:20:33.939 00000200 99 31 67 09 4a 88 54 d3 77 6d 99 01 48 32 a4 6a .1g.J.T.wm..H2.j 00:20:33.939 00000210 9a af 2e b4 b2 be c2 6e fe a8 70 d0 7b 2f 4d 45 .......n..p.{/ME 00:20:33.939 00000220 40 04 60 e0 60 91 4f a1 81 ed 59 36 55 43 98 c5 @.`.`.O...Y6UC.. 00:20:33.939 00000230 16 92 70 8f 12 98 f0 99 43 47 c1 68 50 35 67 6b ..p.....CG.hP5gk 00:20:33.939 00000240 ce bf 60 42 6a 38 56 e5 4f 17 7f a3 5d 31 c8 83 ..`Bj8V.O...]1.. 00:20:33.939 00000250 27 6b 5b ff 20 c5 cd 3b c3 9d a2 6f 85 65 19 12 'k[. ..;...o.e.. 00:20:33.939 00000260 91 b3 f5 04 88 87 28 df 0e 2c a7 5a 12 97 f9 17 ......(..,.Z.... 00:20:33.939 00000270 be b5 87 be 19 c3 4d 3a 5b 47 d1 74 2c 08 a6 3b ......M:[G.t,..; 00:20:33.939 00000280 4f 21 04 50 3a db 14 fb 80 b8 cd 63 a3 c2 af 39 O!.P:......c...9 00:20:33.939 00000290 4a 9f 13 40 16 b4 be e8 4b 01 5e cf 0d 47 37 24 J..@....K.^..G7$ 00:20:33.939 000002a0 d6 d3 87 44 bd 00 25 c9 75 2f df eb cc 41 c5 21 ...D..%.u/...A.! 00:20:33.939 000002b0 64 9a 07 32 d7 34 47 aa 5a 52 d6 4f 28 19 37 f1 d..2.4G.ZR.O(.7. 00:20:33.939 000002c0 4e ca 3b 36 c3 ff 5f 0c e0 ae 1d 77 e0 f8 1b 62 N.;6.._....w...b 00:20:33.939 000002d0 53 a8 ae 87 80 21 b3 f1 dd ff e7 4c 81 4d cd 2b S....!.....L.M.+ 00:20:33.939 000002e0 13 67 7c 59 88 2f 8f fc 78 a9 a1 88 57 c9 b2 5d .g|Y./..x...W..] 00:20:33.939 000002f0 10 18 27 2d 60 34 67 75 42 e6 06 d3 1a 7e b3 dc ..'-`4guB....~.. 
00:20:33.939 00000300 f4 e4 8c ef f3 9f 0a 53 26 2c 78 63 44 ea ea d6 .......S&,xcD... 00:20:33.939 00000310 db 5c f8 ab 9f 84 c1 96 0e d6 26 8e 24 7c 59 1f .\........&.$|Y. 00:20:33.939 00000320 3a 70 b8 9e d6 dc b9 12 b1 e2 10 11 e2 45 4e 86 :p...........EN. 00:20:33.939 00000330 d0 1a a4 29 a3 92 15 e6 f9 78 2e d2 c3 cc 7d 86 ...).....x....}. 00:20:33.939 00000340 b7 a9 fd c9 92 b1 80 41 e4 0e be e7 f4 54 39 b0 .......A.....T9. 00:20:33.939 00000350 5d 59 d0 ff d3 9e 66 1c d1 85 ec ad 0d 55 4c a7 ]Y....f......UL. 00:20:33.939 00000360 db d0 fe 24 5a f1 b9 50 f4 c3 50 00 cf 11 a0 48 ...$Z..P..P....H 00:20:33.939 00000370 9e fd 8c 74 62 66 bf 69 ea b5 fe 97 4e 3d 82 07 ...tbf.i....N=.. 00:20:33.939 00000380 92 e2 80 33 d6 c0 5e 3b 49 35 54 a1 62 ac 21 bf ...3..^;I5T.b.!. 00:20:33.939 00000390 a6 59 49 00 50 67 bb e7 6d bf 15 58 b4 45 c2 f0 .YI.Pg..m..X.E.. 00:20:33.939 000003a0 41 50 10 d7 36 56 5d 0c 79 44 ef 60 e5 ba dc 0f AP..6V].yD.`.... 00:20:33.939 000003b0 9a 29 3a 42 4b 68 9b a3 44 f3 c0 96 ca c5 21 4e .):BKh..D.....!N 00:20:33.939 000003c0 aa 41 ee 47 2c 4b da d7 10 fa 4b ce 8b 6e de 87 .A.G,K....K..n.. 00:20:33.939 000003d0 e5 6a 29 3c fc 7a f0 3a 82 2d 47 63 08 6f 7f c0 .j)<.z.:.-Gc.o.. 00:20:33.939 000003e0 83 16 d9 06 f4 2e 84 1c 34 c7 b6 ac 7b af f9 9b ........4...{... 00:20:33.939 000003f0 ab e3 96 02 5d 9a 94 75 2f b5 51 95 bd 2f 61 76 ....]..u/.Q../av 00:20:33.939 [2024-09-27 15:25:22.329805] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=2, dhgroup=5, seq=3428451787, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.939 [2024-09-27 15:25:22.388759] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.939 [2024-09-27 15:25:22.388809] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.939 [2024-09-27 15:25:22.388821] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.939 [2024-09-27 15:25:22.388836] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.939 [2024-09-27 15:25:22.388854] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.939 [2024-09-27 15:25:22.495051] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.939 [2024-09-27 15:25:22.495098] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.939 [2024-09-27 15:25:22.495123] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.939 [2024-09-27 15:25:22.495155] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.939 [2024-09-27 15:25:22.495247] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.939 ctrlr pubkey: 00:20:33.939 00000000 d4 58 3a 3c 0d 91 b3 cb 69 ee cf c0 18 2f 7e 46 .X:<....i..../~F 00:20:33.939 00000010 7a 7a 62 07 
53 43 26 0b a8 f0 9f 09 39 da 24 ef zzb.SC&.....9.$. 00:20:33.939 00000020 18 e7 2b b3 62 d3 f0 27 c1 88 6a dd 91 29 c0 fe ..+.b..'..j..).. 00:20:33.939 00000030 b6 e1 9f a1 e6 d4 fc ac 6f d5 7f 37 99 6e 7a dd ........o..7.nz. 00:20:33.939 00000040 41 9a 6a 2a 95 74 13 8d b8 d9 0b d2 17 0a f0 1c A.j*.t.......... 00:20:33.939 00000050 e8 c7 21 7f 3b 1b e8 9d 2b 01 3e d9 cf d2 5f bf ..!.;...+.>..._. 00:20:33.939 00000060 0e 13 7a e8 0d 7a 8c 25 04 1e 29 4c 8a 7b d9 ef ..z..z.%..)L.{.. 00:20:33.939 00000070 5d 4c fc 8d a3 80 8f 0e cb 2b 5e 12 89 dc 15 56 ]L.......+^....V 00:20:33.939 00000080 a5 3d a6 08 67 5b 61 0f bb 18 58 d5 84 fd 4d b7 .=..g[a...X...M. 00:20:33.939 00000090 ec da a6 50 1a fe 9b 35 f9 c2 9a 33 ab 4c 70 53 ...P...5...3.LpS 00:20:33.939 000000a0 2d 08 92 cd 17 d9 87 74 50 1b a3 34 bf 8c 09 d7 -......tP..4.... 00:20:33.939 000000b0 66 cd 19 84 ac 5e ef 9f d4 dd 50 56 28 de 38 2f f....^....PV(.8/ 00:20:33.939 000000c0 80 78 7d 23 f6 fc ff fc e0 89 35 0c cd 11 ec 37 .x}#......5....7 00:20:33.939 000000d0 27 ef 0f 41 c3 25 5b ae 39 0b d7 6e 90 74 e2 5d '..A.%[.9..n.t.] 00:20:33.939 000000e0 ee ed 94 80 01 64 82 f7 de 87 18 ad a7 f4 18 8d .....d.......... 00:20:33.939 000000f0 c6 42 ae fa c2 2a bf 68 8f 4f a0 50 f3 1a 68 f3 .B...*.h.O.P..h. 00:20:33.939 00000100 00 e1 c8 47 a1 69 45 79 81 d5 e6 c8 8f fb 96 aa ...G.iEy........ 00:20:33.939 00000110 19 1a 14 df ae 9d 88 fc 63 3e bd 93 22 83 83 63 ........c>.."..c 00:20:33.939 00000120 a5 b7 bb fb e4 e2 8b 06 ec 5c 95 33 c0 d0 28 d4 .........\.3..(. 00:20:33.939 00000130 31 8c 47 23 86 51 d4 9f ec 7d d9 43 f6 2b 78 42 1.G#.Q...}.C.+xB 00:20:33.939 00000140 a8 ab 86 43 20 9e 59 ec bf 29 6c ba b9 14 07 06 ...C .Y..)l..... 00:20:33.939 00000150 b8 9a c5 89 79 ef fa c2 b9 9f 11 57 cf a6 90 ce ....y......W.... 00:20:33.939 00000160 db de 7a 70 3b 09 8d 57 97 6e 7b 72 8c a9 bb 8b ..zp;..W.n{r.... 00:20:33.939 00000170 13 cc 77 e0 b7 df 1f b4 02 46 87 5c cb cb b1 9c ..w......F.\.... 00:20:33.939 00000180 53 67 62 49 2c 95 0b cc 13 f3 43 5a f1 f8 1d ce SgbI,.....CZ.... 00:20:33.939 00000190 95 17 dd 8e f5 4b 48 da c8 06 b4 41 14 50 41 24 .....KH....A.PA$ 00:20:33.939 000001a0 b8 b1 fb 16 46 13 7b bd c5 71 4b 35 72 91 9f 87 ....F.{..qK5r... 00:20:33.939 000001b0 51 d1 06 6f 3f 1b 1d 06 57 38 2c 14 94 0f 17 66 Q..o?...W8,....f 00:20:33.939 000001c0 bc 3c 6d af e2 1b 87 e7 31 f7 b7 1e 71 8f 54 3d ..1 00:20:33.939 00000290 89 81 6e d1 78 2b fe ce 0c 33 b8 e4 9c c6 d1 9b ..n.x+...3...... 00:20:33.939 000002a0 06 0a 4c 4c 9e 94 0a 9e 87 4a 65 61 a8 2c 6c d5 ..LL.....Jea.,l. 00:20:33.939 000002b0 6b 9a e6 d3 0c 29 2d f8 7c bb 1f b6 e3 fa b0 15 k....)-.|....... 00:20:33.939 000002c0 76 b1 d1 29 79 78 c3 c4 a8 d0 a3 79 e9 43 09 2a v..)yx.....y.C.* 00:20:33.939 000002d0 a2 fc d6 30 e5 5c f3 11 67 75 cd 04 1d bb 8d a2 ...0.\..gu...... 00:20:33.939 000002e0 47 e8 22 fc 60 9a 0c 3a 91 f2 df 0a 6d ad 4b da G.".`..:....m.K. 00:20:33.939 000002f0 1e f5 19 a5 9b 91 7d 82 83 f1 77 90 7e 3b 2b 96 ......}...w.~;+. 00:20:33.939 00000300 1f c8 26 53 50 bc 25 a3 69 5c 7f 40 7c f5 1c 7e ..&SP.%.i\.@|..~ 00:20:33.939 00000310 a9 ca f4 4a ad db 18 63 e1 13 e8 c8 2d b5 08 58 ...J...c....-..X 00:20:33.939 00000320 52 18 48 e2 9c 49 0d c1 4f 5f ea 66 d9 ea 63 42 R.H..I..O_.f..cB 00:20:33.939 00000330 03 f3 68 8b 48 ec 5f d2 8a 07 b7 d0 ff 71 66 6f ..h.H._......qfo 00:20:33.939 00000340 3f c9 ff 36 a6 bf af 89 50 3b 5b 77 62 be c8 d1 ?..6....P;[wb... 
00:20:33.939 00000350 5a 8a 12 2e f8 4c 79 2c 5e 72 46 95 e3 30 9a 34 Z....Ly,^rF..0.4 00:20:33.939 00000360 b8 d3 2f c2 8c f5 2f 47 8d 7b c8 35 5a 44 36 ae ../.../G.{.5ZD6. 00:20:33.939 00000370 5d 13 a2 f5 8c f5 14 ff e2 95 25 01 7b 0b 59 0a ].........%.{.Y. 00:20:33.940 00000380 73 66 36 b4 04 87 a6 76 15 88 88 2d 6b 7f ae 88 sf6....v...-k... 00:20:33.940 00000390 a7 7a ec 1b 13 85 4f 13 50 fa 18 cb 1c 09 26 bb .z....O.P.....&. 00:20:33.940 000003a0 be d2 3f 98 d5 74 11 cf 2a bd 1f bc 7d 21 59 2e ..?..t..*...}!Y. 00:20:33.940 000003b0 c4 ba 6c 4f 20 47 93 35 4a 87 27 4b 1d a5 cd 3c ..lO G.5J.'K...< 00:20:33.940 000003c0 94 26 1f 47 09 4f e3 53 fa 61 37 45 d0 02 98 a7 .&.G.O.S.a7E.... 00:20:33.940 000003d0 08 af 81 91 b5 31 7e 23 61 e9 c3 e8 bf 2f 22 c3 .....1~#a..../". 00:20:33.940 000003e0 77 12 18 fe 70 d9 0f 30 c4 1f 01 b3 24 e8 4a 9f w...p..0....$.J. 00:20:33.940 000003f0 66 2d db ef 3c 48 92 f8 56 8d ea f3 ad 6a b0 97 f-...c.L 00:20:33.940 00000040 7d 6e c8 3f 5a e0 34 a6 91 83 dd 4e c4 ee 1e a5 }n.?Z.4....N.... 00:20:33.940 00000050 b9 f2 73 27 21 43 00 29 bc 8f 0f 7b 77 ac b6 66 ..s'!C.)...{w..f 00:20:33.940 00000060 05 01 d4 4c 7b 52 37 14 22 b4 b9 b1 46 9d d4 b3 ...L{R7."...F... 00:20:33.940 00000070 ac b8 cd 61 52 88 f2 17 1c 8c f2 b3 94 ef fa d1 ...aR........... 00:20:33.940 00000080 ad c2 9e 6a 22 ef 40 ab 7f 80 61 20 99 29 fd cf ...j".@...a .).. 00:20:33.940 00000090 26 f7 58 97 e3 2f c5 6a 2a c5 6d ad fb 7d 54 d9 &.X../.j*.m..}T. 00:20:33.940 000000a0 01 0e 55 04 14 34 3e 5f df 3f 94 ca b5 79 01 00 ..U..4>_.?...y.. 00:20:33.940 000000b0 e7 46 91 be da 51 c8 84 33 bd af 12 b1 aa f6 ea .F...Q..3....... 00:20:33.940 000000c0 ac 25 00 c4 50 66 41 ed ff f9 52 0d 3b 67 01 cb .%..PfA...R.;g.. 00:20:33.940 000000d0 68 fc be f6 af 39 c7 3a fd b9 e5 18 cd 2d 9a 92 h....9.:.....-.. 00:20:33.940 000000e0 ea af 33 74 0c 17 41 e5 90 74 c7 43 8e 60 01 d8 ..3t..A..t.C.`.. 00:20:33.940 000000f0 df e7 d5 21 cd 80 13 96 c5 f4 a5 60 09 12 af 5e ...!.......`...^ 00:20:33.940 00000100 96 ae 9b 5a 52 59 95 9c e0 bd e2 99 04 27 b9 af ...ZRY.......'.. 00:20:33.940 00000110 58 4b cb a5 b0 0b 4e 06 66 e3 60 14 a7 f4 71 87 XK....N.f.`...q. 00:20:33.940 00000120 aa 91 5b d8 45 c0 cd 07 b3 74 4f 0b 64 29 53 54 ..[.E....tO.d)ST 00:20:33.940 00000130 5c e0 c0 f1 02 97 44 eb 87 74 ff d8 23 06 e1 ae \.....D..t..#... 00:20:33.940 00000140 ed 1b 90 6c 5d 19 7c c5 35 21 9a b0 5e b2 14 5b ...l].|.5!..^..[ 00:20:33.940 00000150 38 fd 01 d0 b3 25 6a 96 72 b8 97 9c 9d d5 8f ad 8....%j.r....... 00:20:33.940 00000160 31 ba 7f be 84 fa 98 5f 7c 91 34 f0 31 49 11 3d 1......_|.4.1I.= 00:20:33.940 00000170 b8 c2 6c 52 31 96 1a 26 d5 af b0 7f 65 86 74 9d ..lR1..&....e.t. 00:20:33.940 00000180 d0 c1 d8 18 64 48 7c fe 94 f5 f6 6f 5e 48 90 8d ....dH|....o^H.. 00:20:33.940 00000190 a0 f7 9c 23 33 d6 a0 db cf 5f c2 68 67 29 5e fb ...#3...._.hg)^. 00:20:33.940 000001a0 ca 7c 47 ba fc cf 51 3c d3 1c 2b 2a dd b8 47 50 .|G...Q<..+*..GP 00:20:33.940 000001b0 b7 56 b8 70 69 9d a0 8a 31 6a 30 e8 2f a4 0e 53 .V.pi...1j0./..S 00:20:33.940 000001c0 55 71 54 cb fb 58 bd ba 4b 3a a1 72 2e a7 97 cc UqT..X..K:.r.... 00:20:33.940 000001d0 d5 44 3d 42 73 b1 87 c4 2c 4d 65 f6 38 27 57 25 .D=Bs...,Me.8'W% 00:20:33.940 000001e0 81 c6 4a a9 6b 26 96 1c 86 10 5e ca a7 c1 39 43 ..J.k&....^...9C 00:20:33.940 000001f0 a8 eb a5 70 20 4f 0c e8 bf 3c f1 49 14 f4 71 ed ...p O...<.I..q. 00:20:33.940 00000200 92 43 6d dc 52 87 62 35 22 1d 6b 23 bd a8 46 fd .Cm.R.b5".k#..F. 
00:20:33.940 00000210 8b a1 a0 ff 64 3e 09 cb 41 4d ec 52 ae f2 f9 82 ....d>..AM.R.... 00:20:33.940 00000220 96 44 eb de db 29 12 de 73 e6 48 31 61 19 1d f8 .D...)..s.H1a... 00:20:33.940 00000230 70 19 a0 21 2a a3 7b 1d 1a de c5 a0 67 a5 0e 17 p..!*.{.....g... 00:20:33.940 00000240 c3 81 64 1b 8f e3 64 8d 2e 56 04 c1 08 4e 1f df ..d...d..V...N.. 00:20:33.940 00000250 62 08 58 6a 70 16 b6 f4 da 13 11 dd e4 a9 21 2c b.Xjp.........!, 00:20:33.940 00000260 0e 5e 55 3c aa a5 bf 43 11 9b f1 8c c9 17 79 1a .^U<...C......y. 00:20:33.940 00000270 ad a8 85 bb 67 f1 a1 c3 09 f8 6c d5 5a e8 dc 71 ....g.....l.Z..q 00:20:33.940 00000280 5e 68 f9 5a 64 61 1a 42 4d 7e 06 c3 e0 44 da 69 ^h.Zda.BM~...D.i 00:20:33.940 00000290 71 1d 0c 25 4d 49 7b 29 3d bf 3c 2b a4 62 e1 49 q..%MI{)=.<+.b.I 00:20:33.940 000002a0 08 03 a1 92 d2 ae 20 92 e7 25 ab 82 b4 f1 fc f9 ...... ..%...... 00:20:33.940 000002b0 32 5f 16 75 f9 62 48 ff 05 bb 02 78 ce cd 85 c0 2_.u.bH....x.... 00:20:33.940 000002c0 64 90 0f fc 38 16 93 5f 2c ee 1b e1 23 fe 7a 92 d...8.._,...#.z. 00:20:33.940 000002d0 a2 2c 02 e4 d8 dc 0a 3d 0f 44 ca c6 93 fe bc 4d .,.....=.D.....M 00:20:33.940 000002e0 4c 47 6f 92 78 22 a0 b5 e2 35 b6 ee 3e b7 c5 c6 LGo.x"...5..>... 00:20:33.940 000002f0 9a 11 db d7 42 8d e4 aa d0 f4 20 18 8c 94 f1 a8 ....B..... ..... 00:20:33.940 00000300 16 ce 31 70 22 a4 7e d7 39 da e6 77 0d 97 43 4d ..1p".~.9..w..CM 00:20:33.940 00000310 b0 cb 30 ee e2 af 62 41 df a6 83 01 41 71 8e 17 ..0...bA....Aq.. 00:20:33.940 00000320 cc 96 9f 4d 62 82 7c fc 9d 56 ab 41 8e 4a 38 96 ...Mb.|..V.A.J8. 00:20:33.940 00000330 0b 0a b8 d1 1d 94 b7 9e fc fb bc 8f fa 81 be 14 ................ 00:20:33.940 00000340 67 92 4a 8c 18 01 09 48 55 28 32 75 95 b9 86 f8 g.J....HU(2u.... 00:20:33.940 00000350 c5 94 e0 b2 0b 6c c0 01 da ad 94 97 3c d2 71 05 .....l......<.q. 00:20:33.940 00000360 7e de e2 ca 4c b8 a8 69 0c 3f 25 dc 23 b4 c5 03 ~...L..i.?%.#... 00:20:33.940 00000370 1c 8d 45 00 cb 15 68 24 a1 98 d6 42 24 ad 5c 0e ..E...h$...B$.\. 00:20:33.940 00000380 11 b6 0e 53 8d e8 6a d2 23 2f ce da 7a 6c 70 41 ...S..j.#/..zlpA 00:20:33.940 00000390 b7 2f 51 46 89 6c ad 35 16 ca 35 87 72 7e f2 6d ./QF.l.5..5.r~.m 00:20:33.940 000003a0 95 ee 4e 6c 7f 62 f8 a1 5d d4 31 6b 61 d4 28 d0 ..Nl.b..].1ka.(. 00:20:33.940 000003b0 c7 af 7e 75 c7 87 5b e5 87 d8 b8 48 ff a5 51 55 ..~u..[....H..QU 00:20:33.940 000003c0 fe b6 55 eb 08 09 be 92 a9 e7 9b 55 e1 d8 a4 58 ..U........U...X 00:20:33.940 000003d0 6d ae d2 47 38 35 4a 6a 3c d7 c6 3c 45 c6 f3 8c m..G85Jj<...-....Bb..p..I 00:20:33.940 00000260 da 69 e3 c9 d8 51 d0 24 89 a8 2b 8b 5a 18 a2 03 .i...Q.$..+.Z... 00:20:33.940 00000270 db 31 82 eb 39 f2 24 05 a1 97 dc a4 84 6d 48 34 .1..9.$......mH4 00:20:33.940 00000280 6f 73 c3 97 eb 4e 63 71 01 8d 94 6a ba 1a ff d8 os...Ncq...j.... 00:20:33.940 00000290 eb ec c7 6f dd 41 f7 e4 78 0e 5b 88 1d 49 d1 a0 ...o.A..x.[..I.. 00:20:33.940 000002a0 12 0b fd 68 ce 1e 6a b9 eb 80 20 8d 44 bc 78 64 ...h..j... .D.xd 00:20:33.940 000002b0 55 2e a1 b9 bd 9e b9 c0 96 51 ba a3 5c 66 96 f8 U........Q..\f.. 00:20:33.940 000002c0 4e d3 c8 e4 4b b8 f1 98 63 fe 7f 46 a2 4b c0 80 N...K...c..F.K.. 00:20:33.940 000002d0 ac 88 7e 0f 27 86 87 f9 57 55 cf f6 ef bd 71 e4 ..~.'...WU....q. 00:20:33.941 000002e0 66 35 d4 b4 d9 1d 20 2f f7 58 9b b2 2b 6c 79 13 f5.... /.X..+ly. 00:20:33.941 000002f0 48 93 21 6b 61 17 14 85 4c 21 84 c9 76 cb 00 55 H.!ka...L!..v..U 00:20:33.941 00000300 18 ca 14 d6 4f d5 e4 74 b2 40 5c e3 f1 96 e2 17 ....O..t.@\..... 
00:20:33.941 00000310 4a 5f aa 4d 2b f0 e9 ce 77 97 5e f7 dc ed 88 bf J_.M+...w.^..... 00:20:33.941 00000320 85 25 53 6c dd 96 07 4d d7 11 7d 3b b0 bc 9d 9a .%Sl...M..};.... 00:20:33.941 00000330 b7 7c 8c c6 78 fa f9 9f ab 81 58 ff 96 b7 40 6a .|..x.....X...@j 00:20:33.941 00000340 19 af 57 3c 3c 10 20 00 47 39 72 2a 62 2c f5 36 ..W<<. .G9r*b,.6 00:20:33.941 00000350 9d 9d 51 4a 3b ad f1 34 1a 6d 33 77 81 ec 29 bb ..QJ;..4.m3w..). 00:20:33.941 00000360 21 11 15 eb f3 73 db 1a 90 34 f0 33 69 e6 ea 7e !....s...4.3i..~ 00:20:33.941 00000370 57 70 9f 24 0d 78 e1 1a eb 00 e9 d2 a1 6d 42 1d Wp.$.x.......mB. 00:20:33.941 00000380 4b 20 c5 80 95 0c 6e 71 42 0e d3 87 61 49 c5 50 K ....nqB...aI.P 00:20:33.941 00000390 99 b7 c6 d8 b4 09 8e 4a 07 87 37 d7 ad ce 63 4a .......J..7...cJ 00:20:33.941 000003a0 03 23 61 c6 7d 3c 4a 11 38 c8 50 d1 73 5c 14 74 .#a.}.HA. 00:20:33.941 00000050 b1 ea d5 57 b5 f8 3f 75 90 de f3 78 68 fd ce 69 ...W..?u...xh..i 00:20:33.941 00000060 6d 4c e2 9b a8 17 b0 0d c5 c7 d0 13 b7 43 14 8c mL...........C.. 00:20:33.941 00000070 5b c6 dc 17 8f b5 9e f9 20 e2 f0 15 7b 0d 86 f8 [....... ...{... 00:20:33.941 00000080 40 a4 e0 23 e6 04 d3 06 85 1b 9a 56 37 71 55 be @..#.......V7qU. 00:20:33.941 00000090 14 92 ea 28 49 e8 77 12 28 77 95 8a 65 e3 31 a0 ...(I.w.(w..e.1. 00:20:33.941 000000a0 33 2d 8a cc 80 7e 0a a2 9a ba 04 95 ef e9 53 99 3-...~........S. 00:20:33.941 000000b0 e3 78 da 40 66 7b 14 8b f3 fe 26 e0 67 dd 97 ad .x.@f{....&.g... 00:20:33.941 000000c0 6b a8 4e a8 18 02 b0 49 ac 20 a3 77 71 90 07 df k.N....I. .wq... 00:20:33.941 000000d0 21 ce be c6 5b eb 62 88 6c 2e 3b b3 40 9f dd d9 !...[.b.l.;.@... 00:20:33.941 000000e0 1c 97 9b ef f3 63 2f 93 33 fc ce d4 d9 cc 02 94 .....c/.3....... 00:20:33.941 000000f0 89 d0 92 8d 84 16 fd 13 f3 a6 c1 6d f8 bc 35 1e ...........m..5. 00:20:33.941 00000100 ca 59 b8 81 64 aa 39 2f 93 4f 38 95 aa 5e 66 0a .Y..d.9/.O8..^f. 00:20:33.941 00000110 d6 e9 21 65 03 73 ab 00 82 71 09 46 7e 5c ad 08 ..!e.s...q.F~\.. 00:20:33.941 00000120 98 d0 66 a4 d4 1c 72 b7 75 7a 9b a7 95 df a3 4a ..f...r.uz.....J 00:20:33.941 00000130 b3 f7 d0 8b c1 7a 7d 7d da af 45 f8 df f6 60 ef .....z}}..E...`. 00:20:33.941 00000140 cf f7 3f 16 38 26 ad 2e 50 3a c9 63 5f 0b f7 04 ..?.8&..P:.c_... 00:20:33.941 00000150 97 62 20 8e e1 b5 f4 3a 80 72 64 b9 57 b0 91 f0 .b ....:.rd.W... 00:20:33.941 00000160 f2 ee 07 e7 3e af 26 35 44 c4 13 c9 cb a7 0b 62 ....>.&5D......b 00:20:33.941 00000170 4d af e9 28 4c 53 50 f0 37 44 a4 a4 34 91 3c b3 M..(LSP.7D..4.<. 00:20:33.941 00000180 c5 2e 63 e5 f2 c7 38 ee 4c c2 6c 47 d3 ba ef 25 ..c...8.L.lG...% 00:20:33.941 00000190 dc 5e 3b 38 bf 0b cb 24 87 ba af fe 95 a5 e5 40 .^;8...$.......@ 00:20:33.941 000001a0 7a a8 89 09 d5 ed 66 c9 7b af 47 34 88 e4 9e 00 z.....f.{.G4.... 00:20:33.941 000001b0 2c f9 4a 6a e4 c2 ad f3 4d 2b d0 55 1d 75 d8 96 ,.Jj....M+.U.u.. 00:20:33.941 000001c0 12 00 8a 60 5a 2a 1d ca a0 ee de c9 a1 2c 44 a2 ...`Z*.......,D. 00:20:33.941 000001d0 4b 4c 72 a1 8c 05 bb b7 62 bb fd b0 5f c6 55 d6 KLr.....b..._.U. 00:20:33.941 000001e0 4c 93 34 c4 ab a8 65 72 29 b7 77 86 ef e9 c0 d8 L.4...er).w..... 00:20:33.941 000001f0 78 e1 51 b6 9a 9f 9f b8 3f 12 c8 53 75 81 07 de x.Q.....?..Su... 00:20:33.941 00000200 38 bf 07 a8 60 c6 0f 89 6e 56 f3 93 82 14 45 11 8...`...nV....E. 00:20:33.941 00000210 0d f1 30 c8 05 7e fa 82 7d 5f f1 19 4b 7c ce 8e ..0..~..}_..K|.. 
00:20:33.941 00000220 98 67 8e a2 21 22 b4 18 78 c9 b7 64 f1 15 fc 6e .g..!"..x..d...n 00:20:33.941 00000230 f1 72 e5 01 cf fb bc fe f3 45 cd 23 1b 33 ba 63 .r.......E.#.3.c 00:20:33.941 00000240 c4 81 92 a1 e2 0f 05 6b 9d 02 37 79 58 5c c4 20 .......k..7yX\. 00:20:33.941 00000250 8f 2f ed ae 00 99 9f e6 58 35 64 13 d5 ba 55 54 ./......X5d...UT 00:20:33.941 00000260 90 ed c8 ee 97 15 d4 07 ae 03 7f c2 15 60 45 ec .............`E. 00:20:33.941 00000270 87 e8 8c d6 9a 78 02 76 a2 04 92 dd bb 36 e3 b9 .....x.v.....6.. 00:20:33.941 00000280 ec 70 7b de 80 21 0d 7f b6 4d a3 b8 2c 79 4c ee .p{..!...M..,yL. 00:20:33.941 00000290 31 7a 2f a0 98 85 b3 fe 7d 73 32 d2 45 b3 d8 14 1z/.....}s2.E... 00:20:33.941 000002a0 13 4d b8 2f fe ba 2a 14 80 1e 7f 38 5c 47 c9 21 .M./..*....8\G.! 00:20:33.941 000002b0 4a 13 e9 bc bd 70 98 ac 0e 1f 16 fa 54 49 f5 15 J....p......TI.. 00:20:33.941 000002c0 f9 be 4c b8 d3 d5 0b 84 ea 9c f8 a8 ba 3a 14 c7 ..L..........:.. 00:20:33.941 000002d0 67 0c 95 a3 2a 24 50 ea 29 32 9b 47 68 bb 29 e7 g...*$P.)2.Gh.). 00:20:33.941 000002e0 89 06 fa e4 00 ac 66 34 2d 6b ce 96 60 f8 56 ee ......f4-k..`.V. 00:20:33.941 000002f0 ab 7e 64 15 5c 87 cb a0 32 ca 26 bb 3e 9a 68 34 .~d.\...2.&.>.h4 00:20:33.941 00000300 31 58 f8 24 e1 cf b2 5a ae 4d 4b 53 7f 85 29 4d 1X.$...Z.MKS..)M 00:20:33.941 00000310 d0 4e 9c 16 fc 54 9d 96 9b 14 33 e6 ea 66 6b 57 .N...T....3..fkW 00:20:33.941 00000320 c1 2d 9d 99 a2 7c a8 13 d2 fb 22 a6 b4 c2 4b 9f .-...|...."...K. 00:20:33.941 00000330 f5 30 19 07 ef 83 bb 31 85 5b 87 aa 61 1f ed 5e .0.....1.[..a..^ 00:20:33.941 00000340 ff 66 0f 7d 0d 42 a8 a4 d1 04 f0 f2 a2 da 3a c9 .f.}.B........:. 00:20:33.941 00000350 80 f9 a2 55 3b 23 e2 e3 b9 48 b0 f9 e2 34 d8 bf ...U;#...H...4.. 00:20:33.941 00000360 6e 4d d6 d8 be 5d 16 e0 94 7b a5 4f a1 9c 3e 53 nM...]...{.O..>S 00:20:33.941 00000370 f5 f4 33 d7 0b 8b 65 17 7d 88 fd b1 0e 14 0d 11 ..3...e.}....... 00:20:33.941 00000380 2a 39 8d 22 46 65 97 14 07 84 38 5a b9 5d 37 8c *9."Fe....8Z.]7. 00:20:33.941 00000390 fb bd 9e 42 92 b0 69 c4 6d bb 8e 72 5c 1b 68 db ...B..i.m..r\.h. 00:20:33.941 000003a0 cc e2 d7 9b 3d 56 15 62 73 ad 9d 03 51 85 37 e0 ....=V.bs...Q.7. 00:20:33.941 000003b0 71 41 b9 4e ca a3 52 d3 b4 ff a2 39 5a 85 bc 4b qA.N..R....9Z..K 00:20:33.941 000003c0 f5 1b bd 12 46 3d 3d b1 b4 54 ac 49 6a a2 8d 33 ....F==..T.Ij..3 00:20:33.941 000003d0 9f 57 17 e9 1a 5a 5e 33 e0 74 ff 3d 68 de 40 70 .W...Z^3.t.=h.@p 00:20:33.941 000003e0 c1 e5 dd 47 7b 0f 3d ac 21 0b d3 1c 49 10 82 bf ...G{.=.!...I... 00:20:33.941 000003f0 91 71 6a a2 85 5a 81 6d 34 d8 e0 d5 85 68 21 47 .qj..Z.m4....h!G 00:20:33.941 host pubkey: 00:20:33.941 00000000 ad 7b f0 65 73 7d 23 d0 c8 3e cd 47 9f bf d4 32 .{.es}#..>.G...2 00:20:33.941 00000010 0b 83 fc 80 5d 31 98 e3 42 57 c3 9e d0 f6 74 65 ....]1..BW....te 00:20:33.941 00000020 f1 ec b6 07 68 a3 78 54 76 2b 1f 7f 43 45 05 d9 ....h.xTv+..CE.. 00:20:33.941 00000030 52 84 86 4c a1 d9 10 be f8 c5 1a b3 21 bf 86 50 R..L........!..P 00:20:33.941 00000040 1b 68 9d c0 55 9e 7c 8e ee b3 d4 28 25 91 91 6e .h..U.|....(%..n 00:20:33.941 00000050 e1 35 0a 5f e6 3c 0a 07 97 1b 3a 09 80 db 57 97 .5._.<....:...W. 00:20:33.941 00000060 bb e0 06 1a 7c c9 cd 91 74 6b c6 e0 18 32 f8 2e ....|...tk...2.. 00:20:33.941 00000070 0e d9 8e 38 6a 86 80 26 70 d8 1a 13 59 7a 5c d8 ...8j..&p...Yz\. 00:20:33.941 00000080 5b e4 1b 1b 64 e4 8b 2e 78 4d 4a 43 36 c0 a4 1a [...d...xMJC6... 00:20:33.941 00000090 25 fb 9a 38 98 61 73 3d 8b fe 32 51 43 6a f0 fd %..8.as=..2QCj.. 
00:20:33.941 000000a0 78 1d 40 49 e1 d2 d8 7e 88 e4 7b ed a7 79 6a ae x.@I...~..{..yj. 00:20:33.941 000000b0 5c c9 07 9a 55 96 89 6a 23 4b 08 a6 4d 9d 38 14 \...U..j#K..M.8. 00:20:33.941 000000c0 a5 8d a0 18 36 5b f1 a1 cd b7 1c 76 a0 bd 9b 28 ....6[.....v...( 00:20:33.941 000000d0 91 42 9b 3d b5 62 90 ce 99 27 df 2f 57 2b ea 11 .B.=.b...'./W+.. 00:20:33.941 000000e0 5f 88 57 a3 c9 e1 3e 60 02 66 e7 9c 85 2c a0 a3 _.W...>`.f...,.. 00:20:33.941 000000f0 91 00 63 50 ee a3 b1 de bf 1d 91 6b c0 0a a1 88 ..cP.......k.... 00:20:33.941 00000100 e3 49 2e 13 30 66 b6 00 39 f2 80 4c 3e be 48 68 .I..0f..9..L>.Hh 00:20:33.941 00000110 f0 cb 47 8a 17 46 37 b0 31 1f 69 d1 4b 48 ea 5a ..G..F7.1.i.KH.Z 00:20:33.941 00000120 e5 fa d1 94 67 22 1a bd d3 73 10 2a d6 4f 3a 4b ....g"...s.*.O:K 00:20:33.941 00000130 e3 8a ac b8 5a 3c e9 08 29 81 4e d6 54 d2 16 53 ....Z<..).N.T..S 00:20:33.941 00000140 50 5a 3d 54 16 dd 99 d4 a4 ce e5 df c0 ec 10 1a PZ=T............ 00:20:33.941 00000150 02 99 dd 23 fd 38 ee dd 46 d8 52 69 1e 19 fc 04 ...#.8..F.Ri.... 00:20:33.941 00000160 7b 9a 21 51 86 17 04 e1 04 f1 6b a5 be e3 3d b3 {.!Q......k...=. 00:20:33.941 00000170 0a f9 be 0b 98 76 13 f2 08 37 9f 11 da 9d c8 c6 .....v...7...... 00:20:33.941 00000180 c4 a5 f6 2b e1 3b 34 94 54 77 dd f2 2b 8d 45 44 ...+.;4.Tw..+.ED 00:20:33.941 00000190 2d 0b 72 6b 24 d2 c0 e9 05 94 6d a5 e6 1f 30 cb -.rk$.....m...0. 00:20:33.941 000001a0 1e 4c 81 b3 91 c6 05 1d b9 e4 a2 fa 23 a9 e9 4c .L..........#..L 00:20:33.941 000001b0 e5 9b 78 94 1a 92 ee 6c 81 5b e4 00 0c 4f 9f ae ..x....l.[...O.. 00:20:33.941 000001c0 f2 7a 9f ae a2 9b f4 5e 6a 97 b0 80 64 27 74 7a .z.....^j...d'tz 00:20:33.941 000001d0 a5 b9 7a 19 d9 13 b4 e8 ec 49 79 24 5e 2c 24 96 ..z......Iy$^,$. 00:20:33.941 000001e0 2e cc 67 e4 32 af 48 a6 8d 0c 75 5d 6a 43 46 30 ..g.2.H...u]jCF0 00:20:33.941 000001f0 f6 f4 e7 96 84 3e a1 47 25 0d 16 e4 16 fc be cd .....>.G%....... 00:20:33.941 00000200 28 4c 56 9e e7 d3 f0 f0 a6 5b e2 70 f5 79 8f da (LV......[.p.y.. 00:20:33.941 00000210 ba 9e 31 70 dd 70 5f c1 aa 04 de 70 36 32 b3 7e ..1p.p_....p62.~ 00:20:33.942 00000220 e2 e5 38 bc 17 ad 75 14 34 74 cd fa 2a 05 32 87 ..8...u.4t..*.2. 00:20:33.942 00000230 d4 2a b6 f0 14 6d e0 10 d1 15 1c 1d 16 1c a7 c6 .*...m.......... 00:20:33.942 00000240 f4 02 3d a1 77 54 7e 66 1f a0 c3 10 a1 ff b3 e3 ..=.wT~f........ 00:20:33.942 00000250 66 ac 7a 8b 59 a6 09 24 b8 a2 cc ea c3 28 25 00 f.z.Y..$.....(%. 00:20:33.942 00000260 d3 45 17 0f aa 6d b1 d2 2a d1 0b d8 aa 2f 99 74 .E...m..*..../.t 00:20:33.942 00000270 61 8c 2d 92 ce 8e a4 25 29 02 83 d5 10 45 b5 76 a.-....%)....E.v 00:20:33.942 00000280 91 e9 f2 7a 32 b6 d4 ce 92 43 56 b9 e4 90 c7 07 ...z2....CV..... 00:20:33.942 00000290 a5 55 eb 5d bf c0 4b 4e 53 d5 6d f8 29 8a f6 23 .U.]..KNS.m.)..# 00:20:33.942 000002a0 ff 3a f0 10 ab 0f 7b 7d 2c cd a4 f5 d8 5c 99 f9 .:....{},....\.. 00:20:33.942 000002b0 a1 ed 5c 75 ef 03 90 c9 9d f4 3a ad 6f e2 dc 03 ..\u......:.o... 00:20:33.942 000002c0 1c 51 bb 04 69 a2 30 20 6c ae b9 1d 98 79 44 a3 .Q..i.0 l....yD. 00:20:33.942 000002d0 0f a1 ea b5 d2 36 d9 67 f7 86 6d 77 8a 2c a5 e4 .....6.g..mw.,.. 00:20:33.942 000002e0 01 85 66 68 4d 17 c7 3b 62 1b 64 db f8 e2 68 79 ..fhM..;b.d...hy 00:20:33.942 000002f0 a3 63 4c d0 85 4a ff c1 9f be 55 20 d6 cd df a8 .cL..J....U .... 00:20:33.942 00000300 11 84 93 43 a7 d2 a4 bf 50 7f 56 08 13 4c d2 63 ...C....P.V..L.c 00:20:33.942 00000310 bd a1 28 fc 4d 78 ba 45 8c 04 6d 93 df 72 5c a6 ..(.Mx.E..m..r\. 
00:20:33.942 00000320 9c ed e7 b9 34 59 83 9d 1b fb 3c 22 a6 a9 4c d6 ....4Y....<"..L. 00:20:33.942 00000330 8a 4b ee 21 2f 5d 36 be f7 70 03 04 6e 69 9b 24 .K.!/]6..p..ni.$ 00:20:33.942 00000340 b9 e3 d6 4b 1f 63 3b a0 5a c7 0d 1b 1b 93 5a ae ...K.c;.Z.....Z. 00:20:33.942 00000350 b6 81 c8 2d 03 95 b0 41 83 2b 6b e8 2c da 62 e0 ...-...A.+k.,.b. 00:20:33.942 00000360 aa 71 6f 9f c0 81 c6 03 43 5d 6c 40 1e e9 96 e1 .qo.....C]l@.... 00:20:33.942 00000370 9f 80 dd 42 8c 4a fc 0d f4 ef e3 a1 c9 1c a6 5a ...B.J.........Z 00:20:33.942 00000380 b1 88 02 66 f4 2b b5 cd 89 2a 5d 82 ce 17 e6 ce ...f.+...*]..... 00:20:33.942 00000390 96 7a ef 3a 2b 55 55 df 46 6e 7d c1 f8 23 a4 81 .z.:+UU.Fn}..#.. 00:20:33.942 000003a0 4d 8f f2 83 6f 15 5c 8f a9 b5 93 0d 51 e5 58 84 M...o.\.....Q.X. 00:20:33.942 000003b0 70 76 5d b0 6a 78 aa 40 0a 07 31 6d 1f dd ff f5 pv].jx.@..1m.... 00:20:33.942 000003c0 67 75 ee c8 6c 10 f8 68 ea 22 bd e3 2c 67 d2 c6 gu..l..h."..,g.. 00:20:33.942 000003d0 5e 5f b2 16 b9 5c 51 2a c3 1d d9 b8 a4 2a 8a 71 ^_...\Q*.....*.q 00:20:33.942 000003e0 2b 87 a9 99 78 12 c9 e0 2a af f6 b3 99 30 db fd +...x...*....0.. 00:20:33.942 000003f0 7b 18 79 eb 27 6c 3a b0 7d ec 4d e5 d8 e8 99 df {.y.'l:.}.M..... 00:20:33.942 dh secret: 00:20:33.942 00000000 0f 31 63 af 1d b6 72 0e 42 33 94 39 4f 5b a0 e6 .1c...r.B3.9O[.. 00:20:33.942 00000010 a1 cd de 04 68 b9 bb 07 68 01 2d a8 d6 5d c3 9d ....h...h.-..].. 00:20:33.942 00000020 33 0c ed 7f 79 27 d2 24 3b 82 35 d9 18 55 f2 26 3...y'.$;.5..U.& 00:20:33.942 00000030 7d 29 46 1c 52 a1 62 a6 dc 96 5f 5c f5 81 19 4a })F.R.b..._\...J 00:20:33.942 00000040 ed fd f8 19 ed 69 3c fe 41 9b f9 3b 8b 51 2e 1a .....i<.A..;.Q.. 00:20:33.942 00000050 68 09 6e 54 39 d2 dc f0 4a d2 ca 85 22 15 09 c8 h.nT9...J..."... 00:20:33.942 00000060 45 50 bb e3 75 a9 1f 62 b4 6f 43 9d a7 64 25 21 EP..u..b.oC..d%! 00:20:33.942 00000070 d0 bd a1 e1 2b 06 26 17 e7 a2 04 f7 dd d5 d8 e2 ....+.&......... 00:20:33.942 00000080 f6 3b 6a 36 3f 98 56 f4 3c c7 4c c8 7c 7d 50 0f .;j6?.V.<.L.|}P. 00:20:33.942 00000090 de 6c 16 39 cb 21 d5 89 bb fc ee 94 93 a8 d1 22 .l.9.!........." 00:20:33.942 000000a0 22 c5 fc 98 86 dc ae 32 7f 01 ff 8a e4 ea 8f a8 "......2........ 00:20:33.942 000000b0 23 c4 25 02 86 84 01 1f b0 53 31 79 55 0f 74 ee #.%......S1yU.t. 00:20:33.942 000000c0 f5 e3 06 82 4d 03 bc a2 21 53 2f a4 04 19 11 bd ....M...!S/..... 00:20:33.942 000000d0 38 c8 d4 a5 c5 29 5e 7e d9 b1 1d 70 d5 6f 0e 80 8....)^~...p.o.. 00:20:33.942 000000e0 b3 a5 7f ef d6 fc 04 c4 61 9a 2e 76 b0 5e 91 58 ........a..v.^.X 00:20:33.942 000000f0 bc 5a 18 10 71 4c eb 5f b1 15 75 44 d8 f7 33 0f .Z..qL._..uD..3. 00:20:33.942 00000100 50 58 47 ba 68 cc 96 1a 48 da 2b 18 7e e3 bd 0b PXG.h...H.+.~... 00:20:33.942 00000110 57 02 35 6b 47 0d c3 38 05 f9 70 85 e6 e1 33 eb W.5kG..8..p...3. 00:20:33.942 00000120 97 15 48 a2 a3 46 5f a8 dc 4c 0a a0 ab aa 46 6f ..H..F_..L....Fo 00:20:33.942 00000130 c1 d0 df 6d 06 c7 28 07 42 45 0e c4 d3 bd 12 18 ...m..(.BE...... 00:20:33.942 00000140 c1 f0 60 24 d7 10 8f 03 3e 31 d4 96 9b ef 92 d4 ..`$....>1...... 00:20:33.942 00000150 31 f2 44 97 f9 63 57 09 b2 8d c9 29 10 1f 08 1b 1.D..cW....).... 00:20:33.942 00000160 fb fa aa 85 f6 31 6a bb b9 82 a8 dd 65 7d 53 65 .....1j.....e}Se 00:20:33.942 00000170 15 36 13 b5 4f 69 54 e1 8d 00 e7 9c 04 1d 11 86 .6..OiT......... 00:20:33.942 00000180 11 f7 03 64 ff bd 80 ac dc 4b 5e 38 af 25 10 5c ...d.....K^8.%.\ 00:20:33.942 00000190 7a 0a 29 5e d0 77 05 17 04 d6 ed 39 fb d6 6e ad z.)^.w.....9..n. 
00:20:33.942 000001a0 72 ac 45 fd 54 b8 21 6f 60 70 94 29 a2 74 2c 78 r.E.T.!o`p.).t,x 00:20:33.942 000001b0 33 24 61 67 36 67 80 90 1c e9 bd 4f 31 6f a5 30 3$ag6g.....O1o.0 00:20:33.942 000001c0 4b 17 07 46 6d 0c b9 dc d5 fb 8b 83 4e 95 73 10 K..Fm.......N.s. 00:20:33.942 000001d0 9c f4 c5 49 7b 7b 42 5c 39 9d 48 cb 59 d2 56 e7 ...I{{B\9.H.Y.V. 00:20:33.942 000001e0 f8 5e 46 3e 7a d7 e2 74 ac 0e c2 92 ad fe 86 45 .^F>z..t.......E 00:20:33.942 000001f0 51 31 5a 88 b8 cd 0e de 3a ae 3c ef d8 3e 2e cf Q1Z.....:.<..>.. 00:20:33.942 00000200 ab 5a 77 fc 1b db 85 28 55 02 11 10 00 ae 85 26 .Zw....(U......& 00:20:33.942 00000210 c5 c0 66 30 ea 71 61 39 31 33 18 8e e5 fb f1 1d ..f0.qa913...... 00:20:33.942 00000220 be 96 37 03 ab bb b7 b3 e9 82 16 80 77 cc 03 81 ..7.........w... 00:20:33.942 00000230 1d 9f ba c6 65 d3 ad 6c 10 33 ea a7 13 38 32 66 ....e..l.3...82f 00:20:33.942 00000240 1e 09 72 6b 90 8e 07 02 d1 2a dc cf 74 2a cc a7 ..rk.....*..t*.. 00:20:33.942 00000250 e0 32 b5 96 44 7b 1c ec ff 8c 23 79 94 f0 c4 8c .2..D{....#y.... 00:20:33.942 00000260 96 1b 9c f9 c6 91 a3 5f 4d 7f 37 70 a7 70 92 bf ......._M.7p.p.. 00:20:33.942 00000270 d9 83 6e 7b 2a 85 ca 3d 10 86 be 89 2b c0 d1 bb ..n{*..=....+... 00:20:33.942 00000280 5b e9 b4 45 a2 3b a1 dc c4 72 65 ac 1b 13 61 b6 [..E.;...re...a. 00:20:33.942 00000290 58 d2 e7 11 83 cc 75 04 03 65 01 23 57 61 24 08 X.....u..e.#Wa$. 00:20:33.942 000002a0 02 f2 1e 2c 1f f7 b5 88 dd 78 a0 17 4e 87 d7 e8 ...,.....x..N... 00:20:33.942 000002b0 e4 ed c7 08 af 13 a2 cf 53 57 83 f9 3d 36 a4 96 ........SW..=6.. 00:20:33.942 000002c0 c1 0a 3e 0e 7e 6e 21 82 cf b1 0c dd 1f d0 6f 1d ..>.~n!.......o. 00:20:33.942 000002d0 c2 93 67 ba b2 f4 aa c7 01 a7 f7 25 1b 31 47 78 ..g........%.1Gx 00:20:33.942 000002e0 ef 94 09 93 49 1f e5 2e 7c 31 f7 2e 09 03 24 d1 ....I...|1....$. 00:20:33.942 000002f0 cb f4 90 a5 00 a0 5c b9 fd c2 8a dd d1 aa c2 b4 ......\......... 00:20:33.942 00000300 2d 2e 40 8a 00 ec f6 8b 32 96 79 50 a1 65 2d 86 -.@.....2.yP.e-. 00:20:33.942 00000310 83 f7 ac 99 31 6e f1 6c 5b 37 c0 7a 74 12 00 7d ....1n.l[7.zt..} 00:20:33.942 00000320 29 77 46 5a 18 24 86 1a 07 3a 37 fa 27 2b e4 bd )wFZ.$...:7.'+.. 00:20:33.942 00000330 43 4d b4 88 ff b8 8a df ec 3a 68 fb 84 af 82 65 CM.......:h....e 00:20:33.942 00000340 94 7e 66 31 c9 7b 75 87 96 81 1b 30 90 ad 49 94 .~f1.{u....0..I. 00:20:33.942 00000350 da 6e b6 ec e0 54 44 1f a1 d9 3f cc 6a 92 ce 07 .n...TD...?.j... 00:20:33.942 00000360 d2 70 1e 65 20 81 7b 58 5e b3 18 c9 9a a1 67 e8 .p.e .{X^.....g. 00:20:33.942 00000370 36 07 49 25 a7 73 c7 e0 a1 21 6d 60 1c e4 28 b2 6.I%.s...!m`..(. 00:20:33.942 00000380 af 46 f8 db a4 2d 67 a5 dd 14 90 65 8f cd 86 7e .F...-g....e...~ 00:20:33.942 00000390 e4 04 46 89 aa 5e a1 8b 44 d8 cd 48 f6 b2 9f 9b ..F..^..D..H.... 00:20:33.942 000003a0 53 0f 31 98 cc a7 f1 fc e3 d7 d0 f1 de 5f 07 48 S.1.........._.H 00:20:33.942 000003b0 bb ee a2 06 96 a1 82 76 4f b8 17 9b f7 89 5a 0a .......vO.....Z. 00:20:33.942 000003c0 ae c8 ac 10 91 2c 14 77 be 9b b2 db 9b ba 49 99 .....,.w......I. 00:20:33.942 000003d0 ef 92 34 82 a8 82 09 2d 40 72 f9 19 53 ae 7c 06 ..4....-@r..S.|. 00:20:33.942 000003e0 26 fa c4 3c 3b 50 09 02 d6 2a cd 1c 00 47 c2 28 &..<;P...*...G.( 00:20:33.942 000003f0 1a 70 27 8b 16 49 96 21 f8 31 b3 14 9e f8 f3 1d .p'..I.!.1...... 
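The *DEBUG* lines surrounding these dumps trace one complete DH-HMAC-CHAP exchange per qpair: negotiate (digest 2 = sha384, dhgroup 5 = ffdhe8192), await-challenge, send_reply (key, hash, dhgroup, seq, len=48), then await-success1 / await-success2 and done; the ctrlr pubkey, host pubkey, and dh secret dumps each span offsets 00000000 through 000003f0, i.e. 1024 bytes, which matches the 8192-bit FFDHE group named in the negotiate lines. The following is a minimal, hypothetical Python sketch (parse_auth_log, qpair_id, STATE_RE, and REPLY_RE are illustrative names, not part of SPDK or its test scripts) showing one way to scrape the per-qpair state transitions and send_reply parameters out of a log like this:

import re
from collections import defaultdict

# Hypothetical log-scraping helper, not part of SPDK: pulls the nvme_auth
# state transitions and nvme_auth_send_reply parameters out of autotest
# console output such as the lines above.
STATE_RE = re.compile(
    r"nvme_auth_set_state: \*DEBUG\*: \[(?P<ids>[^\]]+)\] auth state: (?P<state>[\w-]+)")
REPLY_RE = re.compile(
    r"nvme_auth_send_reply: \*DEBUG\*: \[(?P<ids>[^\]]+)\] "
    r"key=(?P<key>\w+), hash=(?P<hash>\d+), dhgroup=(?P<dhgroup>\d+), seq=(?P<seq>\d+)")

def qpair_id(ids: str) -> int:
    # The bracketed tag is "<subnqn>:<hostnqn>:<qid>"; the qpair id is the
    # last colon-separated field (the NQNs themselves contain colons).
    return int(ids.rsplit(":", 1)[1])

def parse_auth_log(text: str):
    """Return {qid: [states...]} plus a list of send_reply parameter dicts."""
    states, replies = defaultdict(list), []
    for line in text.splitlines():
        if m := STATE_RE.search(line):
            states[qpair_id(m["ids"])].append(m["state"])
        elif m := REPLY_RE.search(line):
            replies.append({"qid": qpair_id(m["ids"]), "key": m["key"],
                            "hash": int(m["hash"]), "dhgroup": int(m["dhgroup"]),
                            "seq": int(m["seq"])})
    return dict(states), replies

# Two lines copied verbatim from this log:
sample = (
    "[2024-09-27 15:25:22.329805] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: "
    "[nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=2, "
    "dhgroup=5, seq=3428451787, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, "
    "hostnqn=nqn.2024-02.io.spdk:host0, len=48\n"
    "[2024-09-27 15:25:22.388759] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: "
    "[nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply\n")
states, replies = parse_auth_log(sample)
print(states)   # {0: ['await-reply']}
print(replies)  # [{'qid': 0, 'key': 'key1', 'hash': 2, 'dhgroup': 5, 'seq': 3428451787}]

Run over the surrounding lines it would, for example, collect the qpair-0 sequence that ends in "done" and the qpair-1 sequence that has reached "await-challenge" at this point in the log.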
00:20:33.942 [2024-09-27 15:25:22.981018] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=2, dhgroup=5, seq=3428451789, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.942 [2024-09-27 15:25:23.040698] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.942 [2024-09-27 15:25:23.040746] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.942 [2024-09-27 15:25:23.040763] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.942 [2024-09-27 15:25:23.040782] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.942 [2024-09-27 15:25:23.040797] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.942 [2024-09-27 15:25:23.146621] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.942 [2024-09-27 15:25:23.146639] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.942 [2024-09-27 15:25:23.146647] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.942 [2024-09-27 15:25:23.146657] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.942 [2024-09-27 15:25:23.146711] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.942 ctrlr pubkey: 00:20:33.942 00000000 65 c3 62 5e ed 7a e9 1e 50 a9 20 ae c6 6c fd 6e e.b^.z..P. ..l.n 00:20:33.942 00000010 97 75 5c 29 75 bf e4 01 0b d5 c0 79 53 2f a1 fc .u\)u......yS/.. 00:20:33.942 00000020 64 c1 ad 94 30 d0 08 59 09 19 3b e6 71 e1 aa 79 d...0..Y..;.q..y 00:20:33.942 00000030 14 8b 8c ae 14 1f 64 23 fc 00 0a b3 d1 85 52 09 ......d#......R. 00:20:33.942 00000040 c8 2c db 8a 4c 49 24 89 c3 ee d5 3e f4 48 41 d4 .,..LI$....>.HA. 00:20:33.942 00000050 b1 ea d5 57 b5 f8 3f 75 90 de f3 78 68 fd ce 69 ...W..?u...xh..i 00:20:33.942 00000060 6d 4c e2 9b a8 17 b0 0d c5 c7 d0 13 b7 43 14 8c mL...........C.. 00:20:33.942 00000070 5b c6 dc 17 8f b5 9e f9 20 e2 f0 15 7b 0d 86 f8 [....... ...{... 00:20:33.942 00000080 40 a4 e0 23 e6 04 d3 06 85 1b 9a 56 37 71 55 be @..#.......V7qU. 00:20:33.942 00000090 14 92 ea 28 49 e8 77 12 28 77 95 8a 65 e3 31 a0 ...(I.w.(w..e.1. 00:20:33.942 000000a0 33 2d 8a cc 80 7e 0a a2 9a ba 04 95 ef e9 53 99 3-...~........S. 00:20:33.942 000000b0 e3 78 da 40 66 7b 14 8b f3 fe 26 e0 67 dd 97 ad .x.@f{....&.g... 00:20:33.942 000000c0 6b a8 4e a8 18 02 b0 49 ac 20 a3 77 71 90 07 df k.N....I. .wq... 00:20:33.942 000000d0 21 ce be c6 5b eb 62 88 6c 2e 3b b3 40 9f dd d9 !...[.b.l.;.@... 00:20:33.942 000000e0 1c 97 9b ef f3 63 2f 93 33 fc ce d4 d9 cc 02 94 .....c/.3....... 00:20:33.942 000000f0 89 d0 92 8d 84 16 fd 13 f3 a6 c1 6d f8 bc 35 1e ...........m..5. 00:20:33.942 00000100 ca 59 b8 81 64 aa 39 2f 93 4f 38 95 aa 5e 66 0a .Y..d.9/.O8..^f. 
00:20:33.942 00000110 d6 e9 21 65 03 73 ab 00 82 71 09 46 7e 5c ad 08 ..!e.s...q.F~\.. 00:20:33.942 00000120 98 d0 66 a4 d4 1c 72 b7 75 7a 9b a7 95 df a3 4a ..f...r.uz.....J 00:20:33.942 00000130 b3 f7 d0 8b c1 7a 7d 7d da af 45 f8 df f6 60 ef .....z}}..E...`. 00:20:33.942 00000140 cf f7 3f 16 38 26 ad 2e 50 3a c9 63 5f 0b f7 04 ..?.8&..P:.c_... 00:20:33.942 00000150 97 62 20 8e e1 b5 f4 3a 80 72 64 b9 57 b0 91 f0 .b ....:.rd.W... 00:20:33.942 00000160 f2 ee 07 e7 3e af 26 35 44 c4 13 c9 cb a7 0b 62 ....>.&5D......b 00:20:33.942 00000170 4d af e9 28 4c 53 50 f0 37 44 a4 a4 34 91 3c b3 M..(LSP.7D..4.<. 00:20:33.943 00000180 c5 2e 63 e5 f2 c7 38 ee 4c c2 6c 47 d3 ba ef 25 ..c...8.L.lG...% 00:20:33.943 00000190 dc 5e 3b 38 bf 0b cb 24 87 ba af fe 95 a5 e5 40 .^;8...$.......@ 00:20:33.943 000001a0 7a a8 89 09 d5 ed 66 c9 7b af 47 34 88 e4 9e 00 z.....f.{.G4.... 00:20:33.943 000001b0 2c f9 4a 6a e4 c2 ad f3 4d 2b d0 55 1d 75 d8 96 ,.Jj....M+.U.u.. 00:20:33.943 000001c0 12 00 8a 60 5a 2a 1d ca a0 ee de c9 a1 2c 44 a2 ...`Z*.......,D. 00:20:33.943 000001d0 4b 4c 72 a1 8c 05 bb b7 62 bb fd b0 5f c6 55 d6 KLr.....b..._.U. 00:20:33.943 000001e0 4c 93 34 c4 ab a8 65 72 29 b7 77 86 ef e9 c0 d8 L.4...er).w..... 00:20:33.943 000001f0 78 e1 51 b6 9a 9f 9f b8 3f 12 c8 53 75 81 07 de x.Q.....?..Su... 00:20:33.943 00000200 38 bf 07 a8 60 c6 0f 89 6e 56 f3 93 82 14 45 11 8...`...nV....E. 00:20:33.943 00000210 0d f1 30 c8 05 7e fa 82 7d 5f f1 19 4b 7c ce 8e ..0..~..}_..K|.. 00:20:33.943 00000220 98 67 8e a2 21 22 b4 18 78 c9 b7 64 f1 15 fc 6e .g..!"..x..d...n 00:20:33.943 00000230 f1 72 e5 01 cf fb bc fe f3 45 cd 23 1b 33 ba 63 .r.......E.#.3.c 00:20:33.943 00000240 c4 81 92 a1 e2 0f 05 6b 9d 02 37 79 58 5c c4 20 .......k..7yX\. 00:20:33.943 00000250 8f 2f ed ae 00 99 9f e6 58 35 64 13 d5 ba 55 54 ./......X5d...UT 00:20:33.943 00000260 90 ed c8 ee 97 15 d4 07 ae 03 7f c2 15 60 45 ec .............`E. 00:20:33.943 00000270 87 e8 8c d6 9a 78 02 76 a2 04 92 dd bb 36 e3 b9 .....x.v.....6.. 00:20:33.943 00000280 ec 70 7b de 80 21 0d 7f b6 4d a3 b8 2c 79 4c ee .p{..!...M..,yL. 00:20:33.943 00000290 31 7a 2f a0 98 85 b3 fe 7d 73 32 d2 45 b3 d8 14 1z/.....}s2.E... 00:20:33.943 000002a0 13 4d b8 2f fe ba 2a 14 80 1e 7f 38 5c 47 c9 21 .M./..*....8\G.! 00:20:33.943 000002b0 4a 13 e9 bc bd 70 98 ac 0e 1f 16 fa 54 49 f5 15 J....p......TI.. 00:20:33.943 000002c0 f9 be 4c b8 d3 d5 0b 84 ea 9c f8 a8 ba 3a 14 c7 ..L..........:.. 00:20:33.943 000002d0 67 0c 95 a3 2a 24 50 ea 29 32 9b 47 68 bb 29 e7 g...*$P.)2.Gh.). 00:20:33.943 000002e0 89 06 fa e4 00 ac 66 34 2d 6b ce 96 60 f8 56 ee ......f4-k..`.V. 00:20:33.943 000002f0 ab 7e 64 15 5c 87 cb a0 32 ca 26 bb 3e 9a 68 34 .~d.\...2.&.>.h4 00:20:33.943 00000300 31 58 f8 24 e1 cf b2 5a ae 4d 4b 53 7f 85 29 4d 1X.$...Z.MKS..)M 00:20:33.943 00000310 d0 4e 9c 16 fc 54 9d 96 9b 14 33 e6 ea 66 6b 57 .N...T....3..fkW 00:20:33.943 00000320 c1 2d 9d 99 a2 7c a8 13 d2 fb 22 a6 b4 c2 4b 9f .-...|...."...K. 00:20:33.943 00000330 f5 30 19 07 ef 83 bb 31 85 5b 87 aa 61 1f ed 5e .0.....1.[..a..^ 00:20:33.943 00000340 ff 66 0f 7d 0d 42 a8 a4 d1 04 f0 f2 a2 da 3a c9 .f.}.B........:. 00:20:33.943 00000350 80 f9 a2 55 3b 23 e2 e3 b9 48 b0 f9 e2 34 d8 bf ...U;#...H...4.. 00:20:33.943 00000360 6e 4d d6 d8 be 5d 16 e0 94 7b a5 4f a1 9c 3e 53 nM...]...{.O..>S 00:20:33.943 00000370 f5 f4 33 d7 0b 8b 65 17 7d 88 fd b1 0e 14 0d 11 ..3...e.}....... 00:20:33.943 00000380 2a 39 8d 22 46 65 97 14 07 84 38 5a b9 5d 37 8c *9."Fe....8Z.]7. 
00:20:33.943 00000390 fb bd 9e 42 92 b0 69 c4 6d bb 8e 72 5c 1b 68 db ...B..i.m..r\.h. 00:20:33.943 000003a0 cc e2 d7 9b 3d 56 15 62 73 ad 9d 03 51 85 37 e0 ....=V.bs...Q.7. 00:20:33.943 000003b0 71 41 b9 4e ca a3 52 d3 b4 ff a2 39 5a 85 bc 4b qA.N..R....9Z..K 00:20:33.943 000003c0 f5 1b bd 12 46 3d 3d b1 b4 54 ac 49 6a a2 8d 33 ....F==..T.Ij..3 00:20:33.943 000003d0 9f 57 17 e9 1a 5a 5e 33 e0 74 ff 3d 68 de 40 70 .W...Z^3.t.=h.@p 00:20:33.943 000003e0 c1 e5 dd 47 7b 0f 3d ac 21 0b d3 1c 49 10 82 bf ...G{.=.!...I... 00:20:33.943 000003f0 91 71 6a a2 85 5a 81 6d 34 d8 e0 d5 85 68 21 47 .qj..Z.m4....h!G 00:20:33.943 host pubkey: 00:20:33.943 00000000 6d 07 a5 ff 33 5a e9 dd 5d b0 f5 a5 4a c7 43 89 m...3Z..]...J.C. 00:20:33.943 00000010 67 b9 cf f3 bd 9b d9 b8 b2 76 dc 01 73 a1 bc 65 g........v..s..e 00:20:33.943 00000020 db b5 55 72 55 98 f4 4d e2 07 10 92 4f ad 7e 73 ..UrU..M....O.~s 00:20:33.943 00000030 b6 66 39 82 d1 96 e9 72 8a eb 04 e4 d1 3c 0d fa .f9....r.....<.. 00:20:33.943 00000040 da d7 0c f8 be fa c9 ec e9 51 6c ce 2e 6a 17 4d .........Ql..j.M 00:20:33.943 00000050 5d 82 04 c2 d1 34 75 51 e8 26 64 31 9f b4 42 3b ]....4uQ.&d1..B; 00:20:33.943 00000060 8d ec b7 d7 7c 15 2b 47 6b cf e0 c5 36 e8 fa 78 ....|.+Gk...6..x 00:20:33.943 00000070 08 db 74 ba 40 0f b1 e4 54 5f e0 7b e5 16 ee ba ..t.@...T_.{.... 00:20:33.943 00000080 85 ef c2 18 bb 06 3e 44 19 03 88 1c 6a ee a2 bd ......>D....j... 00:20:33.943 00000090 d9 20 6b 60 62 77 b3 e4 6d eb 0b 49 93 23 b8 74 . k`bw..m..I.#.t 00:20:33.943 000000a0 6b 13 76 f1 2a f9 fe 07 cd cc 60 22 ec 26 cf 6e k.v.*.....`".&.n 00:20:33.943 000000b0 9d 00 5f e7 2b 1a 81 35 70 0c 6d 12 3b e7 72 f7 .._.+..5p.m.;.r. 00:20:33.943 000000c0 d2 f1 7e 3d 74 05 05 9f c8 39 55 5f 96 3f 6b 15 ..~=t....9U_.?k. 00:20:33.943 000000d0 78 8b 29 cd 49 72 4f 04 98 94 85 89 96 74 f0 99 x.).IrO......t.. 00:20:33.943 000000e0 40 40 d4 aa 9c 35 95 ae 75 14 40 f0 b4 63 b9 ad @@...5..u.@..c.. 00:20:33.943 000000f0 be 0f bd ce f2 03 7f f9 76 be 33 a0 3e ab d5 de ........v.3.>... 00:20:33.943 00000100 76 2f 0d 7d 15 0d ac ad 3a dc 3d f4 18 c5 96 77 v/.}....:.=....w 00:20:33.943 00000110 41 e2 bb 8e 7f d3 cf 71 c7 cc 9b cf 7e fa 32 4a A......q....~.2J 00:20:33.943 00000120 59 ea ed 8e 88 49 2b 7d fc 6e c8 c5 7d 52 f5 e5 Y....I+}.n..}R.. 00:20:33.943 00000130 6d d3 77 b5 92 0f 84 a7 6f 6f 84 4a 27 6e c6 eb m.w.....oo.J'n.. 00:20:33.943 00000140 0d 2e 8c 2c 42 1b 05 8d 37 b6 d5 a9 fd bf 30 08 ...,B...7.....0. 00:20:33.943 00000150 65 89 28 39 00 fa 18 ad 78 4a d4 8c a3 07 5a 87 e.(9....xJ....Z. 00:20:33.943 00000160 da 32 50 20 30 d4 f6 3f 17 1a bf 2d fc 35 a0 aa .2P 0..?...-.5.. 00:20:33.943 00000170 89 62 c0 54 b8 ce 83 de d2 ca de 11 65 d4 92 8c .b.T........e... 00:20:33.943 00000180 c9 04 af 37 eb 33 a5 f8 e1 c0 c3 9f 48 97 d8 91 ...7.3......H... 00:20:33.943 00000190 90 40 e4 a7 7e 01 df b9 7c 63 31 1f c5 6a cb 36 .@..~...|c1..j.6 00:20:33.943 000001a0 9b cc 8e 75 7c 73 7d f1 bc 0c af 02 71 66 88 e6 ...u|s}.....qf.. 00:20:33.943 000001b0 8a 1e 51 07 77 79 07 af 6e 9f ba 85 aa 1b 24 0d ..Q.wy..n.....$. 00:20:33.943 000001c0 94 38 2c ae d8 77 41 db c0 4b 47 c1 10 f7 fa 90 .8,..wA..KG..... 00:20:33.943 000001d0 3d 47 8f e6 54 c3 48 f4 ef cb 92 8c 36 da 10 6c =G..T.H.....6..l 00:20:33.943 000001e0 0f c2 df 48 14 a5 b6 04 d3 48 24 16 d8 ca 74 de ...H.....H$...t. 00:20:33.943 000001f0 a8 f4 df 31 50 f2 83 7d a2 56 d3 bd 6e 23 d7 e0 ...1P..}.V..n#.. 00:20:33.943 00000200 70 6e ab 29 4c 5e 74 84 f4 73 23 0f 8d 46 4a 0c pn.)L^t..s#..FJ. 
00:20:33.943 00000210 bb 42 60 84 a0 e3 2a 59 b3 e7 ff 4e 32 b6 1f a9 .B`...*Y...N2... 00:20:33.943 00000220 9e 0c 01 3d dc 3a be 1e 57 ba 09 8b d4 54 b5 7b ...=.:..W....T.{ 00:20:33.943 00000230 7c 7f c3 a3 38 45 e8 ef 60 94 d6 43 02 53 6c 9e |...8E..`..C.Sl. 00:20:33.943 00000240 25 8c 91 f4 c8 a9 06 94 16 a1 ab 15 13 e4 73 e6 %.............s. 00:20:33.943 00000250 2f 5e 9b d5 9b 92 76 e6 d9 45 80 14 7c 9e b8 22 /^....v..E..|.." 00:20:33.943 00000260 01 8f e0 d3 21 3b 01 72 ea 4c 37 24 c8 ec c7 b7 ....!;.r.L7$.... 00:20:33.943 00000270 b6 6c 71 48 3d 45 37 52 14 61 46 97 98 6e fc 64 .lqH=E7R.aF..n.d 00:20:33.943 00000280 88 80 5f f7 ec ab c1 79 08 61 8b 53 b9 0d 85 16 .._....y.a.S.... 00:20:33.943 00000290 15 e1 6b 2b 3b ee 5a bb c1 d6 c9 95 dd 22 c4 0c ..k+;.Z......".. 00:20:33.943 000002a0 9f 26 75 2a 41 01 8e 71 12 dd 90 6a da 90 31 48 .&u*A..q...j..1H 00:20:33.943 000002b0 d3 f9 c6 b8 20 10 ca 92 a8 6f 30 96 10 0d bd 9e .... ....o0..... 00:20:33.943 000002c0 20 d2 b0 ec f8 0d 06 5b 02 af ff 64 ca 85 d6 b1 ......[...d.... 00:20:33.943 000002d0 7b f7 94 92 49 f1 ca 63 1b a6 ec 8a e0 18 82 fe {...I..c........ 00:20:33.943 000002e0 06 e9 b0 41 7f 37 0e 8f 99 d1 71 df c1 43 5d 6a ...A.7....q..C]j 00:20:33.943 000002f0 6b 40 e5 fa df 38 ed 41 90 b6 f0 0b e5 85 18 9c k@...8.A........ 00:20:33.943 00000300 5e b6 18 e6 c6 d1 8a 18 8a 3b 06 97 5d 9c 86 ec ^........;..]... 00:20:33.943 00000310 19 fc aa cc b8 0f 43 af ee db 98 be bd 1b 9e 2f ......C......../ 00:20:33.943 00000320 78 f8 e0 e9 06 e1 0e 23 ae d5 a7 56 33 b0 d6 6d x......#...V3..m 00:20:33.943 00000330 0d 3a 33 2f 01 6e 14 30 85 c8 13 c4 cf 19 8b 6f .:3/.n.0.......o 00:20:33.943 00000340 ac a1 53 40 0b 30 2d 5c d9 97 2d ca 6a e3 88 41 ..S@.0-\..-.j..A 00:20:33.943 00000350 23 3c af c2 bf 76 15 2c 05 1a 90 2f 2b ec 1e 0a #<...v.,.../+... 00:20:33.943 00000360 b9 ce bb 86 a8 7e d2 80 16 05 80 fc 81 46 96 66 .....~.......F.f 00:20:33.943 00000370 a6 ca 49 2f a0 0a e9 39 4f 09 9e 79 7a 26 75 37 ..I/...9O..yz&u7 00:20:33.943 00000380 c1 06 2a 0a 66 a7 3e ff f0 3d c8 9b 8e 25 99 e1 ..*.f.>..=...%.. 00:20:33.943 00000390 9e 3e f9 01 6f b8 10 ed e9 8c ec 47 23 d6 b6 c6 .>..o......G#... 00:20:33.943 000003a0 ce 3c 76 26 80 3a e4 9e 21 d1 a2 e9 d1 2d 1b 2e .6...&.. 00:20:33.943 00000070 30 29 00 41 9c 50 92 77 3c 13 d0 d7 ea 10 b8 91 0).A.P.w<....... 00:20:33.943 00000080 ef 8f 0c 0b 02 13 cb 07 7b de a0 d2 c3 a7 b3 d9 ........{....... 00:20:33.943 00000090 3b 4f 92 4d 51 f5 26 89 17 4a f0 11 dd f4 cf f9 ;O.MQ.&..J...... 00:20:33.943 000000a0 dc a3 c9 d5 d2 85 af 7e c8 fe 69 fc 92 8c b2 5c .......~..i....\ 00:20:33.943 000000b0 55 9d 96 92 9f 8e f4 a7 26 d3 9f 43 52 ca 74 90 U.......&..CR.t. 00:20:33.943 000000c0 23 87 61 38 56 06 de 79 79 98 73 9b b3 b3 53 39 #.a8V..yy.s...S9 00:20:33.943 000000d0 9d 95 34 af f5 03 db e8 6a c4 bd 12 cf 23 7e 6b ..4.....j....#~k 00:20:33.943 000000e0 11 77 b5 4e 53 bb 2c 29 94 c7 50 06 56 c0 0d 7b .w.NS.,)..P.V..{ 00:20:33.943 000000f0 24 ec 47 a3 cb 29 b0 ce 08 ad c7 50 d0 47 68 02 $.G..).....P.Gh. 00:20:33.943 00000100 e4 9b 70 6d 3e bb 1a 31 1d 44 38 25 99 55 e8 53 ..pm>..1.D8%.U.S 00:20:33.943 00000110 27 d8 ae 05 84 e4 81 f7 4a 69 72 5e 7c e8 56 86 '.......Jir^|.V. 00:20:33.943 00000120 ab d2 9a 0e dd 6b 99 39 fe 43 b6 d0 69 b7 4d 89 .....k.9.C..i.M. 00:20:33.943 00000130 e1 d9 79 d6 9e d9 91 59 82 96 74 03 06 e4 b2 c5 ..y....Y..t..... 00:20:33.943 00000140 73 6f dd ad ca 4b e5 31 9c ec ed d3 cc 67 ed 04 so...K.1.....g.. 
00:20:33.943 00000150 94 12 89 c0 80 a6 b9 d2 36 0b 03 da c8 c4 1b 37 ........6......7 00:20:33.943 00000160 0f e9 2d 2f 74 72 3b e8 01 15 9e 78 ad fb e1 6c ..-/tr;....x...l 00:20:33.943 00000170 55 f4 a8 ff 51 21 10 2e 20 f6 bd a1 ab d3 4f 9c U...Q!.. .....O. 00:20:33.943 00000180 53 c6 88 52 bc c7 77 76 10 76 e8 0a 5f 8a 88 6d S..R..wv.v.._..m 00:20:33.943 00000190 3b 1a 86 e1 73 b9 55 ca 3c f1 a9 b7 65 b3 b8 d4 ;...s.U.<...e... 00:20:33.943 000001a0 56 3c 17 77 34 a8 e6 33 3f e1 88 ef d9 46 bb 4f V<.w4..3?....F.O 00:20:33.943 000001b0 da fc 1f 77 1a 32 6b 66 8e d4 be 31 66 47 ae 37 ...w.2kf...1fG.7 00:20:33.943 000001c0 73 66 f9 33 73 05 07 c2 73 d0 7a 1b 1d ca de 16 sf.3s...s.z..... 00:20:33.943 000001d0 24 f8 f1 91 88 c6 2e 13 2d ef 8f d8 96 3e b8 89 $.......-....>.. 00:20:33.943 000001e0 11 68 6e 94 d4 11 99 42 1b 85 1f bd e8 df df 03 .hn....B........ 00:20:33.943 000001f0 52 79 96 3e 7e 31 46 e6 ee ba ba c3 e1 f4 22 6a Ry.>~1F......."j 00:20:33.944 00000200 3d 6f 90 d2 d7 a8 85 21 0e c8 c1 ae 4c 74 de 01 =o.....!....Lt.. 00:20:33.944 00000210 e3 5c 89 92 3c db fe d0 a9 f2 e9 5e 41 18 d6 22 .\..<......^A.." 00:20:33.944 00000220 79 e9 31 24 c7 37 83 f1 86 cc 20 aa 8a 5a 36 b5 y.1$.7.... ..Z6. 00:20:33.944 00000230 bf 2f c5 d6 b7 48 62 e2 10 98 7f 1a 25 07 9b c3 ./...Hb.....%... 00:20:33.944 00000240 5f 2d d7 4a d2 2c 87 b8 4c 15 5e 1b 9c 2b 02 6a _-.J.,..L.^..+.j 00:20:33.944 00000250 c4 1d e4 41 fd 02 c4 f4 34 63 79 43 9e cb d4 cb ...A....4cyC.... 00:20:33.944 00000260 96 3d f0 da 5e 31 a5 6e 3c 71 5a dc e2 76 fd 52 .=..^1.n.a.] 00:20:33.944 00000010 42 51 25 ec a3 89 11 92 42 2a 59 39 3e a1 38 99 BQ%.....B*Y9>.8. 00:20:33.944 00000020 20 fe 3e e0 98 51 7a 5a ad 31 07 b4 d8 f4 2b 0e .>..QzZ.1....+. 00:20:33.944 00000030 98 92 62 b9 8a 89 fc 73 fb de 2a 42 78 e4 de d7 ..b....s..*Bx... 00:20:33.944 00000040 ce 51 f1 1b b2 ed d7 64 a8 ba d6 38 50 5c c8 cd .Q.....d...8P\.. 00:20:33.944 00000050 99 1b 0e 6e 45 66 87 74 91 2f 0b b6 9d 4e b6 c6 ...nEf.t./...N.. 00:20:33.944 00000060 a1 56 46 2c 2b 70 13 1e eb 0b 65 ad 44 86 48 dd .VF,+p....e.D.H. 00:20:33.944 00000070 f5 7a ed 4d 03 54 43 5c ff bd 43 24 cb 99 4d 03 .z.M.TC\..C$..M. 00:20:33.944 00000080 32 10 9b bf 79 e6 2c 52 fa d6 dc 4b 52 4a 30 6f 2...y.,R...KRJ0o 00:20:33.944 00000090 04 cd 01 7f a2 0a a9 e3 2c 6d 5a 5e 21 b3 e2 62 ........,mZ^!..b 00:20:33.944 000000a0 cc 9e 66 73 42 6b e2 df f7 d5 8a 43 7a e6 ee fd ..fsBk.....Cz... 00:20:33.944 000000b0 e6 ec da c7 79 a8 59 ab f0 f8 11 ac b6 91 1a f3 ....y.Y......... 00:20:33.944 000000c0 9c 6c 8a 2b 0b 5e 9d 65 7c ce df f2 32 c2 12 96 .l.+.^.e|...2... 00:20:33.944 000000d0 c8 9a 9d 75 a6 fc ec f8 5d a1 24 bb 2f da 19 08 ...u....].$./... 00:20:33.944 000000e0 c6 8b 09 20 50 c1 4a 57 e2 36 92 1f 04 56 f6 fc ... P.JW.6...V.. 00:20:33.944 000000f0 9a 98 ac 07 89 34 f8 78 80 e3 20 e3 f8 78 9f df .....4.x.. ..x.. 00:20:33.944 00000100 2a d2 6a c3 c2 9d 82 1b 70 49 b2 f7 54 ba ad 11 *.j.....pI..T... 00:20:33.944 00000110 5d ec e8 21 3f db d2 f9 aa 05 7c 9f 90 88 9e 3e ]..!?.....|....> 00:20:33.944 00000120 90 e7 40 a6 56 e6 f4 df 10 c2 3f 92 c4 04 f5 cf ..@.V.....?..... 00:20:33.944 00000130 b5 7e 77 a1 e4 89 6f c1 fe 83 98 0a 44 60 56 a7 .~w...o.....D`V. 00:20:33.944 00000140 0b 5a 06 7c 34 12 09 ed a3 6a e0 fc 37 f6 13 bf .Z.|4....j..7... 00:20:33.944 00000150 18 d4 3f 10 ee ec 16 06 e8 51 e2 75 97 53 fb 97 ..?......Q.u.S.. 
00:20:33.944 00000160 81 b7 8c cd 4d a3 93 00 55 a1 6c 62 7a 2b e9 60 ....M...U.lbz+.` 00:20:33.944 00000170 98 cd b7 a9 a8 fe 5e d7 72 6a f9 ac a6 c3 25 31 ......^.rj....%1 00:20:33.944 00000180 44 1b 88 fe 8b 5b b9 2d 44 67 46 75 55 ca d2 d6 D....[.-DgFuU... 00:20:33.944 00000190 81 ac 5b 43 9e 6c f3 70 f7 09 b0 9e fd 0e 30 e2 ..[C.l.p......0. 00:20:33.944 000001a0 c4 3c e9 cc 4d dd 90 bb 56 d8 47 61 7b ca 54 79 .<..M...V.Ga{.Ty 00:20:33.944 000001b0 8e 6d 72 a2 c0 80 42 93 7b 50 e2 99 1a 6b be 65 .mr...B.{P...k.e 00:20:33.944 000001c0 df b1 3c 2a e0 9b 60 aa 4f 77 4f 6f 2c b9 01 c2 ..<*..`.OwOo,... 00:20:33.944 000001d0 f0 89 bc 09 d5 14 cd e1 a2 2b 97 c1 74 93 58 d5 .........+..t.X. 00:20:33.944 000001e0 ef 7e e9 81 34 f2 a9 0c 2f 0e 10 a5 9b c5 4e 30 .~..4.../.....N0 00:20:33.944 000001f0 46 c4 ff c5 1d a9 4b 51 94 89 ec 67 f2 86 d8 92 F.....KQ...g.... 00:20:33.944 00000200 91 e4 3b b2 59 28 c4 f3 b2 f4 5e f0 bf a4 1d 29 ..;.Y(....^....) 00:20:33.944 00000210 c5 50 e2 b1 a0 6e 0a 0f 50 a6 0c 79 a6 5f a1 74 .P...n..P..y._.t 00:20:33.944 00000220 db 05 79 49 f6 8d 89 35 d8 b6 96 68 c4 bf 10 89 ..yI...5...h.... 00:20:33.944 00000230 e4 f5 94 90 10 b9 00 da 00 fc d5 9e 01 a4 b6 0a ................ 00:20:33.944 00000240 9d 52 96 fc ee d7 e2 28 97 93 aa bd 7b 28 22 90 .R.....(....{(". 00:20:33.944 00000250 ad dd 0c bd 73 30 78 7c 49 cb 6b aa 39 ab 43 6d ....s0x|I.k.9.Cm 00:20:33.944 00000260 be 75 b6 fd 1f bb de df 7e ea 78 c5 9d d4 3d 0c .u......~.x...=. 00:20:33.944 00000270 a0 ff c1 e7 cc f2 13 9c cb 4f f0 ac 6b f6 cc f2 .........O..k... 00:20:33.944 00000280 0d 1e 5d ff b9 7a d8 b8 d7 9e 61 9f 96 d6 5a 5b ..]..z....a...Z[ 00:20:33.944 00000290 57 64 c8 c2 a9 68 57 d9 33 b6 83 a4 67 f5 3e d2 Wd...hW.3...g.>. 00:20:33.944 000002a0 90 43 af d7 ea b8 82 22 6a 8e 58 82 d9 50 ca fa .C....."j.X..P.. 00:20:33.944 000002b0 82 b9 b5 fa 0c 94 87 37 e5 80 58 55 22 90 08 03 .......7..XU"... 00:20:33.944 000002c0 61 ea 94 79 36 15 f7 ca ba 29 a9 69 e7 18 4b d2 a..y6....).i..K. 00:20:33.944 000002d0 aa 09 32 00 61 61 a1 0e 18 97 dd 10 35 42 f9 25 ..2.aa......5B.% 00:20:33.944 000002e0 4b 5f 5e b9 14 22 ce 51 dd 45 82 4c da 7a 6f a0 K_^..".Q.E.L.zo. 00:20:33.944 000002f0 2c ad 13 3e 4a de 9e e5 f9 d7 13 52 97 db 10 6a ,..>J......R...j 00:20:33.944 00000300 a6 eb 09 b7 5a d3 d6 0d 1b 06 29 53 c8 9d 7c 65 ....Z.....)S..|e 00:20:33.944 00000310 07 d0 45 41 ec 3b e0 40 4d 54 bd 2a 5c 01 5f db ..EA.;.@MT.*\._. 00:20:33.944 00000320 c3 18 cf 56 f2 b3 a6 a2 88 e7 e8 1c f4 48 4e e7 ...V.........HN. 00:20:33.944 00000330 e9 c9 5c 18 d4 d0 fa d3 d3 2e 0a 9c 57 3a 4e 4c ..\.........W:NL 00:20:33.944 00000340 a0 8e 2f 68 f1 33 50 03 ee 6b 84 8e 19 0e d7 60 ../h.3P..k.....` 00:20:33.944 00000350 67 2d b0 54 e2 55 53 43 eb 68 92 86 85 36 3c 3f g-.T.USC.h...6.m}.k;..-..C.F 00:20:33.944 00000010 95 e4 a2 75 37 23 bb b8 6c c7 e1 4c d5 f1 0c bf ...u7#..l..L.... 00:20:33.944 00000020 21 ee 26 a4 0c 4b 47 85 f4 c0 2c 63 b6 81 54 9f !.&..KG...,c..T. 00:20:33.944 00000030 69 61 0e 53 55 6e c1 b1 e1 cd ee ee b5 48 c3 88 ia.SUn.......H.. 00:20:33.944 00000040 ac dc b9 11 8c b6 8e ca 36 a4 c0 5b 5b 13 37 34 ........6..[[.74 00:20:33.944 00000050 28 57 31 29 39 be d0 d4 1d b7 98 34 64 09 32 b3 (W1)9......4d.2. 00:20:33.944 00000060 1f ba 84 72 14 34 52 5f ee 36 64 58 a1 41 02 07 ...r.4R_.6dX.A.. 00:20:33.944 00000070 a0 14 cf 6e 32 a6 11 c5 b6 af a7 f9 2c 32 30 6b ...n2.......,20k 00:20:33.944 00000080 de b9 fa 12 3c d4 3d 42 77 16 89 5f a4 8f 99 d8 ....<.=Bw.._.... 
00:20:33.944 00000090 3a 63 3b 86 5e d5 c0 93 b6 58 e1 64 6d 51 7b e9 :c;.^....X.dmQ{. 00:20:33.944 000000a0 a9 c2 ad ec a8 63 4a 07 e0 9d 5a c1 3c f3 62 7f .....cJ...Z.<.b. 00:20:33.944 000000b0 c7 67 e8 45 df b8 d5 5b 31 e6 c3 a4 b9 dc 1f c2 .g.E...[1....... 00:20:33.944 000000c0 73 08 b7 c4 4e 80 5c ad 11 c5 a8 35 05 44 08 4e s...N.\....5.D.N 00:20:33.944 000000d0 75 56 1c 8e 8d a9 b1 31 76 f7 44 6f ee 50 46 aa uV.....1v.Do.PF. 00:20:33.944 000000e0 0c a8 4d 26 48 8b 4a e9 f7 bb bf 40 2b 0e 36 97 ..M&H.J....@+.6. 00:20:33.944 000000f0 23 cb 65 a6 46 18 ca ee 39 31 d0 aa 8a b1 0a a7 #.e.F...91...... 00:20:33.944 00000100 00 28 ba 9e fe fa 59 cb d1 02 f1 8b 9a 1b ae cc .(....Y......... 00:20:33.944 00000110 8b 78 ac 11 53 f6 fe b5 78 c0 a2 93 51 32 86 e4 .x..S...x...Q2.. 00:20:33.944 00000120 4d b3 96 97 86 6d 7a e3 0f 7a 2e 32 2e 41 a2 0a M....mz..z.2.A.. 00:20:33.944 00000130 a3 2e 1b be a3 84 d2 6d 06 fb 43 e3 ab 64 f8 e6 .......m..C..d.. 00:20:33.944 00000140 4d 9c f9 2d c5 a5 47 13 58 aa fc 5e b1 6f 9a 14 M..-..G.X..^.o.. 00:20:33.944 00000150 3d ee c9 76 d7 d2 ca dc f5 e9 71 0b f4 20 d0 e5 =..v......q.. .. 00:20:33.944 00000160 b7 6f 4e 7c 0f 2c 16 21 60 49 8e 27 3f f9 73 39 .oN|.,.!`I.'?.s9 00:20:33.944 00000170 f0 c5 cc 2c 9c 3e 7f 9e 06 13 96 08 6b ff 49 c0 ...,.>......k.I. 00:20:33.944 00000180 e5 13 01 7c 1e 13 3a 53 96 6f be 04 92 8c e8 63 ...|..:S.o.....c 00:20:33.944 00000190 de e4 2e 52 45 ae ed 9b e0 c4 71 df 12 ef 8e 79 ...RE.....q....y 00:20:33.944 000001a0 d6 53 ae cc 65 35 e3 b4 5a f7 82 c7 ab 81 66 34 .S..e5..Z.....f4 00:20:33.944 000001b0 4f 78 8b 6e 70 cf b3 bf de 69 3e 18 e3 60 05 0b Ox.np....i>..`.. 00:20:33.944 000001c0 29 90 be cc 43 6e a7 d0 4c c3 02 eb 38 96 1d 7c )...Cn..L...8..| 00:20:33.944 000001d0 bc 62 84 63 78 e1 c8 29 0a 76 6f 5f fb 9a 23 1f .b.cx..).vo_..#. 00:20:33.944 000001e0 70 50 57 3e 41 6e 14 7b 94 a8 44 0f b8 0d de a6 pPW>An.{..D..... 00:20:33.944 000001f0 9f d9 5c 93 91 62 9c e2 1b c9 c7 d0 46 07 71 74 ..\..b......F.qt 00:20:33.944 00000200 74 5d de a5 61 47 c2 25 7e f2 95 d8 e7 d2 fd 1e t]..aG.%~....... 00:20:33.944 00000210 a7 b4 90 83 01 ed 8d 5c 85 86 b9 76 1b e2 91 b9 .......\...v.... 00:20:33.944 00000220 77 7c 99 c4 a7 06 21 8d 11 c0 05 f7 4f 27 84 33 w|....!.....O'.3 00:20:33.945 00000230 d6 42 88 fa 8d 31 de d4 4f 5e 61 aa 79 0e bf 8b .B...1..O^a.y... 00:20:33.945 00000240 92 a4 1d ad df 2c 5d c0 36 44 98 44 c9 8c 2e 40 .....,].6D.D...@ 00:20:33.945 00000250 2b 6a 22 80 24 c7 dc 30 fb a5 c8 f3 06 ee 9c 97 +j".$..0........ 00:20:33.945 00000260 2f 82 99 ad e4 a3 17 8c da d0 49 6a d2 26 da 69 /.........Ij.&.i 00:20:33.945 00000270 05 8d 2d 62 b5 df a8 a9 85 74 db b5 1a d0 bd 9a ..-b.....t...... 00:20:33.945 00000280 1d 0d 7e 49 ba 5d 72 60 27 f2 2c 45 70 db cc 48 ..~I.]r`'.,Ep..H 00:20:33.945 00000290 b0 99 95 2c 96 3c f9 cd 0f ac 2c 4b e9 cf 62 0f ...,.<....,K..b. 00:20:33.945 000002a0 20 a6 e3 3f d0 cf 60 de 6d ae 98 bf 8c e7 c1 ab ..?..`.m....... 00:20:33.945 000002b0 bc b1 8f a1 ae a9 3e d8 aa 5a bd a7 d9 3a 9c a6 ......>..Z...:.. 00:20:33.945 000002c0 9d a7 bc 1e ab b5 cb 3e a1 22 6b d7 1f 1d e2 4a .......>."k....J 00:20:33.945 000002d0 d0 b4 a2 8b 13 9f ac 5b 59 59 50 68 2a 53 14 30 .......[YYPh*S.0 00:20:33.945 000002e0 27 9c 12 4a 6d 8c 6f 18 d1 54 aa 4c 17 25 f5 8d '..Jm.o..T.L.%.. 00:20:33.945 000002f0 63 60 9f a3 bc 00 00 c3 d8 bd 72 7f 96 98 d0 07 c`........r..... 00:20:33.945 00000300 28 3c 1b ed 07 7b 1a 1a ec e4 d6 b5 a5 73 98 8d (<...{.......s.. 
00:20:33.945 00000310 ff f2 ac 83 25 49 27 53 21 8f 88 00 49 b1 ff cd ....%I'S!...I... 00:20:33.945 00000320 ff 98 69 2d 3f d8 87 80 31 73 33 9b 16 6e 34 b9 ..i-?...1s3..n4. 00:20:33.945 00000330 48 92 ec 21 56 74 a8 f1 83 6a 91 3f 8a bf ce f0 H..!Vt...j.?.... 00:20:33.945 00000340 b7 ff 35 7a ad e7 91 b1 94 48 4d 95 99 e2 cc 1a ..5z.....HM..... 00:20:33.945 00000350 f5 92 f4 d7 71 2a 75 ec 74 96 7d 53 12 a3 a7 20 ....q*u.t.}S... 00:20:33.945 00000360 f4 36 6a 97 34 e3 e2 44 0a b1 79 d2 96 80 1f d4 .6j.4..D..y..... 00:20:33.945 00000370 35 64 5e f0 dd fa 69 2c ea 93 f6 03 d5 c3 bf 2b 5d^...i,.......+ 00:20:33.945 00000380 2c fd 62 eb d3 f9 9e 6c 2a 68 18 bc fd 19 58 60 ,.b....l*h....X` 00:20:33.945 00000390 bc 10 96 1b e9 4b 80 d4 a5 be c9 2c db 59 3f 4d .....K.....,.Y?M 00:20:33.945 000003a0 22 be d4 5a 91 63 0e d2 6d 5b 85 9e eb ec a3 4f "..Z.c..m[.....O 00:20:33.945 000003b0 a6 5d f5 2f c8 a5 b6 ca 3c 4d 57 92 8a 28 c6 9d .]./.... 00:20:33.945 000003d0 08 dd 64 42 fd 48 5b 24 34 ed 30 70 45 dd 79 76 ..dB.H[$4.0pE.yv 00:20:33.945 000003e0 37 55 a3 80 e7 ed b6 70 f8 64 a3 d7 bf e1 28 de 7U.....p.d....(. 00:20:33.945 000003f0 ae b6 5d 92 6d 1b e7 e6 de ef ec 40 aa 61 62 82 ..].m......@.ab. 00:20:33.945 dh secret: 00:20:33.945 00000000 1c 4d 3f 7f a0 dd 69 6c a6 a7 9a d2 eb 46 07 b1 .M?...il.....F.. 00:20:33.945 00000010 a5 ca 68 42 ea bf 03 41 60 19 8e 40 b1 51 02 1a ..hB...A`..@.Q.. 00:20:33.945 00000020 b1 23 a8 bf 50 42 1d 3b a4 5d 65 39 f9 2e f9 b6 .#..PB.;.]e9.... 00:20:33.945 00000030 87 d1 20 11 75 eb 80 97 09 eb e6 d7 7b a1 33 8a .. .u.......{.3. 00:20:33.945 00000040 85 0a 68 6f d5 9c ee 65 25 eb 21 18 11 b8 d8 b5 ..ho...e%.!..... 00:20:33.945 00000050 23 4d 80 04 77 ab 79 75 f0 9f 19 23 51 70 be 8b #M..w.yu...#Qp.. 00:20:33.945 00000060 c4 5b 29 1e 3c f4 94 fb 42 db 2f 5b 5f ea b5 bb .[).<...B./[_... 00:20:33.945 00000070 93 43 bc 27 24 f3 3a 89 68 04 08 ba 44 e1 c0 67 .C.'$.:.h...D..g 00:20:33.945 00000080 e1 70 2a c7 68 a9 f4 29 c5 90 ec c3 11 6c d8 a4 .p*.h..).....l.. 00:20:33.945 00000090 0d 2a d1 33 63 bf 58 68 f9 c6 84 40 43 95 db 56 .*.3c.Xh...@C..V 00:20:33.945 000000a0 a5 92 84 7d 2e e3 b7 46 15 71 20 80 80 2d 6b 42 ...}...F.q ..-kB 00:20:33.945 000000b0 4d 54 01 86 02 69 7b cd 7d 9a 5b 5f 86 ca 7e 89 MT...i{.}.[_..~. 00:20:33.945 000000c0 a5 f8 0e 42 a5 3e 5e f3 58 5a a8 05 9f d2 55 e7 ...B.>^.XZ....U. 00:20:33.945 000000d0 02 1e d7 0a b4 63 f9 e0 e0 0f 19 56 72 da 40 c6 .....c.....Vr.@. 00:20:33.945 000000e0 54 dc d0 a9 3a f0 34 ff 24 0e 51 f2 c0 bc e3 d5 T...:.4.$.Q..... 00:20:33.945 000000f0 e6 de 19 d4 c3 ac b2 2b 6e 44 31 04 8f a4 5c 79 .......+nD1...\y 00:20:33.945 00000100 0d 79 64 db d3 e2 e9 4e cf a6 93 76 90 f6 ae 6c .yd....N...v...l 00:20:33.945 00000110 f9 15 b9 ba 54 26 8c 1e 1a a6 7c 35 69 19 96 c3 ....T&....|5i... 00:20:33.945 00000120 96 85 d3 2f 79 b2 00 10 e2 6a 49 84 94 76 46 19 .../y....jI..vF. 00:20:33.945 00000130 e6 99 51 30 db 88 cd d1 30 27 3a b3 0c 30 97 16 ..Q0....0':..0.. 00:20:33.945 00000140 93 24 64 aa 5f 16 72 8d 6a be 97 24 ad 6c c9 11 .$d._.r.j..$.l.. 00:20:33.945 00000150 e3 5e 2c 32 c7 02 f7 13 e0 af 95 18 bc f2 1a fb .^,2............ 00:20:33.945 00000160 b1 57 66 e2 fa 68 f8 73 9d f6 7c 49 2a 74 30 89 .Wf..h.s..|I*t0. 00:20:33.945 00000170 f4 75 93 f2 b3 80 a9 13 97 60 6a 93 05 e6 fe 6e .u.......`j....n 00:20:33.945 00000180 e4 24 a5 56 62 a4 fb 8b 7e 35 c7 85 e9 27 1c 7b .$.Vb...~5...'.{ 00:20:33.945 00000190 13 c9 6a 4c e0 59 fd 07 cc bd 67 20 8b 07 61 e5 ..jL.Y....g ..a. 
00:20:33.945 000001a0 b8 e5 30 a0 7e 4d 9c cf a4 13 1c 33 d3 04 87 3d ..0.~M.....3...= 00:20:33.945 000001b0 ad f3 ec 11 4f 1c 04 23 98 9d 75 6f 4f 07 cc f5 ....O..#..uoO... 00:20:33.945 000001c0 ea db 6e 5d 88 cd d7 26 c3 68 02 be d0 fc c3 d5 ..n]...&.h...... 00:20:33.945 000001d0 81 0d 42 6f 8a 26 cf f6 50 f8 f9 f6 9c 8d 9c cb ..Bo.&..P....... 00:20:33.945 000001e0 e9 cd a9 a3 93 a5 28 b7 80 7a 87 3b d7 84 38 6c ......(..z.;..8l 00:20:33.945 000001f0 a5 8f cc ce 12 97 42 9b 25 0b d5 0f b1 2e 75 12 ......B.%.....u. 00:20:33.945 00000200 da ac c1 d0 ac 94 4d 90 b3 0f 56 1e 8b ad 06 f2 ......M...V..... 00:20:33.945 00000210 ec 0f cf 35 0e 79 e5 3b 7f f3 aa ca 0f df 12 5c ...5.y.;.......\ 00:20:33.945 00000220 1f 00 af ec e6 45 22 93 04 c5 83 81 1e 49 72 b0 .....E"......Ir. 00:20:33.945 00000230 7e d8 51 01 4b 92 4e 6a a1 27 25 01 fe 42 b6 6c ~.Q.K.Nj.'%..B.l 00:20:33.945 00000240 d5 15 d1 cc b1 4e f4 e9 3c 16 1b b6 18 b7 b2 f0 .....N..<....... 00:20:33.945 00000250 a4 f3 47 88 36 1a aa c6 51 ca f8 62 b0 14 4f c3 ..G.6...Q..b..O. 00:20:33.945 00000260 1b f0 59 70 36 df 56 aa e8 c9 3a c9 11 cf 66 7b ..Yp6.V...:...f{ 00:20:33.945 00000270 f5 32 df 5c be 85 1c a2 8c ed a3 b9 a0 04 c0 34 .2.\...........4 00:20:33.945 00000280 53 b5 97 f7 71 96 5d f0 15 21 4e 9f 2e 18 51 a6 S...q.]..!N...Q. 00:20:33.945 00000290 87 c6 48 99 a2 eb 20 18 fc e5 05 85 0f a8 d6 a1 ..H... ......... 00:20:33.945 000002a0 ee 75 f9 40 a5 3e 51 d3 b0 b4 f6 ac 4c 9d ec 70 .u.@.>Q.....L..p 00:20:33.945 000002b0 1a 87 ff 9c 19 39 39 e0 f9 f9 da 42 1e 1f 3b 9f .....99....B..;. 00:20:33.945 000002c0 93 44 54 9b 21 2b 2c 4d ab 97 9e 1f e1 e4 8f d5 .DT.!+,M........ 00:20:33.945 000002d0 f7 b7 1b 96 59 a9 15 f8 35 85 ca c2 25 72 aa 7f ....Y...5...%r.. 00:20:33.945 000002e0 df 7e ec fb 8a 4b f4 6d 7f 33 f1 28 da fa 0d 45 .~...K.m.3.(...E 00:20:33.945 000002f0 a6 ce 6b 91 16 8f bd e6 26 a5 fc c2 f3 bc bf ca ..k.....&....... 00:20:33.945 00000300 e7 8e 98 47 e5 9c c6 bc 2e 8e dd d9 c9 17 74 3b ...G..........t; 00:20:33.945 00000310 e2 a0 ee 4c 7f 27 02 12 94 c4 b2 57 1b 66 53 f5 ...L.'.....W.fS. 00:20:33.945 00000320 ea 0f e1 00 e7 8b 20 64 7b ad f4 4a 44 c8 b0 b9 ...... d{..JD... 00:20:33.945 00000330 79 a3 d8 eb f8 63 a5 5e 63 b9 cf b1 30 78 2a 5c y....c.^c...0x*\ 00:20:33.945 00000340 8b 9c 72 64 60 aa c6 cf b3 e0 6f 28 81 13 e8 69 ..rd`.....o(...i 00:20:33.945 00000350 84 65 47 08 eb d3 9b fa cd b5 c8 36 f4 f0 eb c5 .eG........6.... 00:20:33.945 00000360 81 c3 b8 b6 6f b6 5d 2e 7f 49 e8 08 47 b3 27 d8 ....o.]..I..G.'. 00:20:33.945 00000370 97 34 f4 70 77 83 b2 83 36 8f 46 48 0c 0a 37 40 .4.pw...6.FH..7@ 00:20:33.945 00000380 b6 b6 3f af eb a5 91 0a 8c 0e 42 e9 eb b6 2a 10 ..?.......B...*. 00:20:33.945 00000390 02 7a 61 b5 4e b2 f3 71 d9 61 8a 61 10 77 ea 99 .za.N..q.a.a.w.. 00:20:33.945 000003a0 ef dd 67 33 c9 36 98 e6 15 0d 0a 47 15 22 e7 c7 ..g3.6.....G.".. 00:20:33.945 000003b0 00 ae 38 fb 5d fd 2b 39 58 30 3c 63 d7 3d 5a c4 ..8.].+9X0.a.] 00:20:33.945 00000010 42 51 25 ec a3 89 11 92 42 2a 59 39 3e a1 38 99 BQ%.....B*Y9>.8. 00:20:33.945 00000020 20 fe 3e e0 98 51 7a 5a ad 31 07 b4 d8 f4 2b 0e .>..QzZ.1....+. 00:20:33.945 00000030 98 92 62 b9 8a 89 fc 73 fb de 2a 42 78 e4 de d7 ..b....s..*Bx... 00:20:33.945 00000040 ce 51 f1 1b b2 ed d7 64 a8 ba d6 38 50 5c c8 cd .Q.....d...8P\.. 00:20:33.945 00000050 99 1b 0e 6e 45 66 87 74 91 2f 0b b6 9d 4e b6 c6 ...nEf.t./...N.. 00:20:33.945 00000060 a1 56 46 2c 2b 70 13 1e eb 0b 65 ad 44 86 48 dd .VF,+p....e.D.H. 
00:20:33.945 00000070 f5 7a ed 4d 03 54 43 5c ff bd 43 24 cb 99 4d 03 .z.M.TC\..C$..M. 00:20:33.945 00000080 32 10 9b bf 79 e6 2c 52 fa d6 dc 4b 52 4a 30 6f 2...y.,R...KRJ0o 00:20:33.945 00000090 04 cd 01 7f a2 0a a9 e3 2c 6d 5a 5e 21 b3 e2 62 ........,mZ^!..b 00:20:33.945 000000a0 cc 9e 66 73 42 6b e2 df f7 d5 8a 43 7a e6 ee fd ..fsBk.....Cz... 00:20:33.945 000000b0 e6 ec da c7 79 a8 59 ab f0 f8 11 ac b6 91 1a f3 ....y.Y......... 00:20:33.945 000000c0 9c 6c 8a 2b 0b 5e 9d 65 7c ce df f2 32 c2 12 96 .l.+.^.e|...2... 00:20:33.945 000000d0 c8 9a 9d 75 a6 fc ec f8 5d a1 24 bb 2f da 19 08 ...u....].$./... 00:20:33.945 000000e0 c6 8b 09 20 50 c1 4a 57 e2 36 92 1f 04 56 f6 fc ... P.JW.6...V.. 00:20:33.945 000000f0 9a 98 ac 07 89 34 f8 78 80 e3 20 e3 f8 78 9f df .....4.x.. ..x.. 00:20:33.945 00000100 2a d2 6a c3 c2 9d 82 1b 70 49 b2 f7 54 ba ad 11 *.j.....pI..T... 00:20:33.945 00000110 5d ec e8 21 3f db d2 f9 aa 05 7c 9f 90 88 9e 3e ]..!?.....|....> 00:20:33.945 00000120 90 e7 40 a6 56 e6 f4 df 10 c2 3f 92 c4 04 f5 cf ..@.V.....?..... 00:20:33.945 00000130 b5 7e 77 a1 e4 89 6f c1 fe 83 98 0a 44 60 56 a7 .~w...o.....D`V. 00:20:33.945 00000140 0b 5a 06 7c 34 12 09 ed a3 6a e0 fc 37 f6 13 bf .Z.|4....j..7... 00:20:33.945 00000150 18 d4 3f 10 ee ec 16 06 e8 51 e2 75 97 53 fb 97 ..?......Q.u.S.. 00:20:33.945 00000160 81 b7 8c cd 4d a3 93 00 55 a1 6c 62 7a 2b e9 60 ....M...U.lbz+.` 00:20:33.945 00000170 98 cd b7 a9 a8 fe 5e d7 72 6a f9 ac a6 c3 25 31 ......^.rj....%1 00:20:33.945 00000180 44 1b 88 fe 8b 5b b9 2d 44 67 46 75 55 ca d2 d6 D....[.-DgFuU... 00:20:33.945 00000190 81 ac 5b 43 9e 6c f3 70 f7 09 b0 9e fd 0e 30 e2 ..[C.l.p......0. 00:20:33.945 000001a0 c4 3c e9 cc 4d dd 90 bb 56 d8 47 61 7b ca 54 79 .<..M...V.Ga{.Ty 00:20:33.945 000001b0 8e 6d 72 a2 c0 80 42 93 7b 50 e2 99 1a 6b be 65 .mr...B.{P...k.e 00:20:33.945 000001c0 df b1 3c 2a e0 9b 60 aa 4f 77 4f 6f 2c b9 01 c2 ..<*..`.OwOo,... 00:20:33.945 000001d0 f0 89 bc 09 d5 14 cd e1 a2 2b 97 c1 74 93 58 d5 .........+..t.X. 00:20:33.945 000001e0 ef 7e e9 81 34 f2 a9 0c 2f 0e 10 a5 9b c5 4e 30 .~..4.../.....N0 00:20:33.945 000001f0 46 c4 ff c5 1d a9 4b 51 94 89 ec 67 f2 86 d8 92 F.....KQ...g.... 00:20:33.945 00000200 91 e4 3b b2 59 28 c4 f3 b2 f4 5e f0 bf a4 1d 29 ..;.Y(....^....) 00:20:33.945 00000210 c5 50 e2 b1 a0 6e 0a 0f 50 a6 0c 79 a6 5f a1 74 .P...n..P..y._.t 00:20:33.945 00000220 db 05 79 49 f6 8d 89 35 d8 b6 96 68 c4 bf 10 89 ..yI...5...h.... 00:20:33.945 00000230 e4 f5 94 90 10 b9 00 da 00 fc d5 9e 01 a4 b6 0a ................ 00:20:33.945 00000240 9d 52 96 fc ee d7 e2 28 97 93 aa bd 7b 28 22 90 .R.....(....{(". 00:20:33.945 00000250 ad dd 0c bd 73 30 78 7c 49 cb 6b aa 39 ab 43 6d ....s0x|I.k.9.Cm 00:20:33.946 00000260 be 75 b6 fd 1f bb de df 7e ea 78 c5 9d d4 3d 0c .u......~.x...=. 00:20:33.946 00000270 a0 ff c1 e7 cc f2 13 9c cb 4f f0 ac 6b f6 cc f2 .........O..k... 00:20:33.946 00000280 0d 1e 5d ff b9 7a d8 b8 d7 9e 61 9f 96 d6 5a 5b ..]..z....a...Z[ 00:20:33.946 00000290 57 64 c8 c2 a9 68 57 d9 33 b6 83 a4 67 f5 3e d2 Wd...hW.3...g.>. 00:20:33.946 000002a0 90 43 af d7 ea b8 82 22 6a 8e 58 82 d9 50 ca fa .C....."j.X..P.. 00:20:33.946 000002b0 82 b9 b5 fa 0c 94 87 37 e5 80 58 55 22 90 08 03 .......7..XU"... 00:20:33.946 000002c0 61 ea 94 79 36 15 f7 ca ba 29 a9 69 e7 18 4b d2 a..y6....).i..K. 00:20:33.946 000002d0 aa 09 32 00 61 61 a1 0e 18 97 dd 10 35 42 f9 25 ..2.aa......5B.% 00:20:33.946 000002e0 4b 5f 5e b9 14 22 ce 51 dd 45 82 4c da 7a 6f a0 K_^..".Q.E.L.zo. 
00:20:33.946 000002f0 2c ad 13 3e 4a de 9e e5 f9 d7 13 52 97 db 10 6a ,..>J......R...j 00:20:33.946 00000300 a6 eb 09 b7 5a d3 d6 0d 1b 06 29 53 c8 9d 7c 65 ....Z.....)S..|e 00:20:33.946 00000310 07 d0 45 41 ec 3b e0 40 4d 54 bd 2a 5c 01 5f db ..EA.;.@MT.*\._. 00:20:33.946 00000320 c3 18 cf 56 f2 b3 a6 a2 88 e7 e8 1c f4 48 4e e7 ...V.........HN. 00:20:33.946 00000330 e9 c9 5c 18 d4 d0 fa d3 d3 2e 0a 9c 57 3a 4e 4c ..\.........W:NL 00:20:33.946 00000340 a0 8e 2f 68 f1 33 50 03 ee 6b 84 8e 19 0e d7 60 ../h.3P..k.....` 00:20:33.946 00000350 67 2d b0 54 e2 55 53 43 eb 68 92 86 85 36 3c 3f g-.T.USC.h...6e..iD.D.NJ 00:20:33.946 000001c0 1e 54 87 66 10 7c ea 0b de 1a 20 77 35 c2 a9 0f .T.f.|.... w5... 00:20:33.946 000001d0 1c 61 54 bd 17 8b 0d d3 3b 79 c6 73 f4 2e 30 96 .aT.....;y.s..0. 00:20:33.946 000001e0 f2 a5 d7 ef 47 c9 78 41 4e d8 e1 20 87 a8 ec e7 ....G.xAN.. .... 00:20:33.946 000001f0 46 d9 0d 80 a8 ff 71 52 9f 66 ad f9 36 fc f9 28 F.....qR.f..6..( 00:20:33.946 00000200 85 ae ad 38 2c 88 5c 18 8f 08 b4 e2 64 18 82 72 ...8,.\.....d..r 00:20:33.946 00000210 83 b1 80 05 2e 96 3b 9c fd f6 07 67 1b bb e7 5c ......;....g...\ 00:20:33.946 00000220 97 1c 41 d4 48 1d 65 1c bb c6 c9 34 ee 0e fd 7e ..A.H.e....4...~ 00:20:33.946 00000230 1b bd 87 ad 72 54 71 4e 90 89 a3 d1 ba 4d 16 cf ....rTqN.....M.. 00:20:33.946 00000240 93 8e 06 67 0a 54 b9 14 9f 0f 0f 2f 32 55 9b 49 ...g.T...../2U.I 00:20:33.946 00000250 80 e0 72 8e 61 76 39 53 62 6e 41 7e 91 1e 61 6f ..r.av9SbnA~..ao 00:20:33.946 00000260 2b 5f 77 d9 9e 92 0a c2 a9 9f de 12 01 83 16 d4 +_w............. 00:20:33.946 00000270 ca 17 63 2a 44 cf a6 41 2f 04 26 14 67 9f 80 8b ..c*D..A/.&.g... 00:20:33.946 00000280 bc 19 ec 7e 5f 03 73 93 78 08 d2 e0 1c 42 69 a5 ...~_.s.x....Bi. 00:20:33.946 00000290 e5 76 8d d1 57 9f 19 43 68 76 0f 26 2d 53 2c 4f .v..W..Chv.&-S,O 00:20:33.946 000002a0 86 6b 05 cd 3f 46 41 58 dd ad f3 5a a4 df 35 18 .k..?FAX...Z..5. 00:20:33.946 000002b0 cd 23 67 ac 85 b9 56 e0 63 0b 3d 73 13 d2 76 d8 .#g...V.c.=s..v. 00:20:33.946 000002c0 f0 8e d2 c7 c8 e3 55 97 4c bf d8 55 2a a2 a0 76 ......U.L..U*..v 00:20:33.946 000002d0 52 a2 b0 87 82 ec 8c af da b4 d2 68 d1 6b e1 5e R..........h.k.^ 00:20:33.946 000002e0 7e 4b 61 d1 4b e3 e0 92 d6 94 0b 90 28 3a 4c 66 ~Ka.K.......(:Lf 00:20:33.946 000002f0 36 14 62 07 9c 39 3f 60 ce 83 02 c5 4b d0 42 61 6.b..9?`....K.Ba 00:20:33.946 00000300 fe a1 c5 5e 47 14 ca 6d a8 6d ab a2 ab 69 b5 17 ...^G..m.m...i.. 00:20:33.946 00000310 54 21 7e 5f 74 02 52 e3 4b 5a 47 d4 f5 f9 07 f5 T!~_t.R.KZG..... 00:20:33.946 00000320 7d 78 b8 ff 2d 6e 2c cd 24 92 01 06 7c ca 9c 51 }x..-n,.$...|..Q 00:20:33.946 00000330 32 d9 d7 7f 4d c1 18 cd 13 60 69 85 e2 86 fb 7f 2...M....`i..... 00:20:33.946 00000340 31 89 b4 8e cf 83 23 bc eb be 8e 9f 49 36 4b 75 1.....#.....I6Ku 00:20:33.946 00000350 e9 05 ff f0 9f ea 2a a7 5a 0e 44 f6 20 d6 f9 d3 ......*.Z.D. ... 00:20:33.946 00000360 eb 88 e9 a4 cc 44 f6 d1 1e bd 49 b1 eb f1 16 2d .....D....I....- 00:20:33.946 00000370 4a f7 f7 b0 e2 ad af 4b 31 c5 5b 7e ec 0a 8b a3 J......K1.[~.... 00:20:33.946 00000380 48 3b 33 41 d6 a2 ec 2d 2d e5 eb c3 3c 86 c1 b5 H;3A...--...<... 00:20:33.946 00000390 d3 85 7e 50 a7 24 ff b0 99 df 21 75 9c da 81 44 ..~P.$....!u...D 00:20:33.946 000003a0 34 7b 32 d2 45 b6 67 d7 6e 6f 50 31 de 20 ae ef 4{2.E.g.noP1. .. 00:20:33.946 000003b0 c8 0c b7 b3 9a db 85 ca c6 75 32 2d 4e 42 16 c3 .........u2-NB.. 00:20:33.946 000003c0 e5 fb 05 17 4e 8b 1a 93 4d 68 e9 70 50 4a 28 f2 ....N...Mh.pPJ(. 
00:20:33.946 000003d0 4f d9 f4 61 af 7e 04 99 9f 11 e0 90 a2 db 2d 83 O..a.~........-. 00:20:33.946 000003e0 ff 93 92 6a ab 84 d1 96 9d 3e 4b 9c f7 45 eb 0a ...j.....>K..E.. 00:20:33.946 000003f0 92 4a 51 63 64 8c 75 4d 65 07 bf 36 2e 87 c1 7a .JQcd.uMe..6...z 00:20:33.946 dh secret: 00:20:33.946 00000000 a9 7c d1 ec 27 88 ac 05 c1 3b 1e 32 e3 20 04 d1 .|..'....;.2. .. 00:20:33.946 00000010 7b cc fb 12 e9 51 e2 45 9b ec 33 ab 16 8b ae 71 {....Q.E..3....q 00:20:33.946 00000020 00 95 8a 66 8f b4 c4 ef 69 09 cc ec c0 cd 0f c1 ...f....i....... 00:20:33.946 00000030 a1 f6 d6 88 de 05 ae f3 5d 00 64 95 ee 64 4c ae ........].d..dL. 00:20:33.946 00000040 06 eb 72 a2 c3 97 a3 7f f1 4e 72 ee cb d0 dc 94 ..r......Nr..... 00:20:33.946 00000050 15 6d 77 7d 85 2a ce 82 68 82 17 27 d4 e1 9b 38 .mw}.*..h..'...8 00:20:33.946 00000060 c0 ad 76 93 be 17 c9 a4 02 d8 b2 a8 9d 9c 2b 39 ..v...........+9 00:20:33.946 00000070 9c fb f2 33 e7 eb ba c8 82 00 f8 6d 14 fe c2 dd ...3.......m.... 00:20:33.946 00000080 42 09 fd 1f ed 4f cd 05 05 22 99 c8 d7 5f d2 57 B....O..."..._.W 00:20:33.946 00000090 cb 55 b7 08 a2 db 6d a4 77 44 50 c8 37 a1 a6 08 .U....m.wDP.7... 00:20:33.946 000000a0 79 00 85 07 ef 20 3b 0c 7c b5 67 78 fa c7 22 91 y.... ;.|.gx..". 00:20:33.946 000000b0 f8 47 bd e1 d0 8e 26 38 b6 ae dd 4f f5 3d 95 e5 .G....&8...O.=.. 00:20:33.946 000000c0 05 b0 33 59 a1 77 92 5d 04 a8 05 2d 51 bc d8 e1 ..3Y.w.]...-Q... 00:20:33.946 000000d0 ea 32 dd 93 f6 f4 37 cc 62 72 51 c7 04 5f ef 93 .2....7.brQ.._.. 00:20:33.946 000000e0 88 16 e0 b4 17 82 39 3e c4 2a 81 47 9e b7 40 a8 ......9>.*.G..@. 00:20:33.946 000000f0 af 1e 0a 46 53 36 64 ad 5c 4d 7f ec 4b b9 4a 00 ...FS6d.\M..K.J. 00:20:33.946 00000100 1d b8 19 f7 c9 c8 43 36 c7 8f 0c e6 f4 68 16 98 ......C6.....h.. 00:20:33.946 00000110 bc 1d 7c 9b 29 9a 6e 76 17 92 d3 e4 a2 21 bd 6e ..|.).nv.....!.n 00:20:33.946 00000120 38 b0 ef 39 2c c0 db ba e8 0f 08 fa eb 6a e7 2b 8..9,........j.+ 00:20:33.946 00000130 96 9b 6c 34 64 7c db 04 17 41 fd b7 f5 d9 4d 53 ..l4d|...A....MS 00:20:33.946 00000140 e7 df 1e f9 ce eb 1c e4 bb 95 eb ab cd c7 29 fa ..............). 00:20:33.946 00000150 03 42 ab b2 20 bf ec 5f 50 e7 7b 23 3c f4 31 2e .B.. .._P.{#<.1. 00:20:33.946 00000160 24 2c 9c bf 92 db 60 9d 7a ac 5f ca 86 e8 d5 48 $,....`.z._....H 00:20:33.946 00000170 7b 28 6a e6 64 1e 9f a6 1c 96 00 47 39 bc 7b 1b {(j.d......G9.{. 00:20:33.946 00000180 bc df 85 dd 87 cd 31 bd 4e e7 01 2c 4e fa a1 e6 ......1.N..,N... 00:20:33.946 00000190 95 b8 93 3c b7 97 8d 5e 23 fa a2 19 29 bc 24 b2 ...<...^#...).$. 00:20:33.946 000001a0 d2 57 12 81 7a 58 22 ef ed b4 4f 25 c6 e0 7e 58 .W..zX"...O%..~X 00:20:33.946 000001b0 6e 23 e3 04 fe eb 00 3d 70 a2 a4 39 07 17 0d d2 n#.....=p..9.... 00:20:33.946 000001c0 85 4d 1f f2 b5 71 40 9f 74 7a 0a 86 77 57 7f e5 .M...q@.tz..wW.. 00:20:33.946 000001d0 7c 23 db 5f db f0 49 f5 0c c2 e8 42 ed 32 22 bd |#._..I....B.2". 00:20:33.946 000001e0 2c 7d 79 f3 ed 6d c6 d5 89 96 d4 09 f9 ec 61 d0 ,}y..m........a. 00:20:33.946 000001f0 5c cc d2 2a ce 20 51 8b 3d dc ae 53 7e b8 48 5f \..*. Q.=..S~.H_ 00:20:33.946 00000200 ac 71 d1 69 c7 bf 87 b3 a1 32 d2 91 19 f9 45 39 .q.i.....2....E9 00:20:33.946 00000210 92 13 a4 71 2a ba 09 5c 67 ee cd 15 de b3 9c ea ...q*..\g....... 00:20:33.946 00000220 fc fc ff a1 87 54 44 d9 19 68 1d 2b 7f 00 15 a1 .....TD..h.+.... 00:20:33.946 00000230 7e ce b5 40 a5 0c a6 c1 9b 88 29 79 80 bf c3 c2 ~..@......)y.... 00:20:33.946 00000240 a7 fe 12 ff 33 03 83 22 8d ed 65 0e c8 b9 17 e6 ....3.."..e..... 
00:20:33.946 00000250 48 59 9e 98 49 6d 6a cc e7 3f c4 42 45 c8 21 a1 HY..Imj..?.BE.!. 00:20:33.946 00000260 f5 15 55 bc fc d8 b4 22 09 3d f7 88 4d df e2 f2 ..U....".=..M... 00:20:33.946 00000270 e3 fa 96 a0 b2 f9 e4 af 31 1e 3d 97 f7 6f de 8c ........1.=..o.. 00:20:33.946 00000280 da c4 d7 1e 11 c1 5b a2 37 bd b8 a4 19 48 db 3b ......[.7....H.; 00:20:33.946 00000290 4d 24 a7 50 8e 67 87 bc a4 fb b6 54 06 f3 11 2d M$.P.g.....T...- 00:20:33.946 000002a0 f1 6a 55 da fe d0 c4 3b 4d 67 0a a0 26 19 92 f9 .jU....;Mg..&... 00:20:33.946 000002b0 92 25 1e a0 34 5b 98 61 3a be 8f 5b d5 3e 4e b3 .%..4[.a:..[.>N. 00:20:33.946 000002c0 ae da 5f d3 4e 84 5c 0c bd 50 24 c3 6f ee 29 c5 .._.N.\..P$.o.). 00:20:33.946 000002d0 5f a9 47 37 5b 2b c2 a1 cb 53 e8 0b 58 6f b2 81 _.G7[+...S..Xo.. 00:20:33.946 000002e0 90 ad 57 99 85 a5 80 a6 a1 18 2b bb d6 e0 f2 3f ..W.......+....? 00:20:33.946 000002f0 16 ac c9 5f ad c0 cb 9a 58 90 3b 95 3f 13 89 e6 ..._....X.;.?... 00:20:33.946 00000300 b1 b1 90 3b a4 aa ce e4 27 04 56 dd d8 19 37 8e ...;....'.V...7. 00:20:33.946 00000310 8a 71 5b 66 0f 77 9e 94 f7 71 6d 53 bc 3e e3 b6 .q[f.w...qmS.>.. 00:20:33.946 00000320 14 fe e6 4a 6d 37 a2 a5 91 52 5e 32 b4 7c 31 c6 ...Jm7...R^2.|1. 00:20:33.946 00000330 d7 88 7a 64 10 2f 8a fc 2b 65 4b 3d 5c 57 22 cf ..zd./..+eK=\W". 00:20:33.946 00000340 3b 0e 82 44 73 f6 14 13 28 32 46 8b 8f 62 09 f9 ;..Ds...(2F..b.. 00:20:33.946 00000350 b7 ce c2 a9 67 dc ae 4d 37 44 25 ce 5e 1c 2b 68 ....g..M7D%.^.+h 00:20:33.946 00000360 4c 5a de 83 8f e9 c0 de 4b 36 3c 15 50 43 bf c3 LZ......K6<.PC.. 00:20:33.946 00000370 19 66 7c ca 24 12 c2 f5 8b 2a d6 f0 e2 54 a7 10 .f|.$....*...T.. 00:20:33.946 00000380 ac ac f4 70 6c 92 23 b7 9e db 64 30 13 bf 36 58 ...pl.#...d0..6X 00:20:33.946 00000390 82 d7 4e 07 d2 b9 29 9d 8f 8d 0b f5 71 33 17 05 ..N...).....q3.. 00:20:33.946 000003a0 09 fd 47 2b 11 59 54 00 04 31 bb e3 ed c9 38 9c ..G+.YT..1....8. 00:20:33.946 000003b0 cd 77 7d ca 9c a7 7c 15 d1 2f 07 4b c7 07 d5 18 .w}...|../.K.... 00:20:33.946 000003c0 6d 01 0d a1 c0 68 b7 c2 4f c8 6c 00 c1 6c 6c 98 m....h..O.l..ll. 00:20:33.946 000003d0 3c d4 4f e2 5f 9f 6e bd 9b 43 e0 2e ac 59 76 f1 <.O._.n..C...Yv. 00:20:33.946 000003e0 39 ff 1b c8 d5 e7 fa a1 ad 17 2b ee b0 e1 a0 88 9.........+..... 00:20:33.946 000003f0 c9 e3 10 cd 0c 5f 2d ba 43 06 10 5c 7a 58 1a 0d ....._-.C..\zX.. 
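The "ctrlr pubkey", "host pubkey" and "dh secret" dumps above are the two public halves and the resulting shared value of the finite-field Diffie-Hellman exchange that DH-HMAC-CHAP runs for the negotiated group (dhgroup 5, ffdhe8192, hence the 1024-byte values). The sketch below is only a toy illustration of that relationship, using a deliberately tiny modulus instead of the real RFC 7919 group; it is not the nvme_auth.c implementation, just a way to see why both endpoints arrive at the same dh secret that the log dumps.

```python
# Toy sketch of the finite-field DH exchange whose byte dumps appear above
# ("ctrlr pubkey", "host pubkey", "dh secret"). Assumption: the modulus below
# is a small placeholder prime, NOT an ffdhe group prime, and this is not
# SPDK's nvme_auth.c code.
import secrets

p = 0xFFFFFFFB   # toy prime modulus (2**32 - 5), placeholder only
g = 2            # generator

def keypair():
    priv = secrets.randbelow(p - 2) + 1   # private exponent
    pub = pow(g, priv, p)                 # the "pubkey" value the log hex-dumps
    return priv, pub

ctrlr_priv, ctrlr_pub = keypair()         # controller side
host_priv,  host_pub  = keypair()         # host side

# Each side raises the peer's public value to its own private exponent;
# both arrive at the same "dh secret".
secret_host  = pow(ctrlr_pub, host_priv, p)
secret_ctrlr = pow(host_pub, ctrlr_priv, p)
assert secret_host == secret_ctrlr
print(hex(secret_host))
```

In the real handshake the shared secret is then folded into the HMAC response carried by the send_reply message that follows (key=keyN, hash=2/sha384 in the entries below), so this sketch covers only the key-exchange half of the exchange, not the challenge/response computation.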
00:20:33.946 [2024-09-27 15:25:23.919741] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=2, dhgroup=5, seq=3428451792, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.946 [2024-09-27 15:25:23.919844] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.947 [2024-09-27 15:25:24.000970] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.947 [2024-09-27 15:25:24.001015] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.947 [2024-09-27 15:25:24.001025] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.947 [2024-09-27 15:25:24.001051] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.947 [2024-09-27 15:25:24.190457] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.947 [2024-09-27 15:25:24.190476] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 2 (sha384) 00:20:33.947 [2024-09-27 15:25:24.190483] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.947 [2024-09-27 15:25:24.190532] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.947 [2024-09-27 15:25:24.190555] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.947 ctrlr pubkey: 00:20:33.947 00000000 7c 53 a6 84 3a 8b d2 4d e3 9d 44 28 71 04 08 55 |S..:..M..D(q..U 00:20:33.947 00000010 7d 6d 1b c0 9a 40 57 32 c5 7f cc f5 48 fa e2 99 }m...@W2....H... 00:20:33.947 00000020 d2 48 38 f8 69 55 e0 99 cf ec 5c 53 93 52 29 3a .H8.iU....\S.R): 00:20:33.947 00000030 78 40 67 e1 0a 6e b7 ee 03 57 e7 52 b3 ca ea 35 x@g..n...W.R...5 00:20:33.947 00000040 48 e3 3f 11 5b a5 e4 3a ce 3b e1 93 f5 b5 75 ba H.?.[..:.;....u. 00:20:33.947 00000050 dc 92 6d fa 09 15 8f 3f ea 7f 13 ab 4b a0 07 4b ..m....?....K..K 00:20:33.947 00000060 4f 41 e4 3a 3d 4c 74 e8 cc b7 be 15 31 9a 1b b7 OA.:=Lt.....1... 00:20:33.947 00000070 e6 fd 81 25 0c e6 49 41 df f8 f0 4c 0f 95 44 7b ...%..IA...L..D{ 00:20:33.947 00000080 9f 44 b1 e3 65 e2 66 01 4c 6a e9 67 c9 f3 f0 1d .D..e.f.Lj.g.... 00:20:33.947 00000090 b7 5e a6 a3 4b ba b8 0d d9 87 b2 23 f0 e3 1a 16 .^..K......#.... 00:20:33.947 000000a0 33 71 ee c4 46 35 bc 4c 41 40 69 e7 91 65 45 81 3q..F5.LA@i..eE. 00:20:33.947 000000b0 85 b0 dc 1c 57 34 30 27 75 ce 37 4c ab 7d 39 a1 ....W40'u.7L.}9. 00:20:33.947 000000c0 12 98 52 78 c8 a9 d1 ce 7e 96 12 32 f2 93 79 96 ..Rx....~..2..y. 00:20:33.947 000000d0 04 9b 17 49 b7 0a 22 e8 66 a9 89 6f 56 07 12 82 ...I..".f..oV... 00:20:33.947 000000e0 f1 07 f6 ae 01 42 5d b8 1b 90 36 d2 51 86 30 94 .....B]...6.Q.0. 
00:20:33.947 000000f0 63 eb 9a a7 27 9c c0 af df 28 5f 3e 81 7b 99 79 c...'....(_>.{.y 00:20:33.947 00000100 5f 47 b8 d7 7b 2e 58 7d 8c 70 4b 66 37 b2 79 20 _G..{.X}.pKf7.y 00:20:33.947 00000110 9a b5 d3 53 3a b0 b1 b8 87 ea ae 4b cf 11 88 d3 ...S:......K.... 00:20:33.947 00000120 a1 60 ba 6a d3 c0 bc 88 9e 1e 1a f7 84 6f 4a 0e .`.j.........oJ. 00:20:33.947 00000130 c1 de d3 23 1c f5 87 eb 95 cd a0 c2 9f db 8a 50 ...#...........P 00:20:33.947 00000140 72 27 bc 65 61 40 80 bb 78 1f 86 82 bc 36 ed a1 r'.ea@..x....6.. 00:20:33.947 00000150 0b 10 1e 7d d4 f3 73 13 e3 dd 42 85 fc 25 5b cf ...}..s...B..%[. 00:20:33.947 00000160 a2 6a 94 8b ce 2b 21 6d ea 48 d5 20 c6 ac 4e 9b .j...+!m.H. ..N. 00:20:33.947 00000170 a4 5c 72 e1 70 8c c4 0d 89 d3 aa 72 47 cc 41 2c .\r.p......rG.A, 00:20:33.947 00000180 03 ee d7 87 73 35 f8 86 52 72 b7 53 c4 aa f4 64 ....s5..Rr.S...d 00:20:33.947 00000190 25 43 22 79 f8 7d 8a cc 95 b8 b3 6b d2 5c ed 35 %C"y.}.....k.\.5 00:20:33.947 000001a0 a3 33 0c d5 6b c5 a5 2f da d6 f5 c9 7a d9 e3 50 .3..k../....z..P 00:20:33.947 000001b0 66 7c 2e 4c a0 70 a7 0f 60 e2 b3 bf 0b 20 20 fd f|.L.p..`.... . 00:20:33.947 000001c0 20 6d 0f e4 e1 fe 34 2b 68 14 13 8e 50 53 76 32 m....4+h...PSv2 00:20:33.947 000001d0 28 4e 75 f1 c9 7f b7 21 1d a6 d8 48 27 e0 0b 40 (Nu....!...H'..@ 00:20:33.947 000001e0 b4 57 7d 97 c8 6e b8 4e 73 8f 92 64 95 ec 1e 45 .W}..n.Ns..d...E 00:20:33.947 000001f0 ef 68 53 7c 51 ea d6 c2 60 e7 c4 b4 36 18 23 d0 .hS|Q...`...6.#. 00:20:33.947 00000200 2c 2d f4 b1 61 cf b9 f1 dd 18 8f 0e 00 6b 9f 71 ,-..a........k.q 00:20:33.947 00000210 e7 a5 82 76 dc 30 7e 6c d4 62 44 5a e6 61 98 72 ...v.0~l.bDZ.a.r 00:20:33.947 00000220 4b f7 fd f8 99 0f cd f9 c8 64 00 ac 1f 4f 15 bd K........d...O.. 00:20:33.947 00000230 0a 3f d4 d9 e0 cf 50 55 d3 ad 66 59 87 ec f0 c8 .?....PU..fY.... 00:20:33.947 00000240 40 34 e3 2d 08 bb b8 34 50 3c 78 c5 ce bf d0 26 @4.-...4P.... 00:20:33.947 000002c0 a9 73 73 05 e7 c6 8c 71 d3 e6 be 02 90 d8 75 12 .ss....q......u. 00:20:33.947 000002d0 47 43 68 5f a3 82 f7 11 31 09 37 dc a6 2a 6a 2e GCh_....1.7..*j. 00:20:33.947 000002e0 4e 15 c9 59 38 e5 04 9f 10 7e 86 2d ac dd 8f 72 N..Y8....~.-...r 00:20:33.947 000002f0 b8 c2 d2 f1 f5 d9 17 7c 45 40 28 4b fa a8 2c 91 .......|E@(K..,. 00:20:33.947 00000300 30 73 78 a6 85 09 14 d5 22 8f 34 bb f4 62 c2 e5 0sx.....".4..b.. 00:20:33.947 00000310 f9 a2 9d 8d 47 81 bd 1b 76 83 46 b1 fb c7 8c b2 ....G...v.F..... 00:20:33.947 00000320 3d 92 ad 5a ff 1b e1 83 12 26 c0 52 d1 36 36 fe =..Z.....&.R.66. 00:20:33.947 00000330 a7 3a 28 49 41 6c ee 87 8e 44 13 5e 4a 27 08 2c .:(IAl...D.^J'., 00:20:33.947 00000340 4a 38 6d 44 91 6a 8b 11 26 c2 3d 5b 54 bc 67 8d J8mD.j..&.=[T.g. 00:20:33.947 00000350 11 de 35 83 c7 1e 91 ff 92 9d 4e 10 a6 b8 37 98 ..5.......N...7. 00:20:33.947 00000360 a1 2b e3 39 46 ae 6d 46 e0 78 5b 81 16 50 6a 7a .+.9F.mF.x[..Pjz 00:20:33.947 00000370 d4 f5 46 a2 dc 61 bf f6 08 9b 3c 8d 94 2c 5e e3 ..F..a....<..,^. 00:20:33.947 00000380 05 49 e7 0b 02 1c 83 02 be e5 8a 8b 31 00 71 00 .I..........1.q. 00:20:33.947 00000390 06 ad 23 09 92 4b e3 eb ea fa b5 dc 96 f9 d9 30 ..#..K.........0 00:20:33.947 000003a0 27 1b 3d 19 fa 14 9f 85 7c 90 2a 96 89 4b a8 97 '.=.....|.*..K.. 00:20:33.947 000003b0 4c 74 aa 67 79 8f 0b 94 67 32 42 9e 3a 75 09 b5 Lt.gy...g2B.:u.. 
00:20:33.947 000003c0 47 04 dc 94 68 c9 cd 4d 02 5d ac 1f 1f aa f8 31 G...h..M.].....1 00:20:33.947 000003d0 75 ca aa f6 d1 53 96 a3 57 4e 6d eb 77 11 58 3b u....S..WNm.w.X; 00:20:33.947 000003e0 cf b5 7a c6 c0 a5 79 e3 8f d6 45 90 d3 34 bc d0 ..z...y...E..4.. 00:20:33.947 000003f0 e1 47 f8 83 d1 eb 01 09 46 ef 94 34 bc 13 fb 5c .G......F..4...\ 00:20:33.947 host pubkey: 00:20:33.947 00000000 a9 6d 59 a0 e8 06 49 0d 36 f3 af b6 a1 8d f8 85 .mY...I.6....... 00:20:33.947 00000010 ec ad b0 29 47 46 28 09 14 42 02 91 0e c1 b5 20 ...)GF(..B..... 00:20:33.947 00000020 f1 18 08 75 7c cd 20 32 8c 00 9c 3b 6c bc 35 04 ...u|. 2...;l.5. 00:20:33.947 00000030 42 68 11 90 df b7 ae ad d5 a9 ca dc 41 03 bc cc Bh..........A... 00:20:33.947 00000040 1f 55 0d 48 18 7b b8 1f 0a d5 c6 d5 50 1c be 2c .U.H.{......P.., 00:20:33.947 00000050 da c9 7e 7d 36 b0 0a 54 2a 0b ce 3b 0f 75 6b 58 ..~}6..T*..;.ukX 00:20:33.947 00000060 91 3c c1 03 6a 5b 1b 95 4e 06 e5 8e 2c ff c7 29 .<..j[..N...,..) 00:20:33.947 00000070 81 5c 3d 8e a6 ac 0f 14 69 02 87 7e 83 a2 3e 42 .\=.....i..~..>B 00:20:33.947 00000080 12 25 0b 28 8c d4 e5 39 75 c7 fc f2 cd 1e 9c 68 .%.(...9u......h 00:20:33.947 00000090 f7 73 09 40 63 0d 2a 43 40 3e e9 28 62 ea 7a c4 .s.@c.*C@>.(b.z. 00:20:33.947 000000a0 8b 95 f9 e0 3d 73 51 ac a7 47 5c 55 a9 84 cc e3 ....=sQ..G\U.... 00:20:33.947 000000b0 86 81 80 87 69 81 92 3b c6 71 c3 65 87 24 e1 f9 ....i..;.q.e.$.. 00:20:33.947 000000c0 77 2e 1e 14 ba 4b 30 87 91 0e 3f 5e 22 d2 b1 b4 w....K0...?^"... 00:20:33.947 000000d0 dc 89 9d 51 99 38 21 c1 f0 ac f0 b5 af df 01 50 ...Q.8!........P 00:20:33.947 000000e0 31 01 fc 1a b6 f1 f1 a7 76 bc a2 38 0c d8 a2 74 1.......v..8...t 00:20:33.947 000000f0 62 ef d1 32 ac 28 93 b4 1b 1f d1 4c cb 1f 9a 3e b..2.(.....L...> 00:20:33.947 00000100 07 4c a1 2f 46 89 db 00 6d ec ca 0f 40 9c 9c 43 .L./F...m...@..C 00:20:33.947 00000110 d9 96 bd ee 99 ea ee 13 c0 91 c0 c8 d3 ed 7a 69 ..............zi 00:20:33.947 00000120 bd ba 70 86 da 94 0e c4 64 25 f3 f1 c6 d5 c2 47 ..p.....d%.....G 00:20:33.947 00000130 21 b9 a4 9d 08 a8 f8 70 0d ac 82 e2 33 50 f0 2e !......p....3P.. 00:20:33.947 00000140 a1 dd a0 9e 67 47 13 e1 2f 50 c7 60 db 26 b5 d5 ....gG../P.`.&.. 00:20:33.947 00000150 c4 80 b1 61 39 49 b4 4c fe aa 96 05 31 43 e6 66 ...a9I.L....1C.f 00:20:33.947 00000160 3d e1 17 c1 8e 75 0f df 44 9b bd ad ec f8 cf 95 =....u..D....... 00:20:33.947 00000170 40 18 75 35 7a 2f 2d 51 67 f3 47 14 b1 ca 26 59 @.u5z/-Qg.G...&Y 00:20:33.947 00000180 92 e3 4d 3f f0 b3 ee 2b c3 78 7f 62 bd 94 57 ef ..M?...+.x.b..W. 00:20:33.947 00000190 7d 25 fc fa 8d a0 52 65 aa f9 c9 66 6f cb 67 75 }%....Re...fo.gu 00:20:33.947 000001a0 70 3a 4c fb 4d 27 e9 7b 0f 1f 8b b7 0c 93 2b 71 p:L.M'.{......+q 00:20:33.947 000001b0 4f 9c 15 22 98 1f d9 47 ad 8a 30 4d 7c 48 6b a4 O.."...G..0M|Hk. 00:20:33.947 000001c0 94 e5 ee 0a 2c 14 d5 1f 02 f6 cf 36 6b e0 00 9f ....,......6k... 00:20:33.947 000001d0 ea a9 b8 a2 6f 59 25 07 08 cd ad 1a 5e 15 62 69 ....oY%.....^.bi 00:20:33.947 000001e0 71 2e 58 28 ef 1e 7f 7c fb 5e 57 0b 18 db d2 90 q.X(...|.^W..... 00:20:33.947 000001f0 ba 94 0c 47 ed 77 2b 07 0f 9e 3c ae 7d 7e 98 04 ...G.w+...<.}~.. 00:20:33.947 00000200 13 e7 72 cd 46 0c 1d 6c 97 06 df 24 85 a3 86 a9 ..r.F..l...$.... 00:20:33.947 00000210 6c 83 9d 86 32 a8 4d 04 a3 ad be ee 3c 40 97 e0 l...2.M.....<@.. 00:20:33.947 00000220 f0 33 b8 a8 8f 45 bb 48 bb 77 3c ff 37 57 5d 4a .3...E.H.w<.7W]J 00:20:33.947 00000230 ff c6 33 80 14 82 90 a4 11 79 ce 48 e2 26 b7 ee ..3......y.H.&.. 
00:20:33.947 00000240 48 8b 40 db 72 f3 f6 dd 48 d2 82 53 39 06 77 77 H.@.r...H..S9.ww 00:20:33.947 00000250 80 eb a7 83 44 7d 28 15 3c 02 f7 09 b5 1d 08 6d ....D}(.<......m 00:20:33.947 00000260 be 2e 9e ed dc 1a 7e bc 1c 40 19 09 be 53 83 85 ......~..@...S.. 00:20:33.947 00000270 a4 5a d1 56 ba 19 43 13 d9 60 a1 4d 0a 42 ab fe .Z.V..C..`.M.B.. 00:20:33.947 00000280 61 4a f5 2c a3 e3 b5 23 23 da f2 57 7b cd fe 7f aJ.,...##..W{... 00:20:33.947 00000290 7c 20 6c c9 45 51 8a 01 53 e3 d3 cd 67 53 db be | l.EQ..S...gS.. 00:20:33.947 000002a0 e0 c3 e6 3a 49 b8 fc 74 17 bb e4 be d8 01 5e cc ...:I..t......^. 00:20:33.947 000002b0 83 8d 62 fc d9 cd 85 07 f2 11 43 1b 51 d9 94 ff ..b.......C.Q... 00:20:33.947 000002c0 40 05 a5 79 13 7d 90 bf e8 77 e9 67 fc 6c c1 6d @..y.}...w.g.l.m 00:20:33.947 000002d0 6d 2e 81 23 68 df 7a de ef ea b6 3b bb 1f 4a 3e m..#h.z....;..J> 00:20:33.947 000002e0 37 60 f5 ee d5 88 ad ff f2 1e 6d 5a 60 02 9f 4d 7`........mZ`..M 00:20:33.947 000002f0 28 75 55 a0 d4 84 20 82 74 d2 cb 0b 79 05 67 1a (uU... .t...y.g. 00:20:33.947 00000300 10 be 17 2d 7b c6 88 23 fa 12 4d d9 2f 6c d8 91 ...-{..#..M./l.. 00:20:33.947 00000310 20 63 2b ee c4 4c dc 2e 0b 6a 39 67 b7 bc f9 e1 c+..L...j9g.... 00:20:33.947 00000320 71 e3 5d 44 ab c4 4f 8a 33 56 40 9b 5c 53 7c 8e q.]D..O.3V@.\S|. 00:20:33.947 00000330 6d b7 44 d5 89 3e 01 0c c1 fe 43 2c a1 12 3b eb m.D..>....C,..;. 00:20:33.947 00000340 e1 d4 cb 97 ed 30 55 dc a9 39 4b 14 37 09 31 ae .....0U..9K.7.1. 00:20:33.947 00000350 1f 73 33 1f 3a 26 c5 46 c6 90 97 02 55 aa 5f 98 .s3.:&.F....U._. 00:20:33.947 00000360 3b e3 7c f4 8a ac 3c c8 5e cd 70 ce fc b3 af a2 ;.|...<.^.p..... 00:20:33.947 00000370 fe 55 08 4f e9 93 2f de eb f3 69 2b a1 68 4b 5b .U.O../...i+.hK[ 00:20:33.947 00000380 ec 9b 9c 9e e2 bd 49 b3 96 52 70 66 ed c7 90 42 ......I..Rpf...B 00:20:33.947 00000390 1d 3d 0e 0a 43 57 d3 5e 47 f7 a3 c2 9b 1f b4 92 .=..CW.^G....... 00:20:33.947 000003a0 9b 0a 29 2e 26 30 d8 d2 e4 3d 33 59 b8 cd d9 7c ..).&0...=3Y...| 00:20:33.947 000003b0 4f 03 53 77 38 bc 1c dd 55 94 d5 a6 86 48 b0 ab O.Sw8...U....H.. 00:20:33.947 000003c0 9a d8 34 71 9b fb 0c 4d 3e 44 e3 c7 0d 5f e5 f3 ..4q...M>D..._.. 00:20:33.947 000003d0 5f 1e 10 56 8f 84 70 7c 26 88 57 e3 2f fa b5 34 _..V..p|&.W./..4 00:20:33.947 000003e0 3c 38 71 47 a6 9c 38 ae 65 1f 06 73 bc 18 92 79 <8qG..8.e..s...y 00:20:33.947 000003f0 3d 16 20 b3 4b 72 af bb eb c3 95 9a dc 24 1a 02 =. .Kr.......$.. 00:20:33.947 dh secret: 00:20:33.947 00000000 40 b4 78 7d 38 b0 db 91 e9 9a 4f 7e 30 e9 56 9e @.x}8.....O~0.V. 00:20:33.947 00000010 f5 96 bc 13 99 f1 c5 45 05 fa aa 2f 3e 77 6e bf .......E.../>wn. 00:20:33.947 00000020 e0 ef 1b fa 8a f9 50 8f 9f 5c b1 cd 8a cc ef b7 ......P..\...... 00:20:33.947 00000030 46 9e ae 7d 68 f6 0a b8 d7 c3 2e ce 1c 94 a6 14 F..}h........... 00:20:33.948 00000040 c1 b2 82 85 35 12 90 be 18 bb 7b 86 c0 05 bb 45 ....5.....{....E 00:20:33.948 00000050 8a bc fb 06 ff 11 a0 4f dc c5 f8 4a 62 16 de 62 .......O...Jb..b 00:20:33.948 00000060 f3 b8 bf 23 02 24 68 bc 1a 95 67 45 d5 f1 42 71 ...#.$h...gE..Bq 00:20:33.948 00000070 39 10 6c 07 2d b4 9e f6 37 1a 4f 44 f5 51 7a 9e 9.l.-...7.OD.Qz. 00:20:33.948 00000080 3b fb e1 c6 69 9b b9 b8 be 07 28 fa 81 cb d9 a6 ;...i.....(..... 00:20:33.948 00000090 91 9a 4c 96 64 b3 94 43 d4 9f 6b c2 ed 9f c6 96 ..L.d..C..k..... 00:20:33.948 000000a0 13 08 64 b8 c7 44 ad df 47 36 eb 32 55 66 df fa ..d..D..G6.2Uf.. 00:20:33.948 000000b0 1c 43 c6 eb 02 15 bf f6 29 60 05 12 33 86 f3 08 .C......)`..3... 
00:20:33.948 000000c0 89 84 6f c5 df d0 71 fe fe 09 17 36 53 58 b5 54 ..o...q....6SX.T 00:20:33.948 000000d0 1f 08 24 f8 1d 9e aa 29 23 2d 5b 8b 4a 89 fa 18 ..$....)#-[.J... 00:20:33.948 000000e0 d2 1a da f9 d7 35 8d 17 8f 2d 8f 63 39 03 d5 2c .....5...-.c9.., 00:20:33.948 000000f0 3a c8 bf cc d8 e3 75 72 57 10 9e ad 32 31 c9 53 :.....urW...21.S 00:20:33.948 00000100 b6 ec 43 92 59 df 14 8f ca e9 b2 24 45 d5 63 69 ..C.Y......$E.ci 00:20:33.948 00000110 c5 10 9d 37 d2 6e 00 eb 2c 01 c1 63 a4 59 56 2d ...7.n..,..c.YV- 00:20:33.948 00000120 d4 11 5f 5a 1b f4 60 40 49 1c 8b 9e a1 73 6d 60 .._Z..`@I....sm` 00:20:33.948 00000130 93 d6 50 5f df cf 92 37 c7 77 09 0d 27 75 15 f1 ..P_...7.w..'u.. 00:20:33.948 00000140 ca 9f 7f 73 32 fc ad 14 98 c7 f2 38 9b 2f 8c 71 ...s2......8./.q 00:20:33.948 00000150 d9 a1 a4 04 c8 42 c4 01 2d 97 c0 12 fa 6a 85 28 .....B..-....j.( 00:20:33.948 00000160 3a e1 5b b0 31 69 de 6f b9 45 8f 37 2c 80 a4 e9 :.[.1i.o.E.7,... 00:20:33.948 00000170 21 8d 90 c0 e3 26 e1 ec e4 6b 92 82 1c 64 70 e8 !....&...k...dp. 00:20:33.948 00000180 6a c7 98 f6 3c 68 35 42 46 66 83 58 a1 7f cc ab j...Y....t...F. 00:20:33.948 00000240 2c de 60 be 7e 05 8b 92 6a c8 46 a8 92 33 00 2c ,.`.~...j.F..3., 00:20:33.948 00000250 9a 6c 53 0c f4 66 33 5b d1 3e aa c7 d3 ec 6d 57 .lS..f3[.>....mW 00:20:33.948 00000260 1f c0 e1 42 76 ab 08 4c 0c 7b d3 1e 3c 7c 65 ed ...Bv..L.{..<|e. 00:20:33.948 00000270 cd 0e b2 62 42 a2 22 76 ea ab fb e5 2e d9 54 67 ...bB."v......Tg 00:20:33.948 00000280 a8 28 0c 1b d8 7c a8 8b bd 5d bb 4f f5 59 27 ff .(...|...].O.Y'. 00:20:33.948 00000290 f0 b3 34 f8 43 17 48 56 af e0 ca d2 76 e4 b1 64 ..4.C.HV....v..d 00:20:33.948 000002a0 3e a9 3e e9 56 db 15 34 fa d5 c4 03 5a 5b 38 c9 >.>.V..4....Z[8. 00:20:33.948 000002b0 15 6e ca ac 82 ca d9 86 62 37 26 ee 72 3d 46 0d .n......b7&.r=F. 00:20:33.948 000002c0 43 46 eb f7 e2 70 1f 48 ae 7c 20 ef 44 17 09 17 CF...p.H.| .D... 00:20:33.948 000002d0 8b e0 41 cc dc 7c e2 f2 fc 60 e9 26 5c f3 48 9a ..A..|...`.&\.H. 00:20:33.948 000002e0 05 e5 be 07 26 e6 28 9d 33 c1 aa f5 55 ba 0a 58 ....&.(.3...U..X 00:20:33.948 000002f0 6b bd a0 08 2b e2 a7 a7 22 56 cb 8d c2 72 ad dd k...+..."V...r.. 00:20:33.948 00000300 b1 7c c5 17 13 03 dc b5 9b 0f 3f 8e 58 90 ef f4 .|........?.X... 00:20:33.948 00000310 63 31 06 81 36 91 a1 b5 6e 23 19 44 64 8d af de c1..6...n#.Dd... 00:20:33.948 00000320 a6 01 84 c6 b5 8d 01 21 a1 e2 d2 d2 c9 9d e0 28 .......!.......( 00:20:33.948 00000330 4b 61 36 ba 3b d8 6b c7 c7 34 6e b6 48 5f 92 3e Ka6.;.k..4n.H_.> 00:20:33.948 00000340 f2 15 df d7 a1 ae 6c d6 c8 48 ab 02 26 2d 99 96 ......l..H..&-.. 00:20:33.948 00000350 98 b7 6b 6d 24 62 be 7b 5b 4b 52 ec 20 93 32 21 ..km$b.{[KR. .2! 00:20:33.948 00000360 8d 8d 7c 15 ad 7e 39 dc ba 8b 97 da ae 66 77 3b ..|..~9......fw; 00:20:33.948 00000370 9d 43 ee 92 5f 3e 1f ff 2e c7 ae 92 bb 83 44 57 .C.._>........DW 00:20:33.948 00000380 7b 46 36 a4 c0 2f 53 95 fc e6 c4 33 ad 0f 0b e5 {F6../S....3.... 00:20:33.948 00000390 d0 64 31 ee 5b 85 6f f4 18 1c 89 7b a5 56 26 fc .d1.[.o....{.V&. 00:20:33.948 000003a0 17 b8 89 71 23 6c f8 04 c5 3f 74 75 d9 81 bc 1a ...q#l...?tu.... 00:20:33.948 000003b0 72 19 68 55 1c 22 52 11 95 4b 05 a7 ef 86 18 c1 r.hU."R..K...... 00:20:33.948 000003c0 f0 bf f2 ef d4 c5 15 95 de f4 12 25 a4 87 52 ad ...........%..R. 00:20:33.948 000003d0 c7 04 83 7c 62 fe 98 ff 7c 15 fd 09 bc 22 91 d5 ...|b...|....".. 00:20:33.948 000003e0 8d 79 ac e9 b7 f7 72 6f d3 25 dd 76 70 80 7f 83 .y....ro.%.vp... 
00:20:33.948 000003f0 72 89 1e 3e 39 89 03 13 dc 15 84 d5 bd cc 7b 0d r..>9.........{. 00:20:33.948 [2024-09-27 15:25:24.301480] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=2, dhgroup=5, seq=3428451793, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.948 [2024-09-27 15:25:24.359967] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.948 [2024-09-27 15:25:24.359999] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.948 [2024-09-27 15:25:24.360016] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.948 [2024-09-27 15:25:24.360022] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.948 [2024-09-27 15:25:24.465839] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.948 [2024-09-27 15:25:24.465857] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 2 (sha384) 00:20:33.948 [2024-09-27 15:25:24.465869] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.948 [2024-09-27 15:25:24.465880] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.948 [2024-09-27 15:25:24.465934] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.948 ctrlr pubkey: 00:20:33.948 00000000 7c 53 a6 84 3a 8b d2 4d e3 9d 44 28 71 04 08 55 |S..:..M..D(q..U 00:20:33.948 00000010 7d 6d 1b c0 9a 40 57 32 c5 7f cc f5 48 fa e2 99 }m...@W2....H... 00:20:33.948 00000020 d2 48 38 f8 69 55 e0 99 cf ec 5c 53 93 52 29 3a .H8.iU....\S.R): 00:20:33.948 00000030 78 40 67 e1 0a 6e b7 ee 03 57 e7 52 b3 ca ea 35 x@g..n...W.R...5 00:20:33.948 00000040 48 e3 3f 11 5b a5 e4 3a ce 3b e1 93 f5 b5 75 ba H.?.[..:.;....u. 00:20:33.948 00000050 dc 92 6d fa 09 15 8f 3f ea 7f 13 ab 4b a0 07 4b ..m....?....K..K 00:20:33.948 00000060 4f 41 e4 3a 3d 4c 74 e8 cc b7 be 15 31 9a 1b b7 OA.:=Lt.....1... 00:20:33.948 00000070 e6 fd 81 25 0c e6 49 41 df f8 f0 4c 0f 95 44 7b ...%..IA...L..D{ 00:20:33.948 00000080 9f 44 b1 e3 65 e2 66 01 4c 6a e9 67 c9 f3 f0 1d .D..e.f.Lj.g.... 00:20:33.948 00000090 b7 5e a6 a3 4b ba b8 0d d9 87 b2 23 f0 e3 1a 16 .^..K......#.... 00:20:33.948 000000a0 33 71 ee c4 46 35 bc 4c 41 40 69 e7 91 65 45 81 3q..F5.LA@i..eE. 00:20:33.948 000000b0 85 b0 dc 1c 57 34 30 27 75 ce 37 4c ab 7d 39 a1 ....W40'u.7L.}9. 00:20:33.948 000000c0 12 98 52 78 c8 a9 d1 ce 7e 96 12 32 f2 93 79 96 ..Rx....~..2..y. 00:20:33.948 000000d0 04 9b 17 49 b7 0a 22 e8 66 a9 89 6f 56 07 12 82 ...I..".f..oV... 00:20:33.948 000000e0 f1 07 f6 ae 01 42 5d b8 1b 90 36 d2 51 86 30 94 .....B]...6.Q.0. 00:20:33.948 000000f0 63 eb 9a a7 27 9c c0 af df 28 5f 3e 81 7b 99 79 c...'....(_>.{.y 00:20:33.948 00000100 5f 47 b8 d7 7b 2e 58 7d 8c 70 4b 66 37 b2 79 20 _G..{.X}.pKf7.y 00:20:33.948 00000110 9a b5 d3 53 3a b0 b1 b8 87 ea ae 4b cf 11 88 d3 ...S:......K.... 
00:20:33.948 00000120 a1 60 ba 6a d3 c0 bc 88 9e 1e 1a f7 84 6f 4a 0e .`.j.........oJ. 00:20:33.948 00000130 c1 de d3 23 1c f5 87 eb 95 cd a0 c2 9f db 8a 50 ...#...........P 00:20:33.948 00000140 72 27 bc 65 61 40 80 bb 78 1f 86 82 bc 36 ed a1 r'.ea@..x....6.. 00:20:33.948 00000150 0b 10 1e 7d d4 f3 73 13 e3 dd 42 85 fc 25 5b cf ...}..s...B..%[. 00:20:33.948 00000160 a2 6a 94 8b ce 2b 21 6d ea 48 d5 20 c6 ac 4e 9b .j...+!m.H. ..N. 00:20:33.948 00000170 a4 5c 72 e1 70 8c c4 0d 89 d3 aa 72 47 cc 41 2c .\r.p......rG.A, 00:20:33.948 00000180 03 ee d7 87 73 35 f8 86 52 72 b7 53 c4 aa f4 64 ....s5..Rr.S...d 00:20:33.948 00000190 25 43 22 79 f8 7d 8a cc 95 b8 b3 6b d2 5c ed 35 %C"y.}.....k.\.5 00:20:33.948 000001a0 a3 33 0c d5 6b c5 a5 2f da d6 f5 c9 7a d9 e3 50 .3..k../....z..P 00:20:33.948 000001b0 66 7c 2e 4c a0 70 a7 0f 60 e2 b3 bf 0b 20 20 fd f|.L.p..`.... . 00:20:33.948 000001c0 20 6d 0f e4 e1 fe 34 2b 68 14 13 8e 50 53 76 32 m....4+h...PSv2 00:20:33.948 000001d0 28 4e 75 f1 c9 7f b7 21 1d a6 d8 48 27 e0 0b 40 (Nu....!...H'..@ 00:20:33.948 000001e0 b4 57 7d 97 c8 6e b8 4e 73 8f 92 64 95 ec 1e 45 .W}..n.Ns..d...E 00:20:33.948 000001f0 ef 68 53 7c 51 ea d6 c2 60 e7 c4 b4 36 18 23 d0 .hS|Q...`...6.#. 00:20:33.948 00000200 2c 2d f4 b1 61 cf b9 f1 dd 18 8f 0e 00 6b 9f 71 ,-..a........k.q 00:20:33.948 00000210 e7 a5 82 76 dc 30 7e 6c d4 62 44 5a e6 61 98 72 ...v.0~l.bDZ.a.r 00:20:33.948 00000220 4b f7 fd f8 99 0f cd f9 c8 64 00 ac 1f 4f 15 bd K........d...O.. 00:20:33.948 00000230 0a 3f d4 d9 e0 cf 50 55 d3 ad 66 59 87 ec f0 c8 .?....PU..fY.... 00:20:33.948 00000240 40 34 e3 2d 08 bb b8 34 50 3c 78 c5 ce bf d0 26 @4.-...4P.... 00:20:33.949 000002c0 a9 73 73 05 e7 c6 8c 71 d3 e6 be 02 90 d8 75 12 .ss....q......u. 00:20:33.949 000002d0 47 43 68 5f a3 82 f7 11 31 09 37 dc a6 2a 6a 2e GCh_....1.7..*j. 00:20:33.949 000002e0 4e 15 c9 59 38 e5 04 9f 10 7e 86 2d ac dd 8f 72 N..Y8....~.-...r 00:20:33.949 000002f0 b8 c2 d2 f1 f5 d9 17 7c 45 40 28 4b fa a8 2c 91 .......|E@(K..,. 00:20:33.949 00000300 30 73 78 a6 85 09 14 d5 22 8f 34 bb f4 62 c2 e5 0sx.....".4..b.. 00:20:33.949 00000310 f9 a2 9d 8d 47 81 bd 1b 76 83 46 b1 fb c7 8c b2 ....G...v.F..... 00:20:33.949 00000320 3d 92 ad 5a ff 1b e1 83 12 26 c0 52 d1 36 36 fe =..Z.....&.R.66. 00:20:33.949 00000330 a7 3a 28 49 41 6c ee 87 8e 44 13 5e 4a 27 08 2c .:(IAl...D.^J'., 00:20:33.949 00000340 4a 38 6d 44 91 6a 8b 11 26 c2 3d 5b 54 bc 67 8d J8mD.j..&.=[T.g. 00:20:33.949 00000350 11 de 35 83 c7 1e 91 ff 92 9d 4e 10 a6 b8 37 98 ..5.......N...7. 00:20:33.949 00000360 a1 2b e3 39 46 ae 6d 46 e0 78 5b 81 16 50 6a 7a .+.9F.mF.x[..Pjz 00:20:33.949 00000370 d4 f5 46 a2 dc 61 bf f6 08 9b 3c 8d 94 2c 5e e3 ..F..a....<..,^. 00:20:33.949 00000380 05 49 e7 0b 02 1c 83 02 be e5 8a 8b 31 00 71 00 .I..........1.q. 00:20:33.949 00000390 06 ad 23 09 92 4b e3 eb ea fa b5 dc 96 f9 d9 30 ..#..K.........0 00:20:33.949 000003a0 27 1b 3d 19 fa 14 9f 85 7c 90 2a 96 89 4b a8 97 '.=.....|.*..K.. 00:20:33.949 000003b0 4c 74 aa 67 79 8f 0b 94 67 32 42 9e 3a 75 09 b5 Lt.gy...g2B.:u.. 00:20:33.949 000003c0 47 04 dc 94 68 c9 cd 4d 02 5d ac 1f 1f aa f8 31 G...h..M.].....1 00:20:33.949 000003d0 75 ca aa f6 d1 53 96 a3 57 4e 6d eb 77 11 58 3b u....S..WNm.w.X; 00:20:33.949 000003e0 cf b5 7a c6 c0 a5 79 e3 8f d6 45 90 d3 34 bc d0 ..z...y...E..4.. 00:20:33.949 000003f0 e1 47 f8 83 d1 eb 01 09 46 ef 94 34 bc 13 fb 5c .G......F..4...\ 00:20:33.949 host pubkey: 00:20:33.949 00000000 69 61 97 1a cc fc 76 b3 2b 6d 3e 38 1b 16 01 ea ia....v.+m>8.... 
00:20:33.949 00000010 de d0 9e 37 46 ae 02 26 10 9c a8 b9 c0 5a 41 38 ...7F..&.....ZA8 00:20:33.949 00000020 e1 46 d2 98 14 cc 74 d0 10 d6 c5 8a 0e bb 16 e6 .F....t......... 00:20:33.949 00000030 18 fd a8 ce d4 28 60 a2 47 e6 c2 03 9b a8 0f af .....(`.G....... 00:20:33.949 00000040 50 15 87 af b7 86 6a 96 f6 0c 3a 86 80 82 fa 4b P.....j...:....K 00:20:33.949 00000050 1c 42 69 41 1a 1d 88 75 4a 00 6b a5 23 d9 47 fb .BiA...uJ.k.#.G. 00:20:33.949 00000060 70 81 6f 10 69 e7 17 9d ba 8c 65 4c 24 95 0d d1 p.o.i.....eL$... 00:20:33.949 00000070 2c 6c f6 09 ca ea b6 b4 cb df 2f 82 a9 73 bf 05 ,l......../..s.. 00:20:33.949 00000080 3e 2f b6 52 f5 dd d7 61 81 fb 9a bf 17 98 c8 d2 >/.R...a........ 00:20:33.949 00000090 7f d0 f2 d6 69 3b c5 5c 95 4b 36 5f 68 dd 86 41 ....i;.\.K6_h..A 00:20:33.949 000000a0 a5 20 69 8d 6f d5 26 45 2a a6 b6 45 f3 0c 1b f0 . i.o.&E*..E.... 00:20:33.949 000000b0 d0 38 39 d4 b0 9e 4f 21 23 00 a7 95 0a ed b5 7d .89...O!#......} 00:20:33.949 000000c0 03 68 74 03 f1 8d 7d 04 e4 f1 e8 d2 3b 40 39 c4 .ht...}.....;@9. 00:20:33.949 000000d0 5c d2 68 af be 21 ea 43 fe 09 50 b6 3d 3f 37 b6 \.h..!.C..P.=?7. 00:20:33.949 000000e0 c8 5b 80 a6 ed 31 8f 82 1a 16 ad ff d8 05 47 63 .[...1........Gc 00:20:33.949 000000f0 ce 53 7a 0b d1 85 88 cc fc f7 c5 b2 82 17 d8 a3 .Sz............. 00:20:33.949 00000100 53 64 1c 19 a1 c0 75 43 7f eb 06 a6 f0 bd 24 a6 Sd....uC......$. 00:20:33.949 00000110 62 23 4c be 7a f1 49 1c 4f 44 91 cd 9b 60 bc 61 b#L.z.I.OD...`.a 00:20:33.949 00000120 af b1 49 6b eb e0 74 11 b5 fc 8e c1 60 ee 30 24 ..Ik..t.....`.0$ 00:20:33.949 00000130 9d 93 3e 1e 56 0a 91 52 f4 cc bf f7 f2 64 4f d6 ..>.V..R.....dO. 00:20:33.949 00000140 36 f6 75 28 c7 46 17 86 1b a3 55 ba d5 58 c1 1e 6.u(.F....U..X.. 00:20:33.949 00000150 53 48 60 3d 0b 72 ef ad 7a 0f d9 e5 c3 01 9d d4 SH`=.r..z....... 00:20:33.949 00000160 cf a4 7a d4 8b a9 54 f5 07 ad e0 ba 68 88 96 34 ..z...T.....h..4 00:20:33.949 00000170 79 3d 75 bc ac 1b 4b d9 1c af 7f 1a dd d1 53 f6 y=u...K.......S. 00:20:33.949 00000180 4f 92 aa de fc aa d1 46 cc dd 7e b2 60 fb 73 40 O......F..~.`.s@ 00:20:33.949 00000190 8d 98 50 7e b3 23 a4 62 ef bb 5e c5 5e 03 b9 c5 ..P~.#.b..^.^... 00:20:33.949 000001a0 80 e7 44 97 7e a5 c7 da 6d c3 b5 9f cc a3 a2 38 ..D.~...m......8 00:20:33.949 000001b0 4e 36 c6 79 ad ec ef ac 10 7f 12 e7 cc 4b 44 9d N6.y.........KD. 00:20:33.949 000001c0 16 35 68 d2 25 31 81 47 95 d2 2c c5 d8 e3 b1 69 .5h.%1.G..,....i 00:20:33.949 000001d0 44 58 11 15 05 7e 2f 61 91 40 1c 26 c2 2b ad ad DX...~/a.@.&.+.. 00:20:33.949 000001e0 dc 12 4e 5f ee 07 7d a4 62 bb df 28 10 72 72 dc ..N_..}.b..(.rr. 00:20:33.949 000001f0 0e ec 43 33 c4 1c 26 cf ec 3a 0b b5 d1 a5 bd e1 ..C3..&..:...... 00:20:33.949 00000200 37 a8 ee c5 fb 60 72 f7 df cc c6 90 cd 37 bf 9b 7....`r......7.. 00:20:33.949 00000210 ba 3b d9 4c 43 40 20 f9 1e 5d 6a fc 22 bc 2d 37 .;.LC@ ..]j.".-7 00:20:33.949 00000220 a9 9f 6c bd 4c d8 e9 03 9e e2 58 84 78 49 74 9f ..l.L.....X.xIt. 00:20:33.949 00000230 3e 24 f7 e0 da 37 9a e2 29 12 6b b2 55 fc cd bb >$...7..).k.U... 00:20:33.949 00000240 f6 93 30 9f 0c 0b a3 b7 cc 82 36 44 59 7f da 26 ..0.......6DY..& 00:20:33.949 00000250 c0 bf 5b 6a 06 d9 fa b8 4e d6 53 9a c8 17 b1 95 ..[j....N.S..... 00:20:33.949 00000260 b6 8a 9a 67 ca 49 09 d9 48 4d ae 2e 91 60 8c c9 ...g.I..HM...`.. 00:20:33.949 00000270 25 f8 52 47 30 a1 5a ca 59 ac dc 9b 01 a0 b6 35 %.RG0.Z.Y......5 00:20:33.949 00000280 25 22 2d 6c 52 e3 dd ba b6 d1 b4 a2 3f de 57 cc %"-lR.......?.W. 
00:20:33.949 00000290 0d c1 44 17 26 19 cc 63 62 20 e6 23 c9 98 87 65 ..D.&..cb .#...e 00:20:33.949 000002a0 37 45 3b c4 d3 21 e5 ad a1 43 db 2e e8 d4 4d e7 7E;..!...C....M. 00:20:33.949 000002b0 e8 ea 92 91 62 b6 bd e1 42 ef f9 7f 5e a9 18 cb ....b...B...^... 00:20:33.949 000002c0 06 43 91 31 e0 07 1d 42 23 8e a0 d8 d5 4d 3e 18 .C.1...B#....M>. 00:20:33.949 000002d0 8d a0 01 61 9f 64 ff 6f 2c b1 c6 0c 39 5a b0 69 ...a.d.o,...9Z.i 00:20:33.949 000002e0 fc 0c 6b 5c 3d 47 89 9a 4f f9 9f 12 a1 7e ad 96 ..k\=G..O....~.. 00:20:33.949 000002f0 84 a0 85 00 8c bc 72 67 ed d0 57 fb 97 26 23 7d ......rg..W..&#} 00:20:33.949 00000300 a5 67 6e c2 51 29 88 ad ed b6 41 cf 20 43 98 e0 .gn.Q)....A. C.. 00:20:33.949 00000310 e1 41 e0 d9 fe 4b 30 fc 2f 53 e6 4c 98 cc f7 53 .A...K0./S.L...S 00:20:33.949 00000320 6e 67 13 01 fc 20 67 c0 c7 5f 31 90 25 3d 39 72 ng... g.._1.%=9r 00:20:33.949 00000330 c1 a2 b2 14 2d f9 eb 4e 33 8e 2a 18 92 b2 de e8 ....-..N3.*..... 00:20:33.949 00000340 f8 32 cb 16 0e fd a9 33 98 72 68 13 1a 51 d7 b9 .2.....3.rh..Q.. 00:20:33.949 00000350 8f f1 86 67 c6 67 72 cb f5 6a 08 10 b4 ec 69 65 ...g.gr..j....ie 00:20:33.949 00000360 32 08 1d 4e f2 23 5d c2 e4 f8 3b 87 40 6b 23 cf 2..N.#]...;.@k#. 00:20:33.949 00000370 b4 d3 92 82 f7 80 4f 29 20 5a 78 d4 27 b4 44 58 ......O) Zx.'.DX 00:20:33.949 00000380 e3 ee 8e a1 87 d3 89 2d 97 d5 67 16 9a d7 1f 9d .......-..g..... 00:20:33.949 00000390 63 75 30 2a 81 5c cc 41 59 63 42 75 44 ce cb d1 cu0*.\.AYcBuD... 00:20:33.949 000003a0 eb 13 c2 68 fe 09 99 2a 13 d8 70 74 db 20 bb 00 ...h...*..pt. .. 00:20:33.949 000003b0 87 c8 4e a3 15 69 91 15 16 82 d2 38 08 5f 74 c8 ..N..i.....8._t. 00:20:33.949 000003c0 1f 93 df 40 0d 1b 21 bd b3 c6 8c f2 44 fe 41 d1 ...@..!.....D.A. 00:20:33.949 000003d0 88 35 b1 a4 b9 14 6a 7a c9 99 4d a2 67 64 58 af .5....jz..M.gdX. 00:20:33.949 000003e0 3a bd 11 50 63 9b f0 f5 db 1c 96 3c 70 16 fa 78 :..Pc.........&.* 00:20:33.949 00000100 c4 8c 66 07 43 18 5b b4 fa 18 d9 5c 8b 49 e8 df ..f.C.[....\.I.. 00:20:33.949 00000110 60 85 bb 36 20 18 fc 37 01 0e 8e 5d a5 6e a7 c1 `..6 ..7...].n.. 00:20:33.949 00000120 90 61 68 03 d4 8e 57 6e d0 ae 16 c0 d9 81 13 6f .ah...Wn.......o 00:20:33.949 00000130 b8 1c db 02 86 dd 5b df 80 64 dd ae c4 6e e6 b2 ......[..d...n.. 00:20:33.949 00000140 83 f6 bc 52 be bc d0 5b 99 53 11 0b e1 c8 ff 68 ...R...[.S.....h 00:20:33.949 00000150 82 eb 18 01 55 84 6d 54 d9 cf 15 40 33 5b bf db ....U.mT...@3[.. 00:20:33.949 00000160 ea f8 0c 11 85 9c 41 72 bd 02 6d 35 fa ed a9 63 ......Ar..m5...c 00:20:33.949 00000170 c8 0a 72 19 fd 4b 87 b3 09 f7 09 1b 5d 8d 31 d1 ..r..K......].1. 00:20:33.949 00000180 92 f7 a7 53 f0 5a 65 f3 f0 90 14 d1 cd 07 90 9a ...S.Ze......... 00:20:33.949 00000190 f9 1f c8 1f d2 47 f4 4f eb 7f 2e c1 60 d1 2e 77 .....G.O....`..w 00:20:33.949 000001a0 fa 66 d7 bc e8 68 2d 2e 1c 18 37 ee 14 27 cb b3 .f...h-...7..'.. 00:20:33.949 000001b0 9b 40 81 b7 2a 39 6f a9 f0 6c 6f ff bf 23 16 3c .@..*9o..lo..#.< 00:20:33.949 000001c0 40 ec 3d d0 7a 0a 55 f8 4b 4c 4e 25 d4 f4 0c 60 @.=.z.U.KLN%...` 00:20:33.949 000001d0 76 5b 47 cb 2a e1 56 62 4b d5 e8 e7 e6 6d 3c e1 v[G.*.VbK....m<. 00:20:33.949 000001e0 9e 47 6c 5e f5 ee 8f 8f b4 ac 10 c1 9c c8 0e 6b .Gl^...........k 00:20:33.949 000001f0 3b 90 ea cc 85 2b d2 8a 4d e5 1c 3a e9 0f cd 93 ;....+..M..:.... 00:20:33.949 00000200 4a 10 b9 94 e8 2e 6f 1a 11 33 9c 25 27 27 4d 97 J.....o..3.%''M. 00:20:33.949 00000210 63 60 ec 71 c5 7f 00 6d c7 d1 38 fc 26 65 98 bf c`.q...m..8.&e.. 
00:20:33.949 00000220 09 12 98 4c 22 17 13 5b 6a 66 24 35 b2 a9 03 6f ...L"..[jf$5...o 00:20:33.949 00000230 9c b7 98 9f 2a d4 6f d9 91 33 46 64 a2 8f 78 b3 ....*.o..3Fd..x. 00:20:33.949 00000240 ec 44 a3 66 b2 4c d2 a5 76 e6 24 dc 46 05 62 c2 .D.f.L..v.$.F.b. 00:20:33.949 00000250 9a 68 94 97 6f 3e 03 9f 03 b2 36 dc b6 5e 23 5c .h..o>....6..^#\ 00:20:33.949 00000260 22 6c 23 4b 26 ca 37 66 8c 6e d6 a0 cd fb 77 fa "l#K&.7f.n....w. 00:20:33.949 00000270 6a 70 b8 8f 00 f7 d2 df 00 20 33 25 ef af 53 36 jp....... 3%..S6 00:20:33.949 00000280 01 4b d1 e0 ac 7a bc 27 0e 0d 90 04 ee c4 fb 4d .K...z.'.......M 00:20:33.949 00000290 b7 aa 90 eb ae d3 ac 06 8b 96 32 b3 0d 8d 83 ac ..........2..... 00:20:33.950 000002a0 3f ab 9a 5b 07 0e 11 9a a7 e6 50 eb 74 77 36 bf ?..[......P.tw6. 00:20:33.950 000002b0 6e d1 ef 6c 05 90 4b 2c 81 2b 2c 66 a1 3b 2f 69 n..l..K,.+,f.;/i 00:20:33.950 000002c0 2a 3c 35 91 1f 18 97 d1 03 93 3d b4 bb 76 f9 c1 *<5.......=..v.. 00:20:33.950 000002d0 61 33 b8 1c 19 90 6b b0 db cf 9f e3 0e f9 2a 6b a3....k.......*k 00:20:33.950 000002e0 db 19 d3 58 bc cf 1a 33 52 06 20 4c 22 13 6c 34 ...X...3R. L".l4 00:20:33.950 000002f0 91 75 76 fa 3a b6 80 9a f7 4a 5e 01 78 1e c6 33 .uv.:....J^.x..3 00:20:33.950 00000300 96 f1 d4 53 d9 20 f2 95 e8 54 14 22 78 b1 59 9f ...S. ...T."x.Y. 00:20:33.950 00000310 15 b0 72 f2 10 97 88 95 fc 97 63 80 7b 70 db e8 ..r.......c.{p.. 00:20:33.950 00000320 52 80 eb c1 6e 39 10 9d eb 13 c7 a8 40 5c 8c 03 R...n9......@\.. 00:20:33.950 00000330 1d 64 57 81 68 ae ea cb 46 ce 83 92 1a ee aa 3c .dW.h...F......< 00:20:33.950 00000340 35 d2 de 37 97 a4 2e d7 71 46 6c f3 ae b8 f4 e2 5..7....qFl..... 00:20:33.950 00000350 fb 52 89 12 ec ff 14 13 12 0c 91 3b 27 bd 2e 16 .R.........;'... 00:20:33.950 00000360 be 9f 1c 56 18 01 42 4a 97 ca 07 9b 3a 9e 5b a4 ...V..BJ....:.[. 00:20:33.950 00000370 92 c7 10 8d ae 9e b2 5c 98 96 b7 8b 56 f3 34 2a .......\....V.4* 00:20:33.950 00000380 da 45 14 8f c4 a7 46 b1 77 89 0f 36 cd c6 f5 4a .E....F.w..6...J 00:20:33.950 00000390 45 39 19 b0 4a 6f e0 bb 8d da b2 76 33 e2 c7 93 E9..Jo.....v3... 00:20:33.950 000003a0 c7 9b 34 e4 c8 c6 7d 58 88 81 6e 71 f1 f2 98 36 ..4...}X..nq...6 00:20:33.950 000003b0 b7 33 7f a9 3c d8 74 2c e8 c1 35 5c 28 97 fc 33 .3..<.t,..5\(..3 00:20:33.950 000003c0 ff 2a 3a e9 52 09 be 42 ff 35 7d ee 49 60 7f cf .*:.R..B.5}.I`.. 00:20:33.950 000003d0 cf 21 63 28 37 e9 d6 fd 11 ab 43 67 e1 38 0f 1b .!c(7.....Cg.8.. 00:20:33.950 000003e0 11 0f f5 2c 49 02 fa 88 aa 20 cb 41 e0 b2 b2 5a ...,I.... 
.A...Z 00:20:33.950 000003f0 8d 7f f6 00 36 2e e9 39 05 00 c2 3f 15 93 ea 51 ....6..9...?...Q 00:20:33.950 [2024-09-27 15:25:24.587614] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=2, dhgroup=5, seq=3428451794, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=48 00:20:33.950 [2024-09-27 15:25:24.587693] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.950 [2024-09-27 15:25:24.669122] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.950 [2024-09-27 15:25:24.669153] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.950 [2024-09-27 15:25:24.669160] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.950 [2024-09-27 15:25:24.834441] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.950 [2024-09-27 15:25:24.834468] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.950 [2024-09-27 15:25:24.834475] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.950 [2024-09-27 15:25:24.834519] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.950 [2024-09-27 15:25:24.834541] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.950 ctrlr pubkey: 00:20:33.950 00000000 da 57 a3 d1 78 cb 1d b3 46 09 70 a8 bb 8a b4 98 .W..x...F.p..... 00:20:33.950 00000010 19 bc a2 9a a2 20 63 89 7f 8f 03 b6 5e 50 1f 3a ..... c.....^P.: 00:20:33.950 00000020 3c 69 38 4e e0 8c 19 2e 1e 83 c9 d9 a7 f9 c8 ff .R..k.... 00:20:33.950 000000e0 a8 74 7d f3 a9 70 0d f8 02 cc af fc d4 01 5f 45 .t}..p........_E 00:20:33.950 000000f0 ae 7b 17 c1 34 d7 90 cb 12 fb ca 06 88 68 4d 0b .{..4........hM. 00:20:33.950 host pubkey: 00:20:33.950 00000000 ae 19 69 7a 9d e9 b6 a4 7f 62 1c a2 8e 84 20 4d ..iz.....b.... M 00:20:33.950 00000010 da 9b 27 62 30 28 03 ff 0b ef e4 c3 b7 4e 9c 51 ..'b0(.......N.Q 00:20:33.950 00000020 d9 38 29 f3 76 e0 9b b7 c5 79 40 0c 3b e0 d5 34 .8).v....y@.;..4 00:20:33.950 00000030 d6 c8 f0 7d 53 86 7b 93 e9 3e 5a d3 df 67 ad d8 ...}S.{..>Z..g.. 00:20:33.950 00000040 53 01 47 df 3d 61 1d d4 d4 bd 56 84 55 9b ec 99 S.G.=a....V.U... 00:20:33.950 00000050 e8 60 4e 00 43 fc b6 5f e2 5b c7 75 ab 54 d3 2d .`N.C.._.[.u.T.- 00:20:33.950 00000060 a1 ad ee ee 76 6c 2b a8 18 b8 58 f3 4b a2 94 e7 ....vl+...X.K... 00:20:33.950 00000070 85 7d d0 91 09 37 34 23 2a 8f b0 72 90 54 90 76 .}...74#*..r.T.v 00:20:33.950 00000080 e1 58 c4 b4 76 90 af cb 52 3f eb d6 5e 45 2c c9 .X..v...R?..^E,. 00:20:33.950 00000090 6f a2 5b c1 2f 6f 6b 67 ce 4f eb 78 d4 55 fa af o.[./okg.O.x.U.. 00:20:33.950 000000a0 e4 26 6b 90 d0 bd 55 40 9d 4e ca 28 4a 65 27 e5 .&k...U@.N.(Je'. 00:20:33.950 000000b0 4e 90 f1 17 40 4e 7b 7f 1e 72 b8 ac f0 72 39 2e N...@N{..r...r9. 00:20:33.950 000000c0 05 91 cc da 9b 1c 91 28 50 7d f4 5d f1 0a 0e 04 .......(P}.].... 
00:20:33.950 000000d0 5c 43 e7 c1 34 49 f3 4b af d4 e3 97 b1 6d e0 64 \C..4I.K.....m.d 00:20:33.950 000000e0 91 6f 5c b4 74 f8 18 44 85 9c b5 c9 80 6e 61 73 .o\.t..D.....nas 00:20:33.950 000000f0 6a 9c 86 9c f2 45 7a 16 dd 88 6a 29 1d 75 11 a7 j....Ez...j).u.. 00:20:33.950 dh secret: 00:20:33.950 00000000 87 c9 cb 43 d6 03 cb fa de 0d a5 9a 7c de b3 cc ...C........|... 00:20:33.950 00000010 15 e1 a6 e8 7e dc 30 e4 f9 42 97 b6 30 78 a6 f3 ....~.0..B..0x.. 00:20:33.950 00000020 46 03 ae 05 56 a5 37 f6 ab f0 7d b9 1f 69 f7 bb F...V.7...}..i.. 00:20:33.950 00000030 28 d6 43 ab 50 3f 36 50 6f a5 cc f0 32 80 28 7d (.C.P?6Po...2.(} 00:20:33.950 00000040 c1 01 de 87 8a ca cf 63 67 30 db 97 54 3a 27 4b .......cg0..T:'K 00:20:33.950 00000050 a2 57 37 8c b1 8f e6 5f eb 75 95 54 11 34 80 9f .W7...._.u.T.4.. 00:20:33.950 00000060 a4 ae d8 ef ea d0 9e 7d 21 1d e2 84 12 30 71 cb .......}!....0q. 00:20:33.950 00000070 c8 c4 9b 99 33 f3 2f bb 5d 7d 49 60 d2 c7 7d 3a ....3./.]}I`..}: 00:20:33.950 00000080 39 dc 1d 69 24 5b 8f 64 f4 c5 fb 8b 13 a8 56 ec 9..i$[.d......V. 00:20:33.950 00000090 33 d6 53 e6 9c 3a 2c d2 80 9a 6c de 34 16 b8 b0 3.S..:,...l.4... 00:20:33.950 000000a0 f0 32 4c 44 7b 68 ef cf 6b e7 df 1e a4 79 7e ea .2LD{h..k....y~. 00:20:33.950 000000b0 ee 6d 29 b5 97 42 31 05 2e bb f2 ff 4a d1 15 ab .m)..B1.....J... 00:20:33.950 000000c0 4b 17 17 43 0e 97 7c a2 f4 3f 80 69 c4 d5 d1 ae K..C..|..?.i.... 00:20:33.950 000000d0 c8 e8 80 ad c9 25 ea ea 9f 60 33 e6 3f f5 40 90 .....%...`3.?.@. 00:20:33.950 000000e0 3a fd 61 9b c0 8a d6 03 7d f4 c2 24 d4 ed da 78 :.a.....}..$...x 00:20:33.950 000000f0 a5 f3 6e 5c 67 c8 ab 93 f8 2a 9d 60 22 7e 65 c6 ..n\g....*.`"~e. 00:20:33.950 [2024-09-27 15:25:24.837108] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=3, dhgroup=1, seq=3428451795, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.950 [2024-09-27 15:25:24.839736] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.950 [2024-09-27 15:25:24.839775] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.950 [2024-09-27 15:25:24.839792] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.950 [2024-09-27 15:25:24.839812] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.950 [2024-09-27 15:25:24.839826] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.950 [2024-09-27 15:25:24.946204] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.950 [2024-09-27 15:25:24.946222] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.950 [2024-09-27 15:25:24.946229] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.950 [2024-09-27 15:25:24.946239] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.950 [2024-09-27 
15:25:24.946293] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.950 ctrlr pubkey: 00:20:33.950 00000000 da 57 a3 d1 78 cb 1d b3 46 09 70 a8 bb 8a b4 98 .W..x...F.p..... 00:20:33.950 00000010 19 bc a2 9a a2 20 63 89 7f 8f 03 b6 5e 50 1f 3a ..... c.....^P.: 00:20:33.950 00000020 3c 69 38 4e e0 8c 19 2e 1e 83 c9 d9 a7 f9 c8 ff .R..k.... 00:20:33.951 000000e0 a8 74 7d f3 a9 70 0d f8 02 cc af fc d4 01 5f 45 .t}..p........_E 00:20:33.951 000000f0 ae 7b 17 c1 34 d7 90 cb 12 fb ca 06 88 68 4d 0b .{..4........hM. 00:20:33.951 host pubkey: 00:20:33.951 00000000 94 72 92 b8 e2 c5 cb a0 58 91 66 f9 ac 01 8d fe .r......X.f..... 00:20:33.951 00000010 66 fa 47 9d 94 c8 ce 94 ff 8a 18 10 47 8d bc 60 f.G.........G..` 00:20:33.951 00000020 bb d4 d9 27 a3 26 58 e6 d0 4f 12 e8 96 8e 84 d6 ...'.&X..O...... 00:20:33.951 00000030 a7 f0 66 d7 6e eb 32 d4 2d 89 0e a3 21 d9 09 0f ..f.n.2.-...!... 00:20:33.951 00000040 2c 62 c5 42 53 b5 64 10 b9 a5 4e 1b 6a 2a 04 11 ,b.BS.d...N.j*.. 00:20:33.951 00000050 b8 af b3 55 72 30 21 c5 6e 11 66 ad b5 02 9c 02 ...Ur0!.n.f..... 00:20:33.951 00000060 92 cb 83 32 a2 12 17 12 0e 23 f1 da ed d3 28 df ...2.....#....(. 00:20:33.951 00000070 2c 2d aa b9 25 1f 16 80 ee 03 a9 6e 46 5f d7 f7 ,-..%......nF_.. 00:20:33.951 00000080 a5 d7 03 4c c5 85 f9 51 0a 9e 61 89 ab 7e 2d ef ...L...Q..a..~-. 00:20:33.951 00000090 f3 ed 07 ec fa b2 72 a3 55 80 aa 66 7d 41 e6 b9 ......r.U..f}A.. 00:20:33.951 000000a0 7e 5c 86 03 7e ef f0 af 9d 6e 9e 00 e9 70 9d 1d ~\..~....n...p.. 00:20:33.951 000000b0 5a 66 a4 13 cd 0c cf 71 73 1f 8e 6b cb b0 0a 3e Zf.....qs..k...> 00:20:33.951 000000c0 b1 ab d1 d3 07 5d 22 7e aa a3 b2 2e b4 6f ce f6 .....]"~.....o.. 00:20:33.951 000000d0 c5 79 d0 99 4e c1 b9 1b 30 06 77 d5 d2 e1 e0 e6 .y..N...0.w..... 00:20:33.951 000000e0 63 ac 52 d1 72 80 08 04 ca 2f 9a e4 a4 1d 85 67 c.R.r..../.....g 00:20:33.951 000000f0 37 4b 54 87 85 8e d5 e7 02 9c d7 bd 4d a5 e8 13 7KT.........M... 00:20:33.951 dh secret: 00:20:33.951 00000000 0b 50 5c a1 c6 8a 0a dc 57 ef d5 af b4 0e e3 2c .P\.....W......, 00:20:33.951 00000010 29 5e 37 bd 1a 22 ec ee 03 78 15 58 4e bd 63 f1 )^7.."...x.XN.c. 00:20:33.951 00000020 17 61 5a 3c 55 3f 10 01 a7 d6 2e 62 fc a8 1e af .aZ 00:20:33.951 00000070 31 35 50 c7 c3 30 d0 be 23 37 3c 5f e2 79 f4 db 15P..0..#7<_.y.. 00:20:33.951 00000080 df ff 57 be a2 18 93 25 be d4 6a e6 70 66 a5 7c ..W....%..j.pf.| 00:20:33.951 00000090 0e 48 97 cb 7a e7 bf 71 2c f6 a1 b8 12 49 7f d9 .H..z..q,....I.. 00:20:33.951 000000a0 84 94 da 8c 53 1d 68 4f 89 8f 58 33 84 ab ca e5 ....S.hO..X3.... 00:20:33.951 000000b0 88 a8 bf e2 23 ed 61 e4 79 29 7f 76 58 d2 6d 38 ....#.a.y).vX.m8 00:20:33.951 000000c0 4c d3 d3 23 ea 89 37 74 0d 6d 06 9a e9 d6 2c 2d L..#..7t.m....,- 00:20:33.951 000000d0 fa 8a f1 7b c2 d9 f6 a7 54 d4 01 97 6b 24 9c c8 ...{....T...k$.. 00:20:33.951 000000e0 37 72 90 df 1f 8e cc b1 a5 85 de 20 63 a3 72 ac 7r......... c.r. 
00:20:33.951 000000f0 56 a4 0f e0 f2 5f 57 16 57 7c a7 af 87 ab c4 68 V...._W.W|.....h 00:20:33.951 [2024-09-27 15:25:24.948879] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=3, dhgroup=1, seq=3428451796, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.951 [2024-09-27 15:25:24.948977] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.951 [2024-09-27 15:25:24.958211] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.951 [2024-09-27 15:25:24.958292] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.951 [2024-09-27 15:25:24.958302] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.951 [2024-09-27 15:25:24.958338] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.951 [2024-09-27 15:25:25.117310] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.951 [2024-09-27 15:25:25.117328] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.951 [2024-09-27 15:25:25.117335] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.951 [2024-09-27 15:25:25.117391] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.951 [2024-09-27 15:25:25.117414] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.951 ctrlr pubkey: 00:20:33.951 00000000 32 d0 44 31 12 0b cd c2 77 dd a7 97 20 3f 38 5b 2.D1....w... ?8[ 00:20:33.951 00000010 d7 a2 af 09 ab c0 fe 68 4b 94 b1 e8 9e b0 89 8c .......hK....... 00:20:33.951 00000020 84 b6 01 95 75 17 1b e2 df 93 c9 16 2c 93 aa d8 ....u.......,... 00:20:33.951 00000030 18 ac b3 1c e2 94 75 f7 9b 6c 69 22 11 d5 c0 d5 ......u..li".... 00:20:33.951 00000040 8f d8 fd 86 ed 29 ea 25 4e dc 9a 76 56 b0 10 7e .....).%N..vV..~ 00:20:33.951 00000050 1e d5 0c 19 4e ef cd e2 8a 11 4c 5e f6 5d 0c d1 ....N.....L^.].. 00:20:33.951 00000060 54 c3 54 f1 48 84 82 da 94 0a 38 4a a6 70 ca be T.T.H.....8J.p.. 00:20:33.951 00000070 3e 65 9f a7 e8 c3 14 ec 88 54 bd 71 a3 75 01 73 >e.......T.q.u.s 00:20:33.951 00000080 78 eb 42 9f 35 63 83 f3 8f fb 81 ca 75 13 14 ca x.B.5c......u... 00:20:33.951 00000090 e0 f2 67 91 c2 49 bc 95 89 db df 2e 7b d3 23 8c ..g..I......{.#. 00:20:33.951 000000a0 85 b3 90 fa 2e 96 8d fb 13 25 e7 91 f0 df 85 ce .........%...... 00:20:33.951 000000b0 d5 75 1d a7 87 c0 04 2f 2d e2 47 b7 47 3f 44 d2 .u...../-.G.G?D. 00:20:33.951 000000c0 15 78 6a a8 3e 85 dd a4 f0 1f ee 96 4c a4 c4 4d .xj.>.......L..M 00:20:33.951 000000d0 49 d2 da 85 ab 4c 47 44 45 e2 d5 16 50 2b 5f 0e I....LGDE...P+_. 00:20:33.951 000000e0 46 61 6f 90 18 c1 a3 50 11 54 47 a4 81 3d 8a 82 Fao....P.TG..=.. 00:20:33.951 000000f0 d6 63 ed 5f 90 6f 77 6c 2f 13 f7 20 56 b6 9a c9 .c._.owl/.. V... 
00:20:33.951 host pubkey: 00:20:33.951 00000000 3f 61 71 fa ce 55 63 61 05 d0 08 85 f7 6f 88 94 ?aq..Uca.....o.. 00:20:33.951 00000010 5f dc 55 19 c0 15 68 5c 56 11 1a bc 44 33 b9 c9 _.U...h\V...D3.. 00:20:33.951 00000020 5e 36 40 6f ec 33 44 b7 55 ff d4 97 db e5 f3 55 ^6@o.3D.U......U 00:20:33.951 00000030 88 1d 97 26 6e 2a 16 d2 87 88 ad 2a 94 e0 30 18 ...&n*.....*..0. 00:20:33.951 00000040 46 02 d1 0a 6f e7 9e 21 65 6e 4f ff 9f ed 71 c6 F...o..!enO...q. 00:20:33.951 00000050 07 d1 57 c6 9f d5 12 14 b0 30 41 dd 47 66 10 cd ..W......0A.Gf.. 00:20:33.951 00000060 43 46 a0 2f 45 26 d2 14 7e af 0f 80 11 ed 21 0d CF./E&..~.....!. 00:20:33.951 00000070 f7 55 ae 95 58 e5 a8 e1 2d 97 d2 86 27 4f 63 54 .U..X...-...'OcT 00:20:33.951 00000080 bd 8b 40 68 bb b7 c0 b6 a5 b9 5e fc af b5 b3 a2 ..@h......^..... 00:20:33.951 00000090 90 df c4 47 cf 12 0b b7 69 bb 74 ac 84 85 67 d0 ...G....i.t...g. 00:20:33.951 000000a0 90 3d 43 3a d1 38 88 55 3b ea da 01 6f fd 5e 31 .=C:.8.U;...o.^1 00:20:33.951 000000b0 8f 8c cd aa 3a c6 1d b5 18 19 f0 d5 01 ff a9 20 ....:.......... 00:20:33.951 000000c0 9c 08 f3 1d 52 3f a4 a0 6e d0 17 ab fb 4b c7 b3 ....R?..n....K.. 00:20:33.951 000000d0 26 fa 7c 22 48 7a 3e 9f 47 07 3e a4 79 6e 9f c2 &.|"Hz>.G.>.yn.. 00:20:33.951 000000e0 74 8f 57 76 00 a0 d5 ad d7 51 a7 4a 55 07 f0 ce t.Wv.....Q.JU... 00:20:33.951 000000f0 d1 d4 79 aa e8 85 e5 36 04 4f 2d 66 c4 c3 92 0b ..y....6.O-f.... 00:20:33.951 dh secret: 00:20:33.951 00000000 98 de 05 5e 22 a3 ab 03 13 66 e5 17 66 3a 6e f0 ...^"....f..f:n. 00:20:33.951 00000010 2f e6 3c b8 13 66 1f 16 c1 ea 51 f9 15 22 5a b6 /.<..f....Q.."Z. 00:20:33.951 00000020 86 d4 23 8c c3 9a 67 4a b4 a4 22 71 40 4c fa 1d ..#...gJ.."q@L.. 00:20:33.951 00000030 05 04 64 42 5f 10 73 85 ac c1 20 88 2c e3 31 ed ..dB_.s... .,.1. 00:20:33.951 00000040 c6 f8 cd 82 ad 9b d3 a8 36 ba 06 14 18 a8 f0 ec ........6....... 00:20:33.951 00000050 d8 22 db 1b c0 60 ee 51 45 ac 71 1a f0 0b 13 f4 ."...`.QE.q..... 00:20:33.951 00000060 c8 0c e5 06 b0 61 e3 30 f4 d6 9e dd cd be c7 cf .....a.0........ 00:20:33.951 00000070 e2 45 d4 43 39 7a b5 b4 05 a1 11 66 48 f4 27 cf .E.C9z.....fH.'. 00:20:33.951 00000080 02 ac 52 1a c5 5c 2c 10 2e 2c 73 f4 c2 8a e9 2d ..R..\,..,s....- 00:20:33.951 00000090 10 ad 2b 17 56 ac 07 c9 45 74 72 a5 c6 16 2e 28 ..+.V...Etr....( 00:20:33.951 000000a0 bc ef 6d 13 b9 f7 e2 73 36 30 f4 63 92 10 4b c2 ..m....s60.c..K. 00:20:33.951 000000b0 ff 1f 0e cc f1 37 a5 31 c8 7c 87 29 39 9b e3 85 .....7.1.|.)9... 00:20:33.951 000000c0 ea 31 b0 1c 21 dc 89 e8 14 f7 ca 5d 4e 0d c0 7c .1..!......]N..| 00:20:33.951 000000d0 7b c9 3f 93 bc 2b e7 9c 25 37 b3 7e 4c 69 41 82 {.?..+..%7.~LiA. 
00:20:33.952 000000e0 61 22 54 18 b7 5b a1 fe a9 dd 6b 3e 9a 11 4c 7b a"T..[....k>..L{ 00:20:33.952 000000f0 50 4e 7e 54 3a 2c 3d cb 76 b7 ab 3b 97 78 81 38 PN~T:,=.v..;.x.8 00:20:33.952 [2024-09-27 15:25:25.119896] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=3, dhgroup=1, seq=3428451797, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.952 [2024-09-27 15:25:25.122517] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.952 [2024-09-27 15:25:25.122557] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.952 [2024-09-27 15:25:25.122573] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.952 [2024-09-27 15:25:25.122592] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.952 [2024-09-27 15:25:25.122611] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.952 [2024-09-27 15:25:25.228056] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.952 [2024-09-27 15:25:25.228075] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.952 [2024-09-27 15:25:25.228082] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.952 [2024-09-27 15:25:25.228092] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.952 [2024-09-27 15:25:25.228146] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.952 ctrlr pubkey: 00:20:33.952 00000000 32 d0 44 31 12 0b cd c2 77 dd a7 97 20 3f 38 5b 2.D1....w... ?8[ 00:20:33.952 00000010 d7 a2 af 09 ab c0 fe 68 4b 94 b1 e8 9e b0 89 8c .......hK....... 00:20:33.952 00000020 84 b6 01 95 75 17 1b e2 df 93 c9 16 2c 93 aa d8 ....u.......,... 00:20:33.952 00000030 18 ac b3 1c e2 94 75 f7 9b 6c 69 22 11 d5 c0 d5 ......u..li".... 00:20:33.952 00000040 8f d8 fd 86 ed 29 ea 25 4e dc 9a 76 56 b0 10 7e .....).%N..vV..~ 00:20:33.952 00000050 1e d5 0c 19 4e ef cd e2 8a 11 4c 5e f6 5d 0c d1 ....N.....L^.].. 00:20:33.952 00000060 54 c3 54 f1 48 84 82 da 94 0a 38 4a a6 70 ca be T.T.H.....8J.p.. 00:20:33.952 00000070 3e 65 9f a7 e8 c3 14 ec 88 54 bd 71 a3 75 01 73 >e.......T.q.u.s 00:20:33.952 00000080 78 eb 42 9f 35 63 83 f3 8f fb 81 ca 75 13 14 ca x.B.5c......u... 00:20:33.952 00000090 e0 f2 67 91 c2 49 bc 95 89 db df 2e 7b d3 23 8c ..g..I......{.#. 00:20:33.952 000000a0 85 b3 90 fa 2e 96 8d fb 13 25 e7 91 f0 df 85 ce .........%...... 00:20:33.952 000000b0 d5 75 1d a7 87 c0 04 2f 2d e2 47 b7 47 3f 44 d2 .u...../-.G.G?D. 00:20:33.952 000000c0 15 78 6a a8 3e 85 dd a4 f0 1f ee 96 4c a4 c4 4d .xj.>.......L..M 00:20:33.952 000000d0 49 d2 da 85 ab 4c 47 44 45 e2 d5 16 50 2b 5f 0e I....LGDE...P+_. 00:20:33.952 000000e0 46 61 6f 90 18 c1 a3 50 11 54 47 a4 81 3d 8a 82 Fao....P.TG..=.. 
00:20:33.952 000000f0 d6 63 ed 5f 90 6f 77 6c 2f 13 f7 20 56 b6 9a c9 .c._.owl/.. V... 00:20:33.952 host pubkey: 00:20:33.952 00000000 b0 43 a7 49 4b 78 95 61 10 24 83 2c 59 32 45 33 .C.IKx.a.$.,Y2E3 00:20:33.952 00000010 d6 c8 eb 32 af de ab 29 64 fc b1 86 81 e7 1f c1 ...2...)d....... 00:20:33.952 00000020 95 fb de 24 4e aa 38 85 e6 94 29 99 f1 43 1a 29 ...$N.8...)..C.) 00:20:33.952 00000030 65 d3 1e 41 d6 8c e1 a1 e5 df 6a bf 72 fe 97 b2 e..A......j.r... 00:20:33.952 00000040 11 f9 2a c8 53 8a 3a 67 52 8f db 66 56 c5 60 ff ..*.S.:gR..fV.`. 00:20:33.952 00000050 56 d2 08 d9 c8 a4 bd 9a 83 db c2 e9 36 d1 b1 6b V...........6..k 00:20:33.952 00000060 10 66 0c d3 0a 9c 07 3b 1c 42 7d d6 d9 ab a9 f1 .f.....;.B}..... 00:20:33.952 00000070 bf f7 17 d9 f5 51 4c 7f 21 ab 47 12 47 7c 77 21 .....QL.!.G.G|w! 00:20:33.952 00000080 5b f6 8c 9b 35 1f aa 20 9e c9 f1 65 b1 0c 71 dd [...5.. ...e..q. 00:20:33.952 00000090 db cf 04 3c 75 eb 17 62 e3 bc 1b 74 3c e1 40 89 ....P.... 00:20:33.953 00000080 ed 4a ae a6 fc 50 7c 0f 04 11 5e ab 93 10 af 37 .J...P|...^....7 00:20:33.953 00000090 86 8d f6 f1 c8 94 99 1d a6 ad bd 74 cc 31 35 a6 ...........t.15. 00:20:33.953 000000a0 c3 ce e9 74 f3 8e 7e 6b e5 29 7a c4 a6 f6 4d 39 ...t..~k.)z...M9 00:20:33.953 000000b0 94 01 b8 d7 7f 20 55 ab 4b 72 e5 dd db e8 e7 52 ..... U.Kr.....R 00:20:33.953 000000c0 bb 29 4e 83 b1 4b 9c 5f 4f 6d 44 ef 36 02 a4 59 .)N..K._OmD.6..Y 00:20:33.953 000000d0 00 2c 70 f9 f7 e2 60 eb 07 72 50 d1 12 31 be fe .,p...`..rP..1.. 00:20:33.953 000000e0 59 4f 9d c7 d0 f4 2b b9 ad 3b 6b 55 a5 71 dd 03 YO....+..;kU.q.. 00:20:33.953 000000f0 c9 07 dd 36 dc 2c 79 43 06 86 ad a0 01 84 e6 ca ...6.,yC........ 00:20:33.953 host pubkey: 00:20:33.953 00000000 8d cc 16 06 7f 62 d9 dc da 16 5a 83 26 6b c7 e5 .....b....Z.&k.. 00:20:33.953 00000010 90 3a 07 d1 36 bc 21 ca a9 31 45 68 11 51 84 3b .:..6.!..1Eh.Q.; 00:20:33.953 00000020 6b 6f 3b 8d 7d 27 b3 51 ec 90 58 6f a4 77 47 2c ko;.}'.Q..Xo.wG, 00:20:33.953 00000030 62 3f c5 c8 89 56 10 6f 05 e8 e3 23 46 81 96 ce b?...V.o...#F... 00:20:33.953 00000040 e0 ca d9 f5 b1 9d a8 aa 28 13 0a d0 75 be d6 17 ........(...u... 00:20:33.953 00000050 03 b8 a8 ca 2c 48 28 4c 1a 5f 82 42 e5 f9 f6 52 ....,H(L._.B...R 00:20:33.953 00000060 8e 19 01 1b a6 a8 88 0a 71 20 0b e0 40 0d a9 a2 ........q ..@... 00:20:33.953 00000070 29 d0 4d ff 5e b5 cd 5a 75 cb 8e f1 74 cd bc e9 ).M.^..Zu...t... 00:20:33.953 00000080 54 dc fc 38 fb 23 b9 d1 12 e0 9b d1 08 0c 6a 31 T..8.#........j1 00:20:33.953 00000090 bb 56 5c 0e 5d 20 11 8e 9f 5d 6a 59 23 59 98 a8 .V\.] ...]jY#Y.. 00:20:33.953 000000a0 65 08 f4 04 db ce 92 92 78 6c 1a 66 83 67 5e 61 e.......xl.f.g^a 00:20:33.953 000000b0 90 8f 9b b5 4a e3 a6 c0 87 a4 c5 ee 5b 65 6d 28 ....J.......[em( 00:20:33.953 000000c0 71 e0 eb a4 25 54 ac 83 b9 b6 1e 66 73 81 a1 ab q...%T.....fs... 00:20:33.953 000000d0 10 55 26 8d 32 93 3c 09 37 f7 d8 ba 37 67 ed dc .U&.2.<.7...7g.. 00:20:33.953 000000e0 55 16 8a 51 f7 d3 48 fe c9 17 bd 3c 9c 74 28 36 U..Q..H....<.t(6 00:20:33.953 000000f0 51 ef ca 0d e1 75 78 26 aa 52 ce 4c f9 05 07 48 Q....ux&.R.L...H 00:20:33.953 dh secret: 00:20:33.953 00000000 41 5e bf 1b c6 37 94 1e 6d bf be 42 6a 89 4a 2d A^...7..m..Bj.J- 00:20:33.953 00000010 f1 0a 60 04 a5 e9 62 ea b9 e6 94 24 a7 de 22 5b ..`...b....$.."[ 00:20:33.953 00000020 5d 50 44 65 a9 5a 6f 82 53 2c 27 6e 4f 90 33 72 ]PDe.Zo.S,'nO.3r 00:20:33.953 00000030 d0 1f 91 bc 86 e8 e7 4d 3b 19 15 a2 df 09 1e cd .......M;....... 
00:20:33.953 00000040 f0 a5 07 94 60 cb 83 19 ec f4 0a 8f 5e cf d9 52 ....`.......^..R 00:20:33.953 00000050 f6 2f 6b 14 96 3e ed e5 1f 48 da bc 10 f1 f8 a5 ./k..>...H...... 00:20:33.953 00000060 68 55 3f c7 cf 61 fd 5b 58 22 3a f2 3f 13 7d dc hU?..a.[X":.?.}. 00:20:33.953 00000070 e3 03 5b 2d 1f 4e 5b c1 cf b3 3e fa 58 d0 95 68 ..[-.N[...>.X..h 00:20:33.953 00000080 9d 7f 4a 48 ab 27 37 81 b3 73 e7 4e 94 35 48 c5 ..JH.'7..s.N.5H. 00:20:33.953 00000090 7f 70 f5 c0 fd eb dc 10 e9 90 ef 71 cb d9 55 d2 .p.........q..U. 00:20:33.953 000000a0 89 f4 11 20 7b f9 61 a4 b0 93 3d 50 b3 1d e1 ee ... {.a...=P.... 00:20:33.953 000000b0 5e 74 f3 a3 db 2f 29 dc 4a 10 28 31 10 ec 28 19 ^t.../).J.(1..(. 00:20:33.953 000000c0 c5 60 10 d7 17 33 a9 e4 2c d5 34 b7 84 5d e4 61 .`...3..,.4..].a 00:20:33.953 000000d0 15 77 f4 05 c7 3c 9d 1b cf 7e 1e 7e f1 8a f5 a6 .w...<...~.~.... 00:20:33.953 000000e0 95 e9 a3 91 14 97 2b 5c dc 32 aa a0 8d 3e 4f 6f ......+\.2...>Oo 00:20:33.953 000000f0 14 e2 a2 b3 e7 e2 b7 a6 8b 13 62 68 21 8e 58 37 ..........bh!.X7 00:20:33.953 [2024-09-27 15:25:25.394118] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=3, dhgroup=1, seq=3428451799, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.953 [2024-09-27 15:25:25.396927] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.953 [2024-09-27 15:25:25.396970] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.953 [2024-09-27 15:25:25.396987] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.953 [2024-09-27 15:25:25.397011] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.953 [2024-09-27 15:25:25.397023] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.953 [2024-09-27 15:25:25.503233] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.953 [2024-09-27 15:25:25.503252] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.953 [2024-09-27 15:25:25.503259] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.953 [2024-09-27 15:25:25.503269] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.953 [2024-09-27 15:25:25.503323] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.953 ctrlr pubkey: 00:20:33.953 00000000 19 5e 67 0a 35 2e 61 11 19 f6 23 db 2d b5 e4 74 .^g.5.a...#.-..t 00:20:33.953 00000010 9c 4a f9 ad fe bf 2c e6 da a5 92 68 31 d3 35 1b .J....,....h1.5. 00:20:33.953 00000020 68 d9 22 c4 c0 0e 5b 13 f4 19 74 46 d4 1a 2d 76 h."...[...tF..-v 00:20:33.953 00000030 d5 6e 67 59 b3 61 1e 9e b3 c3 61 ef f3 d9 d1 83 .ngY.a....a..... 
00:20:33.953 00000040 7b 26 1b 08 5c a9 e5 02 fc 83 2d 18 e8 e1 f4 2b {&..\.....-....+ 00:20:33.953 00000050 86 dd 1c 2a 0e 95 e8 83 70 e7 7a 8d 9b d5 bc 1c ...*....p.z..... 00:20:33.953 00000060 fe 33 8c fe 6a 09 7c c7 84 de 44 f9 b8 0f 5c 56 .3..j.|...D...\V 00:20:33.953 00000070 c8 bb b5 34 22 66 44 d6 9a 3e cd 50 c2 8b 86 bb ...4"fD..>.P.... 00:20:33.953 00000080 ed 4a ae a6 fc 50 7c 0f 04 11 5e ab 93 10 af 37 .J...P|...^....7 00:20:33.953 00000090 86 8d f6 f1 c8 94 99 1d a6 ad bd 74 cc 31 35 a6 ...........t.15. 00:20:33.953 000000a0 c3 ce e9 74 f3 8e 7e 6b e5 29 7a c4 a6 f6 4d 39 ...t..~k.)z...M9 00:20:33.953 000000b0 94 01 b8 d7 7f 20 55 ab 4b 72 e5 dd db e8 e7 52 ..... U.Kr.....R 00:20:33.953 000000c0 bb 29 4e 83 b1 4b 9c 5f 4f 6d 44 ef 36 02 a4 59 .)N..K._OmD.6..Y 00:20:33.953 000000d0 00 2c 70 f9 f7 e2 60 eb 07 72 50 d1 12 31 be fe .,p...`..rP..1.. 00:20:33.953 000000e0 59 4f 9d c7 d0 f4 2b b9 ad 3b 6b 55 a5 71 dd 03 YO....+..;kU.q.. 00:20:33.953 000000f0 c9 07 dd 36 dc 2c 79 43 06 86 ad a0 01 84 e6 ca ...6.,yC........ 00:20:33.953 host pubkey: 00:20:33.953 00000000 7d 22 5b d6 00 05 f4 77 a3 aa ce 36 a1 23 81 ab }"[....w...6.#.. 00:20:33.954 00000010 2c 69 f3 1d ce 16 46 9a 87 90 af 35 3c 26 0b 3a ,i....F....5<&.: 00:20:33.954 00000020 0a 75 4f 0b 17 c9 44 56 45 12 5b 52 66 2a 04 a1 .uO...DVE.[Rf*.. 00:20:33.954 00000030 8e a2 2f ba 1d 0b aa 6c 95 45 14 bf 23 f7 63 e6 ../....l.E..#.c. 00:20:33.954 00000040 33 01 24 27 34 dc 5b d8 62 0d c8 0d 90 c3 49 7a 3.$'4.[.b.....Iz 00:20:33.954 00000050 6a c3 ea 2d 7f ed cf 54 f0 5f e2 58 de 47 07 b9 j..-...T._.X.G.. 00:20:33.954 00000060 de 4a 51 62 5b d2 2a 0c c2 58 ec 59 8f 79 0f 52 .JQb[.*..X.Y.y.R 00:20:33.954 00000070 f2 54 b2 1b 16 0b be 4f ab 46 c9 24 31 06 c1 7c .T.....O.F.$1..| 00:20:33.954 00000080 95 db 1d a2 51 07 e7 e3 0b dd a3 ea df 62 01 d1 ....Q........b.. 00:20:33.954 00000090 32 6d 55 67 ac 22 ca be d1 66 12 78 90 47 15 32 2mUg."...f.x.G.2 00:20:33.954 000000a0 a0 40 fd 32 af af 7f 96 91 f4 c9 c1 57 09 12 ca .@.2........W... 00:20:33.954 000000b0 5a 7e 68 d3 d2 23 3f 21 69 3d bd ff b7 98 62 35 Z~h..#?!i=....b5 00:20:33.954 000000c0 39 f2 12 f3 ce ae df 23 da 53 75 a5 2d 03 bd d6 9......#.Su.-... 00:20:33.954 000000d0 4e 7c b6 86 26 fc 38 bb 17 8f 60 99 04 23 ea 3f N|..&.8...`..#.? 00:20:33.954 000000e0 b8 08 ef 7e af 64 ff db f7 cc 60 9b 1a b9 bb fe ...~.d....`..... 00:20:33.954 000000f0 4c 5e bd e8 57 3a 60 8d e8 bc 66 c8 60 64 fa f0 L^..W:`...f.`d.. 00:20:33.954 dh secret: 00:20:33.954 00000000 be 89 06 c1 1d 60 40 6e 11 2e 91 2c 08 06 59 b1 .....`@n...,..Y. 00:20:33.954 00000010 58 6c e1 28 3f 30 a7 aa 4c a7 86 cc 3f 20 bf a4 Xl.(?0..L...? .. 00:20:33.954 00000020 eb b6 8f 57 79 90 74 6b 92 b5 81 54 c5 28 5a 31 ...Wy.tk...T.(Z1 00:20:33.954 00000030 39 41 bb a1 cc be 4d d8 95 4d 24 48 e3 92 2f df 9A....M..M$H../. 00:20:33.954 00000040 c6 28 06 4a ac 05 f7 89 17 19 0e 18 14 00 bb 89 .(.J............ 00:20:33.954 00000050 f0 37 da d2 09 65 07 2e 45 33 b2 81 39 a3 25 67 .7...e..E3..9.%g 00:20:33.954 00000060 e7 52 bc b4 d1 46 52 e6 d2 d8 59 f3 27 71 f4 9e .R...FR...Y.'q.. 00:20:33.954 00000070 22 99 26 c9 d3 cc dd 4b 00 8c cc 5b 30 a4 de 96 ".&....K...[0... 00:20:33.954 00000080 64 6d d8 ba e9 66 c0 1d 23 27 ee d1 0c c5 fe 44 dm...f..#'.....D 00:20:33.954 00000090 a7 c9 f3 ae a3 8e 72 c5 6b c0 b2 85 66 e6 75 67 ......r.k...f.ug 00:20:33.954 000000a0 8e 08 19 42 94 50 16 87 a0 8d 00 e6 a3 28 63 50 ...B.P.......(cP 00:20:33.954 000000b0 cb 20 26 8c b5 d4 64 25 10 75 76 59 8a 8c e0 65 . 
&...d%.uvY...e 00:20:33.954 000000c0 fe 6e f4 fc 9d 4d 33 4a 43 02 69 b2 7a 0f 1d 8d .n...M3JC.i.z... 00:20:33.954 000000d0 47 ec fa ac 7b 5b 9a 8b ea dd 7b b7 36 21 c5 fb G...{[....{.6!.. 00:20:33.954 000000e0 cf 85 dd 5d d7 fa e8 77 7d 08 95 fd ee 4c 73 55 ...]...w}....LsU 00:20:33.954 000000f0 01 b4 57 fd f2 14 c5 26 9c 46 70 c4 58 c3 8b a8 ..W....&.Fp.X... 00:20:33.954 [2024-09-27 15:25:25.506033] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=3, dhgroup=1, seq=3428451800, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.954 [2024-09-27 15:25:25.506135] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.954 [2024-09-27 15:25:25.515481] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.954 [2024-09-27 15:25:25.515558] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.954 [2024-09-27 15:25:25.515568] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.954 [2024-09-27 15:25:25.515607] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.954 [2024-09-27 15:25:25.666690] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.954 [2024-09-27 15:25:25.666708] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.954 [2024-09-27 15:25:25.666715] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.954 [2024-09-27 15:25:25.666757] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.954 [2024-09-27 15:25:25.666779] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.954 ctrlr pubkey: 00:20:33.954 00000000 35 57 a5 42 3c 96 3b a2 a4 05 cc c0 5e e2 f7 ab 5W.B<.;.....^... 00:20:33.954 00000010 ab b8 3b ea 78 fe 4d 39 e1 2f 80 6f 3e ba 39 de ..;.x.M9./.o>.9. 00:20:33.954 00000020 ca 10 55 85 40 a1 28 11 50 e0 de 5c bb 01 8c 0b ..U.@.(.P..\.... 00:20:33.954 00000030 50 3d 90 f3 d4 3d f6 37 d8 aa 62 91 4a ea bb 4a P=...=.7..b.J..J 00:20:33.954 00000040 17 17 30 d8 08 06 31 73 38 df fc 4a 13 ba 38 b5 ..0...1s8..J..8. 00:20:33.954 00000050 53 3d 3e 52 b2 3d fc fe 24 26 69 a0 1e 16 7a 22 S=>R.=..$&i...z" 00:20:33.954 00000060 eb 10 8f b0 6a b5 f7 41 b1 29 46 37 4d 0d 49 42 ....j..A.)F7M.IB 00:20:33.954 00000070 bf 13 6e ac 98 55 2c db e5 3f 32 5a b0 66 5f df ..n..U,..?2Z.f_. 00:20:33.954 00000080 8f 9d 73 65 91 f6 55 62 f8 3d 02 ef ed a5 b9 d3 ..se..Ub.=...... 00:20:33.954 00000090 16 4e da 30 6b 45 4d 75 38 0a 5d ee de 99 39 00 .N.0kEMu8.]...9. 00:20:33.954 000000a0 14 53 fa 87 6d a0 fd 29 fb 5c d1 af e0 f8 3a 84 .S..m..).\....:. 00:20:33.954 000000b0 7c 12 59 a1 f0 64 42 39 80 3c 22 c0 15 b5 08 f3 |.Y..dB9.<"..... 
00:20:33.954 000000c0 aa 2c a4 1e 67 e2 34 58 a4 b9 89 3f fa 73 0c 7b .,..g.4X...?.s.{ 00:20:33.954 000000d0 4c f0 70 c8 da 8e ff c5 ce 72 fb ee 19 46 46 f4 L.p......r...FF. 00:20:33.954 000000e0 bd 54 fb 15 0a 70 a3 c1 fe 1d 49 94 0c ef d2 01 .T...p....I..... 00:20:33.954 000000f0 a6 80 34 e3 ca 54 f5 66 38 1d ba 2c f6 7b 99 d0 ..4..T.f8..,.{.. 00:20:33.954 host pubkey: 00:20:33.954 00000000 5d 4b 06 7b 24 3a 27 62 78 14 c5 3d e3 d1 08 3a ]K.{$:'bx..=...: 00:20:33.954 00000010 68 ac c0 92 28 8a 2e 8b e9 8e 21 4b 45 99 de aa h...(.....!KE... 00:20:33.954 00000020 ed 4b 05 c8 cb 8b 58 d5 d7 c9 55 7f 04 dd 67 b8 .K....X...U...g. 00:20:33.954 00000030 7d 7d c7 4c 8c 11 7f 60 53 e5 67 88 d1 69 7b ab }}.L...`S.g..i{. 00:20:33.954 00000040 d6 10 da e0 5d 44 6d fa f8 07 c0 d5 65 0f 33 31 ....]Dm.....e.31 00:20:33.954 00000050 7f 48 ba 22 dc 25 20 93 61 a8 26 39 b9 59 7c 4e .H.".% .a.&9.Y|N 00:20:33.954 00000060 aa 61 73 9c a0 5c c0 0b 78 0a ba ff 02 b1 38 6e .as..\..x.....8n 00:20:33.954 00000070 e8 a7 e1 c3 55 88 1e fc 97 16 55 99 01 f2 16 d9 ....U.....U..... 00:20:33.954 00000080 30 9d 25 b8 da 5c 26 d8 05 13 91 f4 ba 7e 40 fd 0.%..\&......~@. 00:20:33.954 00000090 bb ce 0a 11 5d de 4e 35 54 e6 7d 23 68 9c 50 0b ....].N5T.}#h.P. 00:20:33.954 000000a0 c4 16 f5 e0 46 95 08 a6 b5 2b 24 40 bb b1 da 59 ....F....+$@...Y 00:20:33.954 000000b0 69 79 f5 59 6d 76 4f 23 7f c4 32 0a 74 69 22 4a iy.YmvO#..2.ti"J 00:20:33.954 000000c0 df fa a2 95 fb 68 ff 8c ef c7 a9 cd df 49 23 fd .....h.......I#. 00:20:33.954 000000d0 4c 2a b5 8b 61 1f e0 95 55 79 87 2b 3e 40 22 36 L*..a...Uy.+>@"6 00:20:33.954 000000e0 0b fb 86 09 a8 a8 ba 03 ab 18 73 8e 4a 14 07 c0 ..........s.J... 00:20:33.954 000000f0 9c 45 d2 73 d4 b5 06 74 04 03 ea 4e a3 3c 9b 31 .E.s...t...N.<.1 00:20:33.954 dh secret: 00:20:33.954 00000000 a8 8b 14 5b f5 98 f7 e7 06 5e 02 e0 72 b3 cf 63 ...[.....^..r..c 00:20:33.954 00000010 95 84 80 1e c6 88 33 fd 63 45 25 3e 7b 61 83 90 ......3.cE%>{a.. 00:20:33.954 00000020 f5 78 fa 0e 0a e1 c9 39 4a e2 f3 45 63 7d 4d fb .x.....9J..Ec}M. 00:20:33.954 00000030 f7 3e 3c 5e fe 24 3c 10 26 27 a8 ab 1b 8d a1 3e .><^.$<.&'.....> 00:20:33.954 00000040 ea c6 50 36 e7 18 88 58 e9 65 0a 41 1d 39 63 97 ..P6...X.e.A.9c. 00:20:33.954 00000050 d0 72 1f 59 5d 26 95 27 24 88 2c f1 58 1a 62 b0 .r.Y]&.'$.,.X.b. 00:20:33.954 00000060 cc e4 6a 9e b1 37 f3 c1 0a 51 6b ac 02 df 78 cf ..j..7...Qk...x. 00:20:33.954 00000070 6c de 95 ef 72 bb fd 20 4b bf e1 1b 03 dd e5 d8 l...r.. K....... 00:20:33.954 00000080 5d 23 55 7f 94 96 92 ec 18 e5 5f 88 34 24 2d a5 ]#U......._.4$-. 00:20:33.954 00000090 15 dc d0 b8 54 4d 13 53 5e 2d c4 69 c7 cb af b5 ....TM.S^-.i.... 00:20:33.955 000000a0 7a 2d a6 dc cc 70 e4 95 76 b5 02 07 08 3c d0 52 z-...p..v....<.R 00:20:33.955 000000b0 41 60 1e 14 13 db 99 22 1d 79 27 80 c1 82 6f 18 A`.....".y'...o. 00:20:33.955 000000c0 94 02 ef a3 35 5a be 94 cb fe a5 7e bd 9e 48 b7 ....5Z.....~..H. 00:20:33.955 000000d0 b4 38 c7 2d c5 a1 82 27 09 a8 a0 dd 8b d5 57 1d .8.-...'......W. 00:20:33.955 000000e0 09 9b 4f 81 22 02 d7 7c 8d 1b 4e 17 54 b2 1c a9 ..O."..|..N.T... 
00:20:33.955 000000f0 69 19 57 8c 27 63 c3 a9 5e 61 f2 89 61 13 a7 4e i.W.'c..^a..a..N 00:20:33.955 [2024-09-27 15:25:25.669239] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=3, dhgroup=1, seq=3428451801, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.955 [2024-09-27 15:25:25.671883] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.955 [2024-09-27 15:25:25.671919] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.955 [2024-09-27 15:25:25.671936] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.955 [2024-09-27 15:25:25.671955] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.955 [2024-09-27 15:25:25.671969] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.955 [2024-09-27 15:25:25.777751] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.955 [2024-09-27 15:25:25.777768] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.955 [2024-09-27 15:25:25.777775] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.955 [2024-09-27 15:25:25.777785] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.955 [2024-09-27 15:25:25.777842] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.955 ctrlr pubkey: 00:20:33.955 00000000 35 57 a5 42 3c 96 3b a2 a4 05 cc c0 5e e2 f7 ab 5W.B<.;.....^... 00:20:33.955 00000010 ab b8 3b ea 78 fe 4d 39 e1 2f 80 6f 3e ba 39 de ..;.x.M9./.o>.9. 00:20:33.955 00000020 ca 10 55 85 40 a1 28 11 50 e0 de 5c bb 01 8c 0b ..U.@.(.P..\.... 00:20:33.955 00000030 50 3d 90 f3 d4 3d f6 37 d8 aa 62 91 4a ea bb 4a P=...=.7..b.J..J 00:20:33.955 00000040 17 17 30 d8 08 06 31 73 38 df fc 4a 13 ba 38 b5 ..0...1s8..J..8. 00:20:33.955 00000050 53 3d 3e 52 b2 3d fc fe 24 26 69 a0 1e 16 7a 22 S=>R.=..$&i...z" 00:20:33.955 00000060 eb 10 8f b0 6a b5 f7 41 b1 29 46 37 4d 0d 49 42 ....j..A.)F7M.IB 00:20:33.955 00000070 bf 13 6e ac 98 55 2c db e5 3f 32 5a b0 66 5f df ..n..U,..?2Z.f_. 00:20:33.955 00000080 8f 9d 73 65 91 f6 55 62 f8 3d 02 ef ed a5 b9 d3 ..se..Ub.=...... 00:20:33.955 00000090 16 4e da 30 6b 45 4d 75 38 0a 5d ee de 99 39 00 .N.0kEMu8.]...9. 00:20:33.955 000000a0 14 53 fa 87 6d a0 fd 29 fb 5c d1 af e0 f8 3a 84 .S..m..).\....:. 00:20:33.955 000000b0 7c 12 59 a1 f0 64 42 39 80 3c 22 c0 15 b5 08 f3 |.Y..dB9.<"..... 00:20:33.955 000000c0 aa 2c a4 1e 67 e2 34 58 a4 b9 89 3f fa 73 0c 7b .,..g.4X...?.s.{ 00:20:33.955 000000d0 4c f0 70 c8 da 8e ff c5 ce 72 fb ee 19 46 46 f4 L.p......r...FF. 00:20:33.955 000000e0 bd 54 fb 15 0a 70 a3 c1 fe 1d 49 94 0c ef d2 01 .T...p....I..... 00:20:33.955 000000f0 a6 80 34 e3 ca 54 f5 66 38 1d ba 2c f6 7b 99 d0 ..4..T.f8..,.{.. 
00:20:33.955 host pubkey: 00:20:33.955 00000000 e0 5e df bc 14 87 8a 12 27 25 1f 4a bc ce d0 5b .^......'%.J...[ 00:20:33.955 00000010 31 95 af 7e c2 10 3c 36 7a 71 9f ec dd 30 b0 b3 1..~..<6zq...0.. 00:20:33.955 00000020 e6 b8 a2 4d 0f 67 04 5e b7 1e c4 f4 bd e8 90 22 ...M.g.^......." 00:20:33.955 00000030 83 04 c7 92 f0 82 ca 84 ec 57 ad 38 e8 9c 3c 71 .........W.8... 00:20:33.955 00000090 21 59 16 ce 8f 27 fc bf 12 14 57 5f 9a 6c 39 48 !Y...'....W_.l9H 00:20:33.955 000000a0 0b 72 ba bf f4 21 8e 94 31 b6 a9 f1 fc 9d bb 9e .r...!..1....... 00:20:33.955 000000b0 04 ff 44 a2 b6 92 03 da cd a6 be 58 c2 b5 73 0d ..D........X..s. 00:20:33.955 000000c0 dd 84 f2 69 78 22 98 d2 e5 26 73 6c a2 cc d9 56 ...ix"...&sl...V 00:20:33.955 000000d0 a7 3f cc a7 f7 cf d1 1a 24 e1 cc b0 2f b3 45 a3 .?......$.../.E. 00:20:33.955 000000e0 6f 28 e0 ae 2a 67 b5 50 df 23 57 89 f1 78 d0 e2 o(..*g.P.#W..x.. 00:20:33.955 000000f0 87 e7 c6 72 76 3c d0 44 c1 95 92 44 73 c6 27 60 ...rv<.D...Ds.'` 00:20:33.955 dh secret: 00:20:33.955 00000000 98 ef b5 ef a5 6e 4c 4b ca a1 be 96 a7 b0 d0 e7 .....nLK........ 00:20:33.955 00000010 60 89 84 75 b6 57 b9 35 0f 3e d9 6d 9f 63 c5 bd `..u.W.5.>.m.c.. 00:20:33.955 00000020 89 cc 7a 4f 2f 94 51 60 ed 00 1a 03 09 02 29 a7 ..zO/.Q`......). 00:20:33.955 00000030 d2 a2 a5 c9 0f ab 35 37 ab ff cf 39 bd c0 ec 48 ......57...9...H 00:20:33.955 00000040 0c 54 48 6d 22 60 4c ec 71 49 b4 fb 6a 57 12 61 .THm"`L.qI..jW.a 00:20:33.955 00000050 96 87 86 40 47 1a cc 4e d3 50 b5 58 12 39 ce ee ...@G..N.P.X.9.. 00:20:33.955 00000060 a0 d2 c2 8d 4d 2b b0 64 a2 b5 8d 0e 4d e5 30 95 ....M+.d....M.0. 00:20:33.955 00000070 72 36 22 90 f7 46 08 53 3c e7 1a 6b 51 39 10 32 r6"..F.S<..kQ9.2 00:20:33.955 00000080 93 2a 71 b3 eb b1 de 59 0d cc 7e 8a 5c ea 30 1b .*q....Y..~.\.0. 00:20:33.955 00000090 35 c4 41 26 ae 10 44 02 9e e3 cf a6 9e 9d 8a 7d 5.A&..D........} 00:20:33.955 000000a0 07 18 77 91 4a 7c 36 37 19 f6 29 7b e1 94 d9 33 ..w.J|67..){...3 00:20:33.955 000000b0 d8 9e e2 c0 fb 37 0a 21 0c a0 ce 29 de 9d fb ce .....7.!...).... 00:20:33.955 000000c0 1b 0e 27 8a 81 48 af 9b 27 f7 d8 03 75 41 d3 62 ..'..H..'...uA.b 00:20:33.955 000000d0 42 5d a4 24 40 8d 88 58 ed e3 91 13 f1 bc d7 bf B].$@..X........ 00:20:33.955 000000e0 17 ff 50 94 f3 45 77 b6 5a 1f 6c 0f 1f cd 80 f9 ..P..Ew.Z.l..... 
00:20:33.955 000000f0 6b 4b 9e f6 1e 32 64 6b 05 8c f7 4a 2b 2b e1 58 kK...2dk...J++.X 00:20:33.955 [2024-09-27 15:25:25.780420] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=3, dhgroup=1, seq=3428451802, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.955 [2024-09-27 15:25:25.780512] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.955 [2024-09-27 15:25:25.789835] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.955 [2024-09-27 15:25:25.789907] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.955 [2024-09-27 15:25:25.789917] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.955 [2024-09-27 15:25:25.789956] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.955 [2024-09-27 15:25:25.950310] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.955 [2024-09-27 15:25:25.950330] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.955 [2024-09-27 15:25:25.950349] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:33.955 [2024-09-27 15:25:25.950395] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.955 [2024-09-27 15:25:25.950419] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.955 ctrlr pubkey: 00:20:33.955 00000000 e1 4a 90 c0 16 a2 8e 4f ec d1 7a ef ae 86 1c 74 .J.....O..z....t 00:20:33.955 00000010 86 c6 fb ff 1a f3 bd 09 c7 93 f2 cd 3a a7 69 53 ............:.iS 00:20:33.955 00000020 e8 c9 5b 32 ec 66 a2 4c 83 27 ee e5 5a 71 61 27 ..[2.f.L.'..Zqa' 00:20:33.956 00000030 0c 9b a5 ed 83 3d 55 6c 4a a1 da f1 4f 3e 57 07 .....=UlJ...O>W. 00:20:33.956 00000040 c9 e8 1c 01 9a 18 7b 17 1d a9 36 5f ec 20 72 02 ......{...6_. r. 00:20:33.956 00000050 af e1 c4 85 dd fa 49 f5 68 96 8e 22 03 85 e8 e2 ......I.h..".... 00:20:33.956 00000060 a4 44 9d 59 19 20 d9 5c b5 e0 bf 2a 13 cd 99 dc .D.Y. .\...*.... 00:20:33.956 00000070 bb 98 99 dd fc cd 45 8e 6c 16 6a 51 c0 d6 7e 2d ......E.l.jQ..~- 00:20:33.956 00000080 f6 38 46 5b 92 af 2a 99 5a 70 04 aa 5d bf 51 4d .8F[..*.Zp..].QM 00:20:33.956 00000090 d3 bb 83 6a 85 b0 95 7a 4c e8 cd 65 e0 83 d4 4f ...j...zL..e...O 00:20:33.956 000000a0 2a ad c3 52 6e b4 48 8e ae d2 e5 77 78 f9 0d 4b *..Rn.H....wx..K 00:20:33.956 000000b0 39 71 7f 50 80 76 da 8b 25 02 f8 4b 2e e3 d5 de 9q.P.v..%..K.... 00:20:33.956 000000c0 89 05 d5 58 09 56 c4 74 dc d0 f2 88 1d 95 10 d9 ...X.V.t........ 00:20:33.956 000000d0 0a af c8 6f cc 5a 87 09 7c 2e ea e8 6b 21 4d 2b ...o.Z..|...k!M+ 00:20:33.956 000000e0 d9 b4 5f 75 f7 4f 0b 52 d7 56 61 68 0d 05 76 f9 .._u.O.R.Vah..v. 00:20:33.956 000000f0 b1 67 c6 f5 c9 b9 04 97 5a fa 42 47 d0 4e f9 d7 .g......Z.BG.N.. 
00:20:33.956 host pubkey: 00:20:33.956 00000000 67 76 c9 bb 9a 2a dc 9c e3 b7 c7 b6 53 e3 61 e4 gv...*......S.a. 00:20:33.956 00000010 fb b4 54 65 c3 13 a0 5b a7 d7 05 09 f4 a0 a7 7a ..Te...[.......z 00:20:33.956 00000020 64 bf 34 5d 20 f8 9c ea 92 c8 1b 8e f7 e4 9e 1b d.4] ........... 00:20:33.956 00000030 42 6b 7b 52 89 25 58 84 4e 91 b7 6f 95 57 a0 d7 Bk{R.%X.N..o.W.. 00:20:33.956 00000040 b8 bb 92 be 4c 8e 4d 02 c0 1b 91 92 b8 a0 e1 64 ....L.M........d 00:20:33.956 00000050 2c 28 97 88 fe cd bd ab ce fb 55 d4 cb 9e 4d 6e ,(........U...Mn 00:20:33.956 00000060 6d 6e 3f d5 ee 77 c9 e2 a6 99 ba ab 36 cd 4a cb mn?..w......6.J. 00:20:33.956 00000070 42 78 ba 67 c2 bd f2 16 71 ad 18 76 b4 18 68 06 Bx.g....q..v..h. 00:20:33.956 00000080 f8 ce 9b c1 21 49 6f 01 20 e8 ee f0 79 88 be b1 ....!Io. ...y... 00:20:33.956 00000090 24 6d 69 12 79 b2 5a 4c 2c a6 aa be 7f c5 a0 d8 $mi.y.ZL,....... 00:20:33.956 000000a0 66 89 1a ee b0 10 d8 e2 a4 db e2 31 1c 18 48 4d f..........1..HM 00:20:33.956 000000b0 5b 30 e6 ab c5 39 39 3c 92 42 42 ca e9 8b 0b de [0...99<.BB..... 00:20:33.956 000000c0 a2 70 82 03 0c c6 a5 e3 5f 14 24 50 0a 91 60 eb .p......_.$P..`. 00:20:33.956 000000d0 8e 35 14 a3 58 1c 77 51 13 fc dd be 28 57 e3 27 .5..X.wQ....(W.' 00:20:33.956 000000e0 70 89 01 58 cb 9e 38 b8 5e 10 d6 e7 98 3a 71 fe p..X..8.^....:q. 00:20:33.956 000000f0 6e ec de c2 69 48 41 7f e4 19 74 9c 46 e3 12 07 n...iHA...t.F... 00:20:33.956 dh secret: 00:20:33.956 00000000 9e d3 c8 64 46 9c fb 7c 91 02 b9 7a 0d 07 fb 10 ...dF..|...z.... 00:20:33.956 00000010 e0 50 e5 e0 54 4e 28 a9 80 73 81 82 b6 1c d2 80 .P..TN(..s...... 00:20:33.956 00000020 aa 4a 0f 5f f3 72 07 51 80 5e c5 30 fd 3b cc 0d .J._.r.Q.^.0.;.. 00:20:33.956 00000030 8a 06 67 08 82 7e 43 a3 82 ad aa dd dd 28 f4 77 ..g..~C......(.w 00:20:33.956 00000040 27 8d 45 42 ab e9 85 7c 6a ae 66 91 cc d8 c3 22 '.EB...|j.f...." 00:20:33.956 00000050 4e e6 f3 a5 d1 73 7d b1 2c 23 80 7c 6f 83 d4 7e N....s}.,#.|o..~ 00:20:33.956 00000060 c7 b3 6a 50 04 3b 8d 95 10 0f dc 94 dd e7 01 a8 ..jP.;.......... 00:20:33.956 00000070 b9 4b c1 22 2d 04 a7 2a 6f c1 a5 59 9e 8b 4f fe .K."-..*o..Y..O. 00:20:33.956 00000080 6e 29 b0 54 bf 77 94 0d b2 96 be 27 24 ba 62 8e n).T.w.....'$.b. 00:20:33.956 00000090 b3 01 1c 8c 54 eb f4 6f 1a 70 68 a8 3b 65 c7 21 ....T..o.ph.;e.! 00:20:33.956 000000a0 a7 1d 8e ce 7b fb 89 4e 4d 1c 7d a0 03 df b5 7d ....{..NM.}....} 00:20:33.956 000000b0 5a 8a 03 1c fe a8 d2 fb d7 52 5c 41 03 c6 cc 32 Z........R\A...2 00:20:33.956 000000c0 1b b2 1d e9 a5 fa 29 89 8c 6f c0 3b f4 43 eb f2 ......)..o.;.C.. 00:20:33.956 000000d0 fb f6 a9 7d be f1 e1 5a 18 46 d1 16 87 41 10 0c ...}...Z.F...A.. 00:20:33.956 000000e0 ff e2 94 ba 0e b1 17 33 ea cb ed d7 3a 13 c0 d4 .......3....:... 00:20:33.956 000000f0 82 45 25 e9 e2 26 3b be f6 4c 0c b9 fa 90 7e df .E%..&;..L....~. 
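The "ctrlr pubkey", "host pubkey" and "dh secret" dumps in each round above are the three values of a finite-field Diffie-Hellman exchange over the negotiated ffdhe2048 group, which is why each dump is 0x100 (256) bytes long. A minimal sketch of that relationship, using a toy 64-bit prime instead of the real RFC 7919 ffdhe2048 modulus so the numbers stay short (illustration only, not SPDK code):

# Toy finite-field Diffie-Hellman, mirroring the pubkey / dh secret dumps.
import secrets

p = 0xFFFFFFFFFFFFFFC5   # toy prime (2**64 - 59), NOT the ffdhe2048 modulus
g = 2

host_priv = secrets.randbelow(p - 2) + 1
ctrlr_priv = secrets.randbelow(p - 2) + 1

host_pub = pow(g, host_priv, p)    # exchanged as "host pubkey"
ctrlr_pub = pow(g, ctrlr_priv, p)  # exchanged as "ctrlr pubkey"

# Both sides derive the same value, printed in the log as "dh secret".
assert pow(ctrlr_pub, host_priv, p) == pow(host_pub, ctrlr_priv, p)

Each side keeps its private exponent, publishes g^x mod p, and both arrive at the same shared value; in the real exchange that 256-byte secret then feeds the sha512-based challenge/response.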
00:20:33.956 [2024-09-27 15:25:25.953033] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=3, dhgroup=1, seq=3428451803, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.956 [2024-09-27 15:25:25.955811] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.956 [2024-09-27 15:25:25.955838] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.956 [2024-09-27 15:25:25.955854] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.956 [2024-09-27 15:25:25.955860] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.956 [2024-09-27 15:25:26.061580] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.956 [2024-09-27 15:25:26.061598] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.956 [2024-09-27 15:25:26.061605] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 1 (ffdhe2048) 00:20:33.956 [2024-09-27 15:25:26.061615] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.956 [2024-09-27 15:25:26.061669] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.956 ctrlr pubkey: 00:20:33.956 00000000 e1 4a 90 c0 16 a2 8e 4f ec d1 7a ef ae 86 1c 74 .J.....O..z....t 00:20:33.956 00000010 86 c6 fb ff 1a f3 bd 09 c7 93 f2 cd 3a a7 69 53 ............:.iS 00:20:33.956 00000020 e8 c9 5b 32 ec 66 a2 4c 83 27 ee e5 5a 71 61 27 ..[2.f.L.'..Zqa' 00:20:33.956 00000030 0c 9b a5 ed 83 3d 55 6c 4a a1 da f1 4f 3e 57 07 .....=UlJ...O>W. 00:20:33.956 00000040 c9 e8 1c 01 9a 18 7b 17 1d a9 36 5f ec 20 72 02 ......{...6_. r. 00:20:33.956 00000050 af e1 c4 85 dd fa 49 f5 68 96 8e 22 03 85 e8 e2 ......I.h..".... 00:20:33.956 00000060 a4 44 9d 59 19 20 d9 5c b5 e0 bf 2a 13 cd 99 dc .D.Y. .\...*.... 00:20:33.956 00000070 bb 98 99 dd fc cd 45 8e 6c 16 6a 51 c0 d6 7e 2d ......E.l.jQ..~- 00:20:33.956 00000080 f6 38 46 5b 92 af 2a 99 5a 70 04 aa 5d bf 51 4d .8F[..*.Zp..].QM 00:20:33.956 00000090 d3 bb 83 6a 85 b0 95 7a 4c e8 cd 65 e0 83 d4 4f ...j...zL..e...O 00:20:33.956 000000a0 2a ad c3 52 6e b4 48 8e ae d2 e5 77 78 f9 0d 4b *..Rn.H....wx..K 00:20:33.956 000000b0 39 71 7f 50 80 76 da 8b 25 02 f8 4b 2e e3 d5 de 9q.P.v..%..K.... 00:20:33.956 000000c0 89 05 d5 58 09 56 c4 74 dc d0 f2 88 1d 95 10 d9 ...X.V.t........ 00:20:33.956 000000d0 0a af c8 6f cc 5a 87 09 7c 2e ea e8 6b 21 4d 2b ...o.Z..|...k!M+ 00:20:33.956 000000e0 d9 b4 5f 75 f7 4f 0b 52 d7 56 61 68 0d 05 76 f9 .._u.O.R.Vah..v. 00:20:33.956 000000f0 b1 67 c6 f5 c9 b9 04 97 5a fa 42 47 d0 4e f9 d7 .g......Z.BG.N.. 00:20:33.956 host pubkey: 00:20:33.956 00000000 d3 c1 1b c7 c9 28 c2 58 52 82 67 05 ab 81 66 d6 .....(.XR.g...f. 00:20:33.956 00000010 5f de 49 25 ab 7d 37 42 98 7b c5 3c b5 5b 69 43 _.I%.}7B.{.<.[iC 00:20:33.956 00000020 e3 ba 2f d1 f9 63 1e ad 46 22 5a 92 f9 e2 21 f0 ../..c..F"Z...!. 
00:20:33.956 00000030 f7 a8 40 d4 55 f4 78 e1 9e 6d ad 09 94 a6 eb a3 ..@.U.x..m...... 00:20:33.956 00000040 f0 a6 c4 8e 67 27 13 73 e8 39 fb 10 d5 22 f6 c3 ....g'.s.9...".. 00:20:33.956 00000050 78 1e f6 a6 73 58 60 39 47 ce 5e e4 4d be 30 51 x...sX`9G.^.M.0Q 00:20:33.956 00000060 b2 50 ea 22 f7 a5 0f 60 21 9d e2 f6 d0 54 04 d9 .P."...`!....T.. 00:20:33.956 00000070 21 e2 68 3f 45 a7 7d 71 5b 89 86 02 c1 8f 3e d0 !.h?E.}q[.....>. 00:20:33.956 00000080 e4 67 c0 f8 7c 2e 77 d1 aa 07 f7 df d6 3a 6c 28 .g..|.w......:l( 00:20:33.956 00000090 0c f5 97 3c db 9f 47 6e f7 58 f2 e0 43 d8 82 a1 ...<..Gn.X..C... 00:20:33.956 000000a0 50 be f9 d9 d1 66 26 6f 5a b0 3b a6 24 62 24 15 P....f&oZ.;.$b$. 00:20:33.956 000000b0 25 ea 80 69 ce e9 72 13 3b 94 af f7 7d 55 64 1f %..i..r.;...}Ud. 00:20:33.956 000000c0 2e a5 d5 04 e3 31 16 ee e8 8a d1 4c 95 e2 b9 1a .....1.....L.... 00:20:33.956 000000d0 e1 d7 bc 1f 2c 81 cc bb 7b 72 48 a3 48 8e 0b e6 ....,...{rH.H... 00:20:33.956 000000e0 8e 4e 1b 2a cf 77 85 7c d9 ec 2c 81 4b 56 0d 74 .N.*.w.|..,.KV.t 00:20:33.956 000000f0 90 d7 85 0d bc 54 5e 33 10 45 02 50 84 f6 a9 79 .....T^3.E.P...y 00:20:33.956 dh secret: 00:20:33.956 00000000 0b 70 78 16 5c 04 f3 77 b8 a8 92 0e ae 74 82 5b .px.\..w.....t.[ 00:20:33.956 00000010 16 d9 00 73 dd d3 7b 2c a7 40 87 fe 8c c8 b0 f2 ...s..{,.@...... 00:20:33.956 00000020 a7 5c 1c e5 cf 54 54 23 35 df 14 bb 93 ec cb 31 .\...TT#5......1 00:20:33.956 00000030 b2 fb d3 4c 27 de 90 b0 ca ee c3 29 d1 e0 df 6e ...L'......)...n 00:20:33.956 00000040 0f 05 bb 00 51 15 a8 e5 58 42 f8 41 54 1c c2 2d ....Q...XB.AT..- 00:20:33.956 00000050 35 53 8a 86 53 a6 cf b7 38 07 bb f2 bf 1e 06 03 5S..S...8....... 00:20:33.956 00000060 e4 f6 1d 32 8e bc 71 86 a3 ed 5d 64 c7 58 0d ca ...2..q...]d.X.. 00:20:33.956 00000070 43 1e 5c 32 96 1f b5 be 9e 67 9a 29 d3 85 26 fe C.\2.....g.)..&. 00:20:33.956 00000080 27 bd f1 9e 63 18 ce 51 69 5f eb ca 33 0b d9 07 '...c..Qi_..3... 00:20:33.956 00000090 77 51 58 73 2d 84 c4 d6 f2 54 60 9a 08 63 f4 e2 wQXs-....T`..c.. 00:20:33.956 000000a0 a8 16 f6 6e a9 97 63 0d 60 3d dc 7f c9 a5 b2 db ...n..c.`=...... 00:20:33.956 000000b0 e1 3f 5c cf 42 31 0b f7 84 31 cd be bc ff e5 7e .?\.B1...1.....~ 00:20:33.956 000000c0 ac 2b 08 9c 6f c7 37 62 18 81 94 33 e6 85 18 ea .+..o.7b...3.... 00:20:33.956 000000d0 a3 97 23 4b 6d ae 17 7a 5f 76 1f 6e ff 52 78 57 ..#Km..z_v.n.RxW 00:20:33.957 000000e0 aa 06 84 3e 45 74 f2 c6 17 ab ac 7b 14 18 be 8a ...>Et.....{.... 00:20:33.957 000000f0 1d 95 2a c2 03 30 ee d8 f3 33 ff 72 7d f5 1a e2 ..*..0...3.r}... 
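The nvme_auth_set_state lines trace each qpair through the same happy path on every round: negotiate, await-negotiate, await-challenge, await-reply, await-success1, then (when a second success message is exchanged) await-success2, and finally done. A small sketch of that ordering as read off the debug lines above, not taken from SPDK's internal state table:

# Happy-path order of the auth states as they appear in the log above.
# Some rounds go straight from await-success1 to done.
OBSERVED_STATES = [
    "negotiate",
    "await-negotiate",
    "await-challenge",
    "await-reply",
    "await-success1",
    "await-success2",
    "done",
]

def next_state(current: str) -> str:
    """Return the state that follows `current` in the observed sequence."""
    i = OBSERVED_STATES.index(current)
    if i == len(OBSERVED_STATES) - 1:
        raise ValueError("'done' is terminal")
    return OBSERVED_STATES[i + 1]

assert next_state("await-reply") == "await-success1"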
00:20:33.957 [2024-09-27 15:25:26.064234] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=3, dhgroup=1, seq=3428451804, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.957 [2024-09-27 15:25:26.064294] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.957 [2024-09-27 15:25:26.073662] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.957 [2024-09-27 15:25:26.073699] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.957 [2024-09-27 15:25:26.073706] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.957 [2024-09-27 15:25:26.227842] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.957 [2024-09-27 15:25:26.227861] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.957 [2024-09-27 15:25:26.227868] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.957 [2024-09-27 15:25:26.227913] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.957 [2024-09-27 15:25:26.227937] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.957 ctrlr pubkey: 00:20:33.957 00000000 b0 a7 26 ee 22 24 ed 77 05 95 17 40 32 bf e7 6d ..&."$.w...@2..m 00:20:33.957 00000010 6e 2a 12 c5 93 31 e7 d2 10 60 ef 8a fc 32 e4 be n*...1...`...2.. 00:20:33.957 00000020 f1 77 dc b5 09 f1 14 9e 7f 89 9b f9 0c 38 d5 2f .w...........8./ 00:20:33.957 00000030 08 c1 e6 c4 e4 e6 7a 1c 2e 1c d4 e3 3c 71 34 d6 ......z.............c..... 00:20:33.957 00000030 aa f1 22 8b 1f df 2b 53 e3 eb 8d dc 18 24 7c 48 .."...+S.....$|H 00:20:33.957 00000040 ef 6a f3 30 04 b0 bc 99 47 11 fb a1 99 b3 33 a2 .j.0....G.....3. 00:20:33.957 00000050 d5 b3 fe 26 19 ba a0 71 c7 69 3a 04 9e 0f 31 8e ...&...q.i:...1. 00:20:33.957 00000060 8b d0 27 1e 88 0c 46 4a 34 9c ff 8a 0a 69 d5 d9 ..'...FJ4....i.. 00:20:33.957 00000070 27 bf dd df e6 b2 b7 2e df ac c4 67 50 d6 f4 96 '..........gP... 00:20:33.957 00000080 87 4e 48 ba 75 51 ee 79 19 0a 6d 5c 0d 24 56 a5 .NH.uQ.y..m\.$V. 00:20:33.957 00000090 d6 22 ce 46 5e 95 f4 e8 c1 16 a0 65 f6 e8 c5 d3 .".F^......e.... 00:20:33.957 000000a0 fb 4e dc b8 72 58 4e 25 5e 1a af 72 f8 d6 88 4e .N..rXN%^..r...N 00:20:33.957 000000b0 dc fd b7 94 60 e0 5e 4c 6e ea 60 c7 21 ee 80 6b ....`.^Ln.`.!..k 00:20:33.957 000000c0 ab 0d 22 a3 7b 13 13 85 bc cb 2a bc 33 a1 9d 72 ..".{.....*.3..r 00:20:33.957 000000d0 f8 e7 8d 4c 69 f5 74 ea 82 58 95 47 64 80 a8 69 ...Li.t..X.Gd..i 00:20:33.957 000000e0 34 83 3c be e8 4e db 65 d7 65 97 8f 21 cd d4 29 4.<..N.e.e..!..) 00:20:33.957 000000f0 d0 1a 9e 80 d6 e0 c3 3d c6 17 1f 20 b3 8f 9a b3 .......=... .... 00:20:33.957 00000100 de 90 e4 7f ab 62 5f 44 28 80 15 55 3f 3d ae f4 .....b_D(..U?=.. 
00:20:33.957 00000110 2d 2f e1 af 7e df 59 6d fe f7 57 2f a5 a6 af 57 -/..~.Ym..W/...W 00:20:33.957 00000120 69 c2 f4 9f ed 2d 90 d3 11 6d 50 5b e7 54 aa a6 i....-...mP[.T.. 00:20:33.957 00000130 a9 5e 30 4f 26 ce 75 f8 56 31 81 4d 29 b8 f7 49 .^0O&.u.V1.M)..I 00:20:33.957 00000140 89 99 2e 79 71 8f 33 da a8 a6 50 08 46 e4 65 44 ...yq.3...P.F.eD 00:20:33.957 00000150 c8 c7 a8 e0 e9 96 a2 c9 a0 20 68 d3 96 eb 9c eb ......... h..... 00:20:33.957 00000160 2a 8e 8c f7 79 21 b0 00 e3 24 df a0 47 d8 dd 51 *...y!...$..G..Q 00:20:33.957 00000170 82 9c 14 61 17 7d 8f da 57 6d 60 46 21 81 8d e1 ...a.}..Wm`F!... 00:20:33.957 dh secret: 00:20:33.957 00000000 5a 32 d1 1d c7 82 b5 f7 8a b2 c2 66 9f df 2d ef Z2.........f..-. 00:20:33.957 00000010 27 ae 1c 42 6b f0 96 d6 27 1c 33 19 a4 00 84 2b '..Bk...'.3....+ 00:20:33.957 00000020 3c c5 29 b6 d4 92 4e a1 bc 0f 5f 37 e7 83 a5 96 <.)...N..._7.... 00:20:33.957 00000030 a2 84 4d ce b0 0f ca b9 c2 da 1b 75 ec f6 26 d8 ..M........u..&. 00:20:33.957 00000040 a3 85 d2 6b fa d4 41 02 5e de 5e 7b 66 45 04 e8 ...k..A.^.^{fE.. 00:20:33.957 00000050 82 c3 1a d3 d5 b1 1c 9c e5 a5 05 93 00 b2 7f bd ................ 00:20:33.957 00000060 52 fa d6 ec 65 53 c6 ba 45 60 ae fe 06 38 e7 0c R...eS..E`...8.. 00:20:33.957 00000070 b2 9e d0 71 76 bf 6c 45 8f 08 48 93 23 1b 9b 4c ...qv.lE..H.#..L 00:20:33.957 00000080 a7 64 3b 0b fd 09 6f 56 98 f0 4c 25 bc a5 ad 2f .d;...oV..L%.../ 00:20:33.957 00000090 ef ce 22 66 47 aa cc 5b 52 90 de 0e a3 25 1a af .."fG..[R....%.. 00:20:33.957 000000a0 8b c1 4f 8e ec ba a2 a7 b6 ba ba f8 63 a2 ce d9 ..O.........c... 00:20:33.957 000000b0 85 2e f2 2e eb bd 67 70 07 e5 8c 57 e5 a8 e1 d6 ......gp...W.... 00:20:33.957 000000c0 24 04 1a c9 f3 4c ae 38 61 1c c4 c4 6b c9 32 4f $....L.8a...k.2O 00:20:33.957 000000d0 90 56 bf 2c c6 d8 a3 86 67 24 ec 3d 63 4c 2f 34 .V.,....g$.=cL/4 00:20:33.957 000000e0 19 2e c0 c5 0e 6e 49 67 95 ee 5e 21 a1 79 5f 1e .....nIg..^!.y_. 00:20:33.957 000000f0 a8 db c1 d4 b3 7e 88 c0 50 29 be 5c 45 3b 34 c6 .....~..P).\E;4. 00:20:33.957 00000100 ad aa 09 b3 21 6a b4 9f c6 cd 63 24 b5 08 3f 09 ....!j....c$..?. 00:20:33.957 00000110 7c 1d a0 42 06 5e 3a 18 c5 04 5f 99 48 5a 44 af |..B.^:..._.HZD. 00:20:33.957 00000120 f3 08 ca 30 e3 e4 47 4d 59 e0 b1 86 52 32 bf c9 ...0..GMY...R2.. 00:20:33.957 00000130 b4 0e 75 cb c0 77 0f 29 1e 49 77 c9 bc bf f4 8a ..u..w.).Iw..... 00:20:33.957 00000140 4a f8 98 c2 1f 61 a7 60 60 c0 e7 27 ad 62 94 36 J....a.``..'.b.6 00:20:33.957 00000150 b3 cd 52 b2 eb af e0 26 97 28 ba c6 9a 57 7f ba ..R....&.(...W.. 
00:20:33.957 00000160 c2 eb d9 54 57 c6 00 55 2b f2 9e b2 4b 15 57 47 ...TW..U+...K.WG 00:20:33.957 00000170 52 4e 70 3b 7d 5e 7e 45 47 62 dd 8d cb e4 4d 5d RNp;}^~EGb....M] 00:20:33.957 [2024-09-27 15:25:26.235151] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=3, dhgroup=2, seq=3428451805, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.957 [2024-09-27 15:25:26.240269] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.957 [2024-09-27 15:25:26.240309] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.957 [2024-09-27 15:25:26.240325] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.957 [2024-09-27 15:25:26.240358] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.957 [2024-09-27 15:25:26.240369] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.957 [2024-09-27 15:25:26.346680] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.957 [2024-09-27 15:25:26.346698] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.957 [2024-09-27 15:25:26.346706] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.957 [2024-09-27 15:25:26.346716] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.957 [2024-09-27 15:25:26.346773] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.957 ctrlr pubkey: 00:20:33.957 00000000 b0 a7 26 ee 22 24 ed 77 05 95 17 40 32 bf e7 6d ..&."$.w...@2..m 00:20:33.957 00000010 6e 2a 12 c5 93 31 e7 d2 10 60 ef 8a fc 32 e4 be n*...1...`...2.. 00:20:33.957 00000020 f1 77 dc b5 09 f1 14 9e 7f 89 9b f9 0c 38 d5 2f .w...........8./ 00:20:33.957 00000030 08 c1 e6 c4 e4 e6 7a 1c 2e 1c d4 e3 3c 71 34 d6 ......z...... 00:20:33.958 00000020 99 47 05 c8 ef e8 b5 3e b2 f5 d8 9c 7a 71 50 f3 .G.....>....zqP. 00:20:33.958 00000030 ea 9f 04 40 1f 06 21 b0 9f 92 6c 3e 11 67 73 16 ...@..!...l>.gs. 00:20:33.958 00000040 ea a1 c9 c3 66 e1 99 3d e3 0b 4e dd 0d 0d dc b4 ....f..=..N..... 00:20:33.958 00000050 de 88 db f4 49 d1 10 15 86 17 9a c7 1e d5 ec d3 ....I........... 00:20:33.958 00000060 5a af 76 39 99 ee ee 09 c3 1c ed ba 6b a1 1f 3f Z.v9........k..? 00:20:33.958 00000070 9c 9c 70 cd 1c 88 c8 e4 d4 72 1d 62 cd e3 2a 58 ..p......r.b..*X 00:20:33.958 00000080 93 7f db 20 b9 78 69 a9 64 6b bb a6 5e e2 e0 8f ... .xi.dk..^... 00:20:33.958 00000090 5b 45 ca 14 fa 3a ed 7e 44 d9 8a a3 27 9b 2b 8c [E...:.~D...'.+. 00:20:33.958 000000a0 cb 90 6d 7e 5d ef ec 98 35 b7 4d 02 b6 06 86 1e ..m~]...5.M..... 00:20:33.958 000000b0 42 c4 fe e2 a9 85 57 53 1d cf 0b f8 49 df 40 d1 B.....WS....I.@. 00:20:33.958 000000c0 69 24 4d 3f 6d ae de 76 30 07 d5 4a 16 1a 83 3f i$M?m..v0..J...? 
00:20:33.958 000000d0 d9 7b b5 0c 0d e4 44 05 ec 98 d7 f5 bc 12 93 c6 .{....D......... 00:20:33.958 000000e0 93 18 93 4d b9 a6 92 d4 01 ce 19 76 52 26 4c f6 ...M.......vR&L. 00:20:33.958 000000f0 3a 65 90 00 ec 7e 20 86 3e ff 9c b3 2d 51 24 31 :e...~ .>...-Q$1 00:20:33.958 00000100 e2 ce 0f 52 dd 0f 64 b1 b5 e4 e0 5e 4e b0 ab 72 ...R..d....^N..r 00:20:33.958 00000110 43 a9 a0 92 17 54 43 60 20 bb d9 d1 2e 1d a4 11 C....TC` ....... 00:20:33.958 00000120 f1 82 e4 b4 b6 ef 7c 59 93 ab a7 7d 86 7d 01 9e ......|Y...}.}.. 00:20:33.958 00000130 fb 87 7e f4 db 14 2c 41 42 ca 9c bc 9f c8 8b 86 ..~...,AB....... 00:20:33.958 00000140 c6 cb 9e 82 ae 12 03 4b 00 9b 49 4f 91 7c 78 98 .......K..IO.|x. 00:20:33.958 00000150 29 a8 70 95 cc fe 4f 9d 2d fc fd 07 1e 9a f3 98 ).p...O.-....... 00:20:33.958 00000160 43 e0 9e ec 7b 76 ec 22 2c e7 52 6a 19 1a 2c 56 C...{v.",.Rj..,V 00:20:33.958 00000170 81 03 ee 17 6a 4b 07 53 d8 bd 54 c3 a5 8f 62 e2 ....jK.S..T...b. 00:20:33.958 dh secret: 00:20:33.958 00000000 cd 28 aa 4c da c8 85 0b e0 05 63 5c 13 9d 98 d7 .(.L......c\.... 00:20:33.958 00000010 40 a4 da 56 53 f8 bb 6f 63 89 26 d5 81 46 83 b2 @..VS..oc.&..F.. 00:20:33.958 00000020 ce 34 d1 c5 d5 28 e0 e3 49 02 82 77 2c d2 65 98 .4...(..I..w,.e. 00:20:33.958 00000030 2f 7d 86 f2 75 47 05 c7 e8 89 57 d5 e2 2a 9a af /}..uG....W..*.. 00:20:33.958 00000040 6a 5e 84 8b c4 43 85 b2 9d ef 14 15 36 bc 4e 0e j^...C......6.N. 00:20:33.958 00000050 1d b8 73 53 a8 b7 93 a4 3b 40 39 6f 73 b7 7a bf ..sS....;@9os.z. 00:20:33.958 00000060 5c b6 87 d1 66 98 a9 37 6c 45 fd 47 7d 73 6f 60 \...f..7lE.G}so` 00:20:33.958 00000070 43 28 2d eb 5f cf 58 ca b4 3b f0 62 08 ba 91 81 C(-._.X..;.b.... 00:20:33.958 00000080 56 e3 c1 67 61 b6 e9 78 a6 06 6d 07 f3 ea 6a 42 V..ga..x..m...jB 00:20:33.958 00000090 be 62 62 d3 92 e3 78 0d 6a 88 bb 1a 34 b3 35 ab .bb...x.j...4.5. 00:20:33.958 000000a0 f9 2b a6 73 fc 7f 25 ab 0d d9 f3 d1 bf 83 05 0f .+.s..%......... 00:20:33.958 000000b0 81 6f ce ef e0 f6 ab fb 7c 2b 21 dd 59 08 bd 13 .o......|+!.Y... 00:20:33.958 000000c0 31 91 4c 64 f3 2a 3a e7 c4 a2 77 15 df cf 9d 0b 1.Ld.*:...w..... 00:20:33.958 000000d0 2f f2 de 7f d0 db a2 af d7 77 bd 26 1e 6b d3 84 /........w.&.k.. 00:20:33.958 000000e0 99 fe 32 99 80 3b 5c cf a5 a2 42 9f 15 c1 28 06 ..2..;\...B...(. 00:20:33.958 000000f0 8c 96 42 bc 5a 48 76 9a 7a a3 ff 04 ff c0 70 b9 ..B.ZHv.z.....p. 00:20:33.958 00000100 af 0a 78 88 4c 8c a2 7a 9e 51 7e 2e a6 24 98 95 ..x.L..z.Q~..$.. 00:20:33.958 00000110 5f b8 67 72 39 40 a0 97 09 a0 61 c7 f5 9a a3 04 _.gr9@....a..... 00:20:33.958 00000120 6d 8e f1 c5 61 c5 d5 4f 87 8e b5 d2 8b 65 ac 1f m...a..O.....e.. 00:20:33.958 00000130 4c 0e 8e 11 97 a7 46 26 6d dd b0 e2 52 0d 98 27 L.....F&m...R..' 00:20:33.958 00000140 63 5d 86 90 a0 82 6e 3a 09 a9 88 53 8d 9d e5 44 c]....n:...S...D 00:20:33.958 00000150 f9 37 be 73 8e 26 a3 7a 76 b8 8d 48 3c 7d 68 c7 .7.s.&.zv..H<}h. 00:20:33.958 00000160 a3 b9 05 8a eb 6c 31 13 ef da 47 39 15 92 b9 e0 .....l1...G9.... 
00:20:33.958 00000170 e8 d8 3b 10 f8 c5 89 a4 0c f9 f5 4f 98 02 ba 3d ..;........O...= 00:20:33.958 [2024-09-27 15:25:26.354024] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=3, dhgroup=2, seq=3428451806, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.958 [2024-09-27 15:25:26.354124] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.958 [2024-09-27 15:25:26.371599] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.958 [2024-09-27 15:25:26.371658] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.958 [2024-09-27 15:25:26.371668] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.958 [2024-09-27 15:25:26.371701] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.958 [2024-09-27 15:25:26.528676] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.958 [2024-09-27 15:25:26.528696] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.958 [2024-09-27 15:25:26.528703] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.958 [2024-09-27 15:25:26.528750] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.958 [2024-09-27 15:25:26.528773] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.958 ctrlr pubkey: 00:20:33.958 00000000 54 56 6e 22 e4 3f df 0f b4 56 91 f0 47 38 b1 79 TVn".?...V..G8.y 00:20:33.958 00000010 c3 df c7 10 a6 e3 2e 04 15 29 5c 76 86 f9 f0 f5 .........)\v.... 00:20:33.958 00000020 cd da d4 2a 4b 0c 18 66 88 e0 5c 36 b2 48 8c f5 ...*K..f..\6.H.. 00:20:33.958 00000030 79 d7 0c 87 83 2e 00 29 72 f7 ac cc 4c 00 fb 57 y......)r...L..W 00:20:33.958 00000040 19 6c 04 be 05 d2 2e a8 e3 f8 40 9b 4d e0 57 58 .l........@.M.WX 00:20:33.958 00000050 ab 0a 1b d9 e5 d4 76 98 a1 e5 b0 09 6c 36 45 b3 ......v.....l6E. 00:20:33.958 00000060 b8 ca 9f 77 c7 45 a5 56 c6 c2 3b 78 47 fa 78 2f ...w.E.V..;xG.x/ 00:20:33.958 00000070 81 0c a5 3f 3e 0f 64 5d 89 03 dd 23 ca 78 4a 17 ...?>.d]...#.xJ. 00:20:33.958 00000080 9c 86 bb 02 37 85 b5 be c3 e9 b5 99 91 90 03 9b ....7........... 00:20:33.958 00000090 53 bb 21 d0 cd d9 8b 10 e4 48 f7 39 fd 08 15 8d S.!......H.9.... 00:20:33.958 000000a0 c2 bd 7e 1e 9d 75 6d db e0 ec da 03 30 c0 06 5c ..~..um.....0..\ 00:20:33.958 000000b0 4a de 80 5e 79 cc e2 6f 47 34 c1 c8 b7 f2 b4 62 J..^y..oG4.....b 00:20:33.958 000000c0 7a bc bc 16 43 8c fd 0c eb 94 dd 14 0a e9 d4 07 z...C........... 00:20:33.958 000000d0 02 33 c9 e3 7d 52 75 0c 5c c2 f0 e3 cf db 5d 44 .3..}Ru.\.....]D 00:20:33.958 000000e0 b8 bd 23 e3 24 67 4a c0 b2 0c 11 25 40 07 9d ab ..#.$gJ....%@... 
00:20:33.958 000000f0 4d 7a f3 91 7b bf 52 7d be 36 fc 15 9b 4d 76 30 Mz..{.R}.6...Mv0 00:20:33.958 00000100 a6 61 85 89 19 e9 9c 19 7e 01 ee e5 75 53 7c fc .a......~...uS|. 00:20:33.958 00000110 7c 03 c8 ef 69 3d 8f f8 34 8a b3 e6 98 90 c2 e2 |...i=..4....... 00:20:33.958 00000120 b9 ee a4 67 50 16 de 1f c7 be f2 36 7b ef f5 d4 ...gP......6{... 00:20:33.958 00000130 67 5f 87 de f8 8b 2a 5f 61 50 ab 34 46 a2 84 f9 g_....*_aP.4F... 00:20:33.958 00000140 6d a5 c6 25 ca 9c e9 8f c9 30 9f 3a 5d 30 44 a8 m..%.....0.:]0D. 00:20:33.958 00000150 0d b4 22 f9 9e 56 6b 3f bb 96 53 e8 ee 8a 96 a0 .."..Vk?..S..... 00:20:33.958 00000160 62 59 43 33 81 d1 1b 9b 1c 72 56 f4 d8 22 80 1d bYC3.....rV..".. 00:20:33.958 00000170 b8 6a 98 93 1c 64 1a ec 40 b3 25 80 78 50 3a aa .j...d..@.%.xP:. 00:20:33.958 host pubkey: 00:20:33.958 00000000 54 0a 0d 08 50 d8 ba 89 b6 a4 3c 63 2f 98 68 2a T...P.......... 00:20:33.958 00000020 f2 c9 b2 a5 6f 2b a7 e4 0b f5 51 64 85 8a d6 5c ....o+....Qd...\ 00:20:33.958 00000030 74 32 4d 80 8b 6b 0e 20 d4 a5 ee 5f 3c a3 ab 06 t2M..k. ..._<... 00:20:33.958 00000040 99 e3 4a a1 3b dd 9d 0f 08 f1 b3 ec fa 97 ec 5f ..J.;.........._ 00:20:33.958 00000050 3c c8 6e 9f ce 4f d2 3a c1 4b e2 c7 75 8f b9 b7 <.n..O.:.K..u... 00:20:33.958 00000060 46 74 27 d1 e0 91 2a 0b 70 25 6d bc 71 8c c9 98 Ft'...*.p%m.q... 00:20:33.958 00000070 7e 0f 3e ed 3c 49 36 c1 d6 bb 5f cb c2 10 47 51 ~.>..d]...#.xJ. 00:20:33.959 00000080 9c 86 bb 02 37 85 b5 be c3 e9 b5 99 91 90 03 9b ....7........... 00:20:33.959 00000090 53 bb 21 d0 cd d9 8b 10 e4 48 f7 39 fd 08 15 8d S.!......H.9.... 00:20:33.959 000000a0 c2 bd 7e 1e 9d 75 6d db e0 ec da 03 30 c0 06 5c ..~..um.....0..\ 00:20:33.959 000000b0 4a de 80 5e 79 cc e2 6f 47 34 c1 c8 b7 f2 b4 62 J..^y..oG4.....b 00:20:33.959 000000c0 7a bc bc 16 43 8c fd 0c eb 94 dd 14 0a e9 d4 07 z...C........... 00:20:33.959 000000d0 02 33 c9 e3 7d 52 75 0c 5c c2 f0 e3 cf db 5d 44 .3..}Ru.\.....]D 00:20:33.959 000000e0 b8 bd 23 e3 24 67 4a c0 b2 0c 11 25 40 07 9d ab ..#.$gJ....%@... 00:20:33.959 000000f0 4d 7a f3 91 7b bf 52 7d be 36 fc 15 9b 4d 76 30 Mz..{.R}.6...Mv0 00:20:33.959 00000100 a6 61 85 89 19 e9 9c 19 7e 01 ee e5 75 53 7c fc .a......~...uS|. 00:20:33.959 00000110 7c 03 c8 ef 69 3d 8f f8 34 8a b3 e6 98 90 c2 e2 |...i=..4....... 00:20:33.959 00000120 b9 ee a4 67 50 16 de 1f c7 be f2 36 7b ef f5 d4 ...gP......6{... 00:20:33.959 00000130 67 5f 87 de f8 8b 2a 5f 61 50 ab 34 46 a2 84 f9 g_....*_aP.4F... 00:20:33.959 00000140 6d a5 c6 25 ca 9c e9 8f c9 30 9f 3a 5d 30 44 a8 m..%.....0.:]0D. 00:20:33.959 00000150 0d b4 22 f9 9e 56 6b 3f bb 96 53 e8 ee 8a 96 a0 .."..Vk?..S..... 00:20:33.959 00000160 62 59 43 33 81 d1 1b 9b 1c 72 56 f4 d8 22 80 1d bYC3.....rV..".. 00:20:33.959 00000170 b8 6a 98 93 1c 64 1a ec 40 b3 25 80 78 50 3a aa .j...d..@.%.xP:. 00:20:33.959 host pubkey: 00:20:33.959 00000000 06 d9 c0 f9 8e 6b c2 ac 5e 9c 90 bc 82 fc 9d 52 .....k..^......R 00:20:33.959 00000010 87 20 b3 16 5b f2 ab 14 7c 07 97 5a 40 3c a1 11 . ..[...|..Z@<.. 00:20:33.959 00000020 01 4a a8 92 3b bf ed 18 3e 02 58 b3 9b d0 5a 0e .J..;...>.X...Z. 00:20:33.959 00000030 f5 7f 4d d0 4d d6 ce 17 d1 20 07 5f 08 29 11 1f ..M.M.... ._.).. 00:20:33.959 00000040 86 59 1a e9 28 bc 88 a3 d3 b6 83 f2 af 0d e3 85 .Y..(........... 00:20:33.959 00000050 93 17 b3 5f 99 56 a6 0c 38 a0 23 6c 5b b8 02 fe ..._.V..8.#l[... 00:20:33.959 00000060 e4 8b 31 84 c9 42 ec 00 88 e9 52 b7 4d 80 83 9a ..1..B....R.M... 
00:20:33.959 00000070 49 46 96 1c 8e 26 39 23 47 41 75 c7 e9 4f 38 5f IF...&9#GAu..O8_ 00:20:33.959 00000080 a8 2f c7 f7 b9 9d f0 6e 8d 00 4d d9 e1 55 68 45 ./.....n..M..UhE 00:20:33.959 00000090 99 1b 94 c9 05 79 3b b7 28 a8 e8 a7 96 06 fa 49 .....y;.(......I 00:20:33.959 000000a0 43 bc 9e 81 cb 4e 1a f8 69 6b 4f 9e 5e 3e 5a 41 C....N..ikO.^>ZA 00:20:33.959 000000b0 22 73 b2 c7 84 a3 1a 30 80 80 08 05 4f 14 37 29 "s.....0....O.7) 00:20:33.959 000000c0 84 4f ce 79 8b b7 2b c2 28 14 63 3b 17 63 81 e4 .O.y..+.(.c;.c.. 00:20:33.959 000000d0 54 21 08 e7 51 74 57 de f7 a3 db 5f c8 31 4f a7 T!..QtW...._.1O. 00:20:33.959 000000e0 7b af 5f 2c a1 46 3b 5b 5f e2 cf 58 69 43 63 e0 {._,.F;[_..XiCc. 00:20:33.959 000000f0 f6 3f 3e 27 b8 95 23 07 98 84 36 b0 f0 ad 99 f6 .?>'..#...6..... 00:20:33.959 00000100 0c bf 62 f3 49 aa 18 84 5e 00 0e a6 7a 57 c2 7e ..b.I...^...zW.~ 00:20:33.959 00000110 12 ff 24 1c 09 1a 08 d9 b1 f3 c0 4d bd 03 ae a3 ..$........M.... 00:20:33.959 00000120 61 c9 91 19 6c 99 cb 4f e4 94 5a 4d 1e ff 6c f2 a...l..O..ZM..l. 00:20:33.959 00000130 d1 b3 08 ae 59 e0 b0 56 44 ca 9d 86 f0 f0 f2 bc ....Y..VD....... 00:20:33.959 00000140 39 2e 18 15 56 1e 5e 90 aa 15 9e ae c1 a2 13 bd 9...V.^......... 00:20:33.959 00000150 93 33 c8 82 a4 a4 48 29 63 b6 3a ff 18 17 05 bf .3....H)c.:..... 00:20:33.959 00000160 86 d8 1b e0 41 03 eb 6c 48 de 82 0e 6d f9 dd 43 ....A..lH...m..C 00:20:33.959 00000170 63 4b 1d 8c 94 d7 72 3b ba b9 bd 90 e8 76 b8 a5 cK....r;.....v.. 00:20:33.959 dh secret: 00:20:33.959 00000000 4b 54 03 58 34 79 1b 07 08 5e f5 f1 21 5c 59 ee KT.X4y...^..!\Y. 00:20:33.959 00000010 1f 8d 86 65 fd 11 ee f1 ea 26 25 21 17 5b 65 74 ...e.....&%!.[et 00:20:33.959 00000020 80 55 8a 84 e2 2a d8 bd 40 3b 62 0a f5 5c 7a 9a .U...*..@;b..\z. 00:20:33.959 00000030 08 55 72 10 ae 5a d9 cb d3 56 3b 39 10 5e f5 5e .Ur..Z...V;9.^.^ 00:20:33.959 00000040 42 0d 88 a7 41 99 d4 c7 9d 05 91 b8 05 e0 b8 c1 B...A........... 00:20:33.959 00000050 86 fd 67 c9 6a 8f 45 2b 98 4d c4 6d 27 57 cf b4 ..g.j.E+.M.m'W.. 00:20:33.959 00000060 8d 20 fc 50 55 91 38 f1 eb df 91 a4 98 1a 87 73 . .PU.8........s 00:20:33.959 00000070 8d 59 3d f4 e2 54 c7 03 35 91 50 9d 72 d3 99 3d .Y=..T..5.P.r..= 00:20:33.959 00000080 f1 20 55 a9 05 63 49 a9 04 08 f3 f6 7a 24 c2 d1 . U..cI.....z$.. 00:20:33.959 00000090 b3 65 10 bf 37 c3 ad 72 ef 0e 8b 9a ba de 91 4a .e..7..r.......J 00:20:33.959 000000a0 83 5e c3 03 f5 87 11 9b 82 4a 95 66 a8 5a 92 22 .^.......J.f.Z." 00:20:33.959 000000b0 ef e3 84 81 0c 05 8c f7 e2 51 de cb f5 f0 6f 2d .........Q....o- 00:20:33.959 000000c0 21 8e cb 58 35 80 aa 72 6a 29 a8 8e 4d 1e 33 c3 !..X5..rj)..M.3. 00:20:33.959 000000d0 ba a8 ef c2 2e ac 2a c6 73 1a 0b ee 30 e5 15 f3 ......*.s...0... 00:20:33.959 000000e0 65 f8 8f 3b 75 09 cc 78 b7 e8 b3 1c 2d ab d8 b3 e..;u..x....-... 00:20:33.959 000000f0 f8 9e 36 e0 7d 0f 1b a1 50 bf 1a c0 90 f3 e6 d7 ..6.}...P....... 00:20:33.960 00000100 d9 5a 61 4d da c0 03 fe 7d 0e c5 d2 f8 fe 50 df .ZaM....}.....P. 00:20:33.960 00000110 75 61 fb d5 9b 6c 1b cf ad 30 77 ee fb 06 1f 50 ua...l...0w....P 00:20:33.960 00000120 9b 77 0b 5c cc 81 84 5a 61 cd f6 05 52 46 a9 74 .w.\...Za...RF.t 00:20:33.960 00000130 16 89 be a9 91 02 bc 7a a5 d6 9f 11 7b 70 10 68 .......z....{p.h 00:20:33.960 00000140 d7 aa 3a c3 53 46 a9 a7 b2 36 3e ec c7 23 fe b3 ..:.SF...6>..#.. 00:20:33.960 00000150 8f dd a6 f0 35 e9 8b 68 91 70 30 53 7f 04 5e 51 ....5..h.p0S..^Q 00:20:33.960 00000160 60 d1 e5 8a a1 ea 61 d5 2e 9a da b7 c7 d4 49 d7 `.....a.......I. 
00:20:33.960 00000170 37 31 7d b2 f2 6c 2f df fe aa a5 1e e8 b8 05 a9 71}..l/......... 00:20:33.960 [2024-09-27 15:25:26.654725] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=3, dhgroup=2, seq=3428451808, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.960 [2024-09-27 15:25:26.654825] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.960 [2024-09-27 15:25:26.672570] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.960 [2024-09-27 15:25:26.672646] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.960 [2024-09-27 15:25:26.672656] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.960 [2024-09-27 15:25:26.672692] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.960 [2024-09-27 15:25:26.828237] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.960 [2024-09-27 15:25:26.828256] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.960 [2024-09-27 15:25:26.828262] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.960 [2024-09-27 15:25:26.828304] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.960 [2024-09-27 15:25:26.828330] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.960 ctrlr pubkey: 00:20:33.960 00000000 e9 e3 6f 71 43 17 06 54 11 bf b1 21 96 bb 3e a5 ..oqC..T...!..>. 00:20:33.960 00000010 ad 57 df 00 c4 23 4c 4f 79 b7 97 35 e2 ce d3 dd .W...#LOy..5.... 00:20:33.960 00000020 b9 dc 26 c4 8d 58 ec 1c 84 b8 6f a8 30 8d 4e 56 ..&..X....o.0.NV 00:20:33.960 00000030 50 d2 df c3 da 01 0b 96 3e ba 6c 8f d6 11 d4 1a P.......>.l..... 00:20:33.960 00000040 51 c3 9d a0 b4 cb 74 0e f1 7d 9f ea dd 64 a8 01 Q.....t..}...d.. 00:20:33.960 00000050 de 55 94 7c 81 f2 ed 79 0c 4f 17 f5 05 56 da 9b .U.|...y.O...V.. 00:20:33.960 00000060 50 fc 70 01 ec bc 21 b2 1b 48 16 5f 5f 16 d0 26 P.p...!..H.__..& 00:20:33.960 00000070 31 34 cd 8f 57 d5 e2 f6 45 4e 78 bf 9a a6 26 62 14..W...ENx...&b 00:20:33.960 00000080 54 bd c1 63 f5 d7 0b da e3 ae 43 d3 36 58 40 95 T..c......C.6X@. 00:20:33.960 00000090 7f 38 c2 0d dc e9 c7 f9 f8 37 68 0f 38 3d e5 c3 .8.......7h.8=.. 00:20:33.960 000000a0 1c 91 67 3b 1b 35 b3 03 dd bb f9 16 9d 08 79 cd ..g;.5........y. 00:20:33.960 000000b0 e0 54 88 10 f4 a7 2a e3 12 56 02 cf fb 45 f0 19 .T....*..V...E.. 00:20:33.960 000000c0 c3 e7 97 b6 0b 41 c4 cb d0 4f 19 3d 25 5e 97 ce .....A...O.=%^.. 00:20:33.960 000000d0 f6 4e ab 60 78 f7 1e e7 89 66 19 87 4e 4e 34 0b .N.`x....f..NN4. 00:20:33.960 000000e0 bf be 0c 8d ee 6a 5a 0e 8f 58 f2 6e 2f ac e7 69 .....jZ..X.n/..i 00:20:33.960 000000f0 f8 97 95 97 c7 7d 4d 16 61 47 b5 ab a4 4e 25 ab .....}M.aG...N%. 
00:20:33.960 00000100 1c a9 e1 0e cc 8c f7 ca ee aa e0 12 b6 cb f2 b9 ................ 00:20:33.960 00000110 d4 14 7f 21 ab 09 51 1f e3 34 73 cc 73 87 6a 83 ...!..Q..4s.s.j. 00:20:33.960 00000120 a6 34 ce 8c 07 5f 7c 52 7b 23 ef a2 cb f8 a1 50 .4..._|R{#.....P 00:20:33.960 00000130 d7 d7 f7 d8 3b 25 9c f2 56 51 e2 5f 8e 25 06 73 ....;%..VQ._.%.s 00:20:33.960 00000140 8c d1 9e 99 f1 67 9a 1c f0 9d 42 32 51 a3 5b a0 .....g....B2Q.[. 00:20:33.960 00000150 10 db 45 4a 28 0d 43 90 dd 6e cd 4e 52 4b 34 62 ..EJ(.C..n.NRK4b 00:20:33.960 00000160 74 04 db 11 94 b8 0c 9a 38 b9 27 b2 36 63 03 c2 t.......8.'.6c.. 00:20:33.960 00000170 c8 5e 91 11 53 91 af 7a 63 92 93 96 78 52 8d 16 .^..S..zc...xR.. 00:20:33.960 host pubkey: 00:20:33.960 00000000 55 79 a4 40 a3 10 af fa 15 a9 17 62 e2 ca 9e b0 Uy.@.......b.... 00:20:33.960 00000010 14 a7 b8 5b ce 19 2d 45 ca 97 b6 f8 fb 54 6c 6c ...[..-E.....Tll 00:20:33.960 00000020 e4 72 4a 69 b2 bd 9c dd 98 cd 91 b0 43 df 3f 03 .rJi........C.?. 00:20:33.960 00000030 90 bd 31 f8 6e b9 10 1a d9 66 16 60 8b 84 82 59 ..1.n....f.`...Y 00:20:33.960 00000040 79 74 f0 5f f0 49 83 43 9d fd db 66 00 fc 88 8b yt._.I.C...f.... 00:20:33.960 00000050 2c e8 b2 e6 dc bb 76 0c c9 92 89 50 15 55 32 3b ,.....v....P.U2; 00:20:33.960 00000060 39 ec c2 ff 19 60 9c 34 5e 43 d8 b8 24 68 fb dc 9....`.4^C..$h.. 00:20:33.960 00000070 11 e9 5e 23 3f c9 12 ec fd 4e e6 76 24 de e7 d4 ..^#?....N.v$... 00:20:33.960 00000080 5a 33 c6 f6 ec 6e da 48 79 c5 3a 0f 58 60 d2 7e Z3...n.Hy.:.X`.~ 00:20:33.960 00000090 24 3e ac 97 01 b2 13 a0 fe 8d fa 67 7c db 22 e9 $>.........g|.". 00:20:33.960 000000a0 5c 6b ba dd c4 de be 3b 0f e9 d6 f1 91 b4 4a 73 \k.....;......Js 00:20:33.960 000000b0 17 2d 6d ee 15 25 75 46 01 38 d8 fa b4 81 16 85 .-m..%uF.8...... 00:20:33.960 000000c0 2d 38 de 30 55 ed a0 90 26 9c bb 40 c1 45 f1 32 -8.0U...&..@.E.2 00:20:33.960 000000d0 2a 6d a8 51 64 69 55 20 30 eb 53 a6 23 bc f8 f7 *m.QdiU 0.S.#... 00:20:33.960 000000e0 f7 f6 a1 38 01 69 94 f6 93 3b 56 e8 c1 cd 32 6f ...8.i...;V...2o 00:20:33.960 000000f0 0d 01 f4 f8 6d 8b fa 19 bd c3 42 6a 7e 32 07 c0 ....m.....Bj~2.. 00:20:33.960 00000100 66 8d ed 4a 9c a2 98 46 03 ba b8 cd 56 df 56 8b f..J...F....V.V. 00:20:33.960 00000110 89 20 40 80 82 dd 7e 67 54 5f 41 09 5b a9 5d f4 . @...~gT_A.[.]. 00:20:33.960 00000120 70 15 ca 24 20 1d 85 e1 a5 a9 95 f1 d8 92 c4 e9 p..$ ........... 00:20:33.960 00000130 67 52 76 3d 2f 8c 2d f7 43 5b 4f 0e 99 7c bd bd gRv=/.-.C[O..|.. 00:20:33.960 00000140 1d 2f 5c 2b cd f4 df 21 6f d4 bf 65 01 43 f5 c8 ./\+...!o..e.C.. 00:20:33.960 00000150 c3 d7 32 6a 6c ea a5 e7 67 b5 31 05 17 54 61 f7 ..2jl...g.1..Ta. 00:20:33.960 00000160 d9 c5 1f 3e 34 be f0 3e f4 69 47 3c c0 04 f9 73 ...>4..>.iG<...s 00:20:33.960 00000170 18 62 b2 88 32 a6 cb e1 9a 6f a7 19 c8 20 ae bc .b..2....o... .. 00:20:33.960 dh secret: 00:20:33.960 00000000 5f 2f 4c ea 87 95 4f f0 05 06 55 92 1d c8 d0 14 _/L...O...U..... 00:20:33.960 00000010 05 83 18 a6 80 99 8b 4e 47 6e da 77 ee d3 2f 0e .......NGn.w../. 00:20:33.960 00000020 3a 02 82 7c 90 89 e7 6b c4 98 55 af 9b 08 9e 2c :..|...k..U...., 00:20:33.960 00000030 86 3d 8b 3e 4a b0 12 cb 49 7a aa 2f c1 b1 a0 cc .=.>J...Iz./.... 00:20:33.960 00000040 af 1c 32 c4 89 b1 77 1c b8 b2 92 1c 2a 9c d2 6d ..2...w.....*..m 00:20:33.960 00000050 06 19 36 14 19 bc e4 6a dc 24 32 46 3a bd a6 e7 ..6....j.$2F:... 00:20:33.960 00000060 e5 3b 93 ac c4 42 09 58 e7 cd 8f 9c d6 45 e4 8c .;...B.X.....E.. 
00:20:33.960 00000070 e1 dc a8 5d 96 e9 39 47 8e d6 17 c3 15 27 e1 70 ...]..9G.....'.p 00:20:33.960 00000080 e5 1c f2 de ad 18 95 44 bd 4c 0b ae 97 42 7a f5 .......D.L...Bz. 00:20:33.960 00000090 91 1f 81 f6 6e 34 c1 09 95 8d 40 46 d7 5a 7b 72 ....n4....@F.Z{r 00:20:33.960 000000a0 2e 60 e6 92 97 6f 64 8c 96 d3 d3 1a 9e 56 07 b4 .`...od......V.. 00:20:33.960 000000b0 fc 31 50 45 0f 4a d2 04 0c ee 06 b4 62 06 2b e0 .1PE.J......b.+. 00:20:33.960 000000c0 9f 36 59 9a 4b 87 0d f4 95 32 ee 15 dd a5 cc 2a .6Y.K....2.....* 00:20:33.960 000000d0 3d 8e 3d 91 94 ed 9f 19 d5 d3 fa 0f 55 7f a3 58 =.=.........U..X 00:20:33.960 000000e0 b7 1e 85 52 0e a9 72 2f f0 eb 4d 32 f2 78 c0 3d ...R..r/..M2.x.= 00:20:33.960 000000f0 c0 1b 8f 46 b2 a2 08 7f 85 1a 26 26 de 61 ee f7 ...F......&&.a.. 00:20:33.960 00000100 17 9a 9c 69 f5 94 0b b4 dc fd 87 61 4b 79 0e c9 ...i.......aKy.. 00:20:33.960 00000110 f1 d9 c4 c5 1b da 68 a3 50 54 4a b0 9e e8 e4 42 ......h.PTJ....B 00:20:33.960 00000120 42 ff 47 4d cc d1 e0 4c 68 bd c9 f8 32 dc 1e 76 B.GM...Lh...2..v 00:20:33.960 00000130 97 f8 72 42 a6 ec 1f 6f 8f 37 95 6b b5 a8 ec 5c ..rB...o.7.k...\ 00:20:33.960 00000140 22 66 cf ef 0a f6 cf 4e 56 a3 f6 1b ed d5 61 45 "f.....NV.....aE 00:20:33.960 00000150 0a f1 04 fc 8e 2e 6d b8 cf 61 ce fd a2 b0 ea 3c ......m..a.....< 00:20:33.960 00000160 32 c6 7b c9 3e 01 b6 b1 ea 97 af b8 2d 9d 65 73 2.{.>.......-.es 00:20:33.960 00000170 4d 3b f3 e7 eb 4c ec 74 d2 e9 4a 76 1c 4a 7d 20 M;...L.t..Jv.J} 00:20:33.960 [2024-09-27 15:25:26.835641] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=3, dhgroup=2, seq=3428451809, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.960 [2024-09-27 15:25:26.840688] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.960 [2024-09-27 15:25:26.840728] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.960 [2024-09-27 15:25:26.840743] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.960 [2024-09-27 15:25:26.840764] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.960 [2024-09-27 15:25:26.840778] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.960 [2024-09-27 15:25:26.946680] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.960 [2024-09-27 15:25:26.946697] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.960 [2024-09-27 15:25:26.946704] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.960 [2024-09-27 15:25:26.946714] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.960 [2024-09-27 15:25:26.946771] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.960 ctrlr pubkey: 00:20:33.960 00000000 e9 e3 6f 71 
43 17 06 54 11 bf b1 21 96 bb 3e a5 ..oqC..T...!..>. 00:20:33.960 00000010 ad 57 df 00 c4 23 4c 4f 79 b7 97 35 e2 ce d3 dd .W...#LOy..5.... 00:20:33.960 00000020 b9 dc 26 c4 8d 58 ec 1c 84 b8 6f a8 30 8d 4e 56 ..&..X....o.0.NV 00:20:33.960 00000030 50 d2 df c3 da 01 0b 96 3e ba 6c 8f d6 11 d4 1a P.......>.l..... 00:20:33.960 00000040 51 c3 9d a0 b4 cb 74 0e f1 7d 9f ea dd 64 a8 01 Q.....t..}...d.. 00:20:33.960 00000050 de 55 94 7c 81 f2 ed 79 0c 4f 17 f5 05 56 da 9b .U.|...y.O...V.. 00:20:33.960 00000060 50 fc 70 01 ec bc 21 b2 1b 48 16 5f 5f 16 d0 26 P.p...!..H.__..& 00:20:33.960 00000070 31 34 cd 8f 57 d5 e2 f6 45 4e 78 bf 9a a6 26 62 14..W...ENx...&b 00:20:33.960 00000080 54 bd c1 63 f5 d7 0b da e3 ae 43 d3 36 58 40 95 T..c......C.6X@. 00:20:33.961 00000090 7f 38 c2 0d dc e9 c7 f9 f8 37 68 0f 38 3d e5 c3 .8.......7h.8=.. 00:20:33.961 000000a0 1c 91 67 3b 1b 35 b3 03 dd bb f9 16 9d 08 79 cd ..g;.5........y. 00:20:33.961 000000b0 e0 54 88 10 f4 a7 2a e3 12 56 02 cf fb 45 f0 19 .T....*..V...E.. 00:20:33.961 000000c0 c3 e7 97 b6 0b 41 c4 cb d0 4f 19 3d 25 5e 97 ce .....A...O.=%^.. 00:20:33.961 000000d0 f6 4e ab 60 78 f7 1e e7 89 66 19 87 4e 4e 34 0b .N.`x....f..NN4. 00:20:33.961 000000e0 bf be 0c 8d ee 6a 5a 0e 8f 58 f2 6e 2f ac e7 69 .....jZ..X.n/..i 00:20:33.961 000000f0 f8 97 95 97 c7 7d 4d 16 61 47 b5 ab a4 4e 25 ab .....}M.aG...N%. 00:20:33.961 00000100 1c a9 e1 0e cc 8c f7 ca ee aa e0 12 b6 cb f2 b9 ................ 00:20:33.961 00000110 d4 14 7f 21 ab 09 51 1f e3 34 73 cc 73 87 6a 83 ...!..Q..4s.s.j. 00:20:33.961 00000120 a6 34 ce 8c 07 5f 7c 52 7b 23 ef a2 cb f8 a1 50 .4..._|R{#.....P 00:20:33.961 00000130 d7 d7 f7 d8 3b 25 9c f2 56 51 e2 5f 8e 25 06 73 ....;%..VQ._.%.s 00:20:33.961 00000140 8c d1 9e 99 f1 67 9a 1c f0 9d 42 32 51 a3 5b a0 .....g....B2Q.[. 00:20:33.961 00000150 10 db 45 4a 28 0d 43 90 dd 6e cd 4e 52 4b 34 62 ..EJ(.C..n.NRK4b 00:20:33.961 00000160 74 04 db 11 94 b8 0c 9a 38 b9 27 b2 36 63 03 c2 t.......8.'.6c.. 00:20:33.961 00000170 c8 5e 91 11 53 91 af 7a 63 92 93 96 78 52 8d 16 .^..S..zc...xR.. 00:20:33.961 host pubkey: 00:20:33.961 00000000 2b 84 82 76 90 be 0d b9 b6 82 20 3d 21 47 a7 94 +..v...... =!G.. 00:20:33.961 00000010 8b 50 6f 77 c9 da 23 c2 c0 a8 65 e6 5a ff e8 94 .Pow..#...e.Z... 00:20:33.961 00000020 60 32 ce 2b ca a6 d5 18 7b eb d3 b3 c6 fd ec c9 `2.+....{....... 00:20:33.961 00000030 a1 c0 21 68 50 98 f3 83 96 75 9e ab d3 83 85 25 ..!hP....u.....% 00:20:33.961 00000040 7c 3c d4 ad db 61 dc e1 fa 58 9a 46 fc f6 4e 32 |<...a...X.F..N2 00:20:33.961 00000050 d7 4c 1f 0d 7b 34 5e 3a 11 c4 dd 45 3e 74 5b 0f .L..{4^:...E>t[. 00:20:33.961 00000060 d5 1f 6c 48 40 58 c5 d6 ec cd 9d 35 9a 32 38 06 ..lH@X.....5.28. 00:20:33.961 00000070 01 5b 23 4c dc d2 7f b1 11 78 7d c4 98 c4 42 db .[#L.....x}...B. 00:20:33.961 00000080 1a c4 ee b8 3d d7 12 00 85 1c 85 67 1b c6 79 77 ....=......g..yw 00:20:33.961 00000090 05 8e d4 16 a4 de f9 1d cf 55 6c 9d e0 58 9a 90 .........Ul..X.. 00:20:33.961 000000a0 72 a2 d5 bf e6 3a 17 55 90 7c 6d a2 a9 a8 4e 88 r....:.U.|m...N. 00:20:33.961 000000b0 9b f0 37 3d 30 d1 6a 53 67 64 30 03 c6 ee 9e 16 ..7=0.jSgd0..... 00:20:33.961 000000c0 67 05 95 90 7f de 5b dc bd f8 77 90 96 0e c0 c2 g.....[...w..... 00:20:33.961 000000d0 46 41 47 ad 8c 20 98 10 6e 2a c9 6a 57 8e fd 00 FAG.. ..n*.jW... 
00:20:33.961 000000e0 88 b0 1a ac 02 14 cc 2c 14 19 1e 82 99 e1 bc 7b .......,.......{ 00:20:33.961 000000f0 c5 ff 43 4b 3d 22 95 6e 36 c3 10 cd 04 ed d9 6a ..CK=".n6......j 00:20:33.961 00000100 32 e9 c4 8d 6b 27 31 d5 2d a9 73 31 6c 3a 01 f6 2...k'1.-.s1l:.. 00:20:33.961 00000110 ce 06 d1 ca 40 72 74 1b 8b fe e3 e8 be b6 9c 91 ....@rt......... 00:20:33.961 00000120 20 68 35 71 1e b9 42 27 40 8c 6d dc ef 7d 9b 98 h5q..B'@.m..}.. 00:20:33.961 00000130 3a 28 37 99 92 fa 60 a9 26 e9 3f 8a 48 f5 7e ac :(7...`.&.?.H.~. 00:20:33.961 00000140 83 f0 89 3d 22 9a 13 11 10 ae 0e be 5c 22 49 e7 ...=".......\"I. 00:20:33.961 00000150 0c e7 97 f6 3f bb d0 01 94 3c 09 50 5f 09 f1 41 ....?....<.P_..A 00:20:33.961 00000160 08 0c 55 54 34 ba 16 f9 b3 c1 92 2e fd b6 a2 12 ..UT4........... 00:20:33.961 00000170 8a 42 c5 56 68 d4 13 62 8e 58 27 87 42 ec 54 98 .B.Vh..b.X'.B.T. 00:20:33.961 dh secret: 00:20:33.961 00000000 fd ed e4 3a be 33 55 58 8d ec 57 52 51 71 98 a6 ...:.3UX..WRQq.. 00:20:33.961 00000010 30 5b 57 9f a5 2d 83 ac 6f 2f f1 26 39 ce 3f 3c 0[W..-..o/.&9.?< 00:20:33.961 00000020 72 ce 06 56 93 22 82 96 b4 35 1d c7 dc f0 04 07 r..V."...5...... 00:20:33.961 00000030 4a 35 50 87 ff 32 27 ec f4 45 8c d9 3d 8d e7 b8 J5P..2'..E..=... 00:20:33.961 00000040 eb 20 9d 23 f1 36 97 75 62 4b 8b 53 f4 9a b6 de . .#.6.ubK.S.... 00:20:33.961 00000050 f6 17 a7 03 59 66 46 1b 7b 3d d7 35 18 cc 40 e9 ....YfF.{=.5..@. 00:20:33.961 00000060 d1 d9 9b 27 88 76 c1 c0 83 07 fc c4 64 bc a0 e2 ...'.v......d... 00:20:33.961 00000070 22 6a 19 b8 41 38 a8 cb d8 4d 66 d8 d9 db 4b 92 "j..A8...Mf...K. 00:20:33.961 00000080 81 b5 60 e7 e6 9f e5 29 49 f2 5b 29 2c 59 b7 96 ..`....)I.[),Y.. 00:20:33.961 00000090 29 f6 c6 ce 17 ce 1c dc 86 50 9e 7d 9b 36 36 77 )........P.}.66w 00:20:33.961 000000a0 cd f9 4b f3 0c e0 f1 69 4e d7 5f 26 0a 76 83 3d ..K....iN._&.v.= 00:20:33.961 000000b0 e7 cf b9 02 96 04 c6 98 d5 d1 2a af 49 0a 55 87 ..........*.I.U. 00:20:33.961 000000c0 b8 55 14 b7 9c f5 f0 c3 84 50 cd 7a d1 93 f7 d2 .U.......P.z.... 00:20:33.961 000000d0 08 d5 d6 f8 e9 e8 ce 1a e2 4b 31 13 b2 ba 59 3a .........K1...Y: 00:20:33.961 000000e0 6b 6e e4 8b 10 52 15 e6 25 da dd 78 82 e9 89 f2 kn...R..%..x.... 00:20:33.961 000000f0 04 66 f9 fb 4f f8 fc 06 ad 2c 41 d3 c6 3a 24 4b .f..O....,A..:$K 00:20:33.961 00000100 c7 8a b7 7d 1b 00 1f 80 7f 71 f0 ec 88 d9 7f 94 ...}.....q...... 00:20:33.961 00000110 22 96 80 56 f8 08 84 0f d4 56 78 18 fb 13 dd f7 "..V.....Vx..... 00:20:33.961 00000120 a7 5a c3 41 50 31 bb 56 0b 8c b1 49 57 11 05 7c .Z.AP1.V...IW..| 00:20:33.961 00000130 2e 60 80 02 72 de 2c 07 53 cc 17 50 ae e1 7e 79 .`..r.,.S..P..~y 00:20:33.961 00000140 3c ad 6a 96 d7 0c 8f 8d da 41 6a 65 cf 72 82 e9 <.j......Aje.r.. 00:20:33.961 00000150 26 1e d8 ed 68 13 a2 2e 84 ad 28 54 e1 c5 fa 1f &...h.....(T.... 00:20:33.961 00000160 f4 ae bb ec 68 c5 f9 7b 45 04 9e 3c 01 3f 5b 92 ....h..{E..<.?[. 00:20:33.961 00000170 82 37 26 96 8a 97 b0 62 58 51 24 eb b0 20 bf 6d .7&....bXQ$.. 
.m 00:20:33.961 [2024-09-27 15:25:26.954015] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=3, dhgroup=2, seq=3428451810, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.961 [2024-09-27 15:25:26.954111] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.961 [2024-09-27 15:25:26.969701] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.961 [2024-09-27 15:25:26.969759] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.961 [2024-09-27 15:25:26.969769] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.961 [2024-09-27 15:25:26.969805] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.961 [2024-09-27 15:25:27.121265] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.961 [2024-09-27 15:25:27.121286] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.961 [2024-09-27 15:25:27.121293] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.961 [2024-09-27 15:25:27.121335] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.961 [2024-09-27 15:25:27.121362] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.961 ctrlr pubkey: 00:20:33.961 00000000 29 40 68 10 33 4f 0b 2e 8e 63 8f 4f 8a 1e f7 b0 )@h.3O...c.O.... 00:20:33.961 00000010 aa 2b 01 28 ef b2 3b 70 d2 d4 97 16 a1 b9 67 c1 .+.(..;p......g. 00:20:33.961 00000020 32 5a 69 1a ce 95 c1 b7 49 de 58 de 91 7f b3 bf 2Zi.....I.X..... 00:20:33.961 00000030 78 0c 4b 30 63 54 28 b9 6c 17 fb ef 98 d3 a3 41 x.K0cT(.l......A 00:20:33.961 00000040 9f 36 3b 3a c4 37 ab dd 44 d8 0e 17 b0 0c 35 f9 .6;:.7..D.....5. 00:20:33.961 00000050 72 36 58 26 19 70 fa b4 70 ce 0c b1 02 62 f2 cd r6X&.p..p....b.. 00:20:33.961 00000060 b8 91 85 08 f3 a0 68 bf fe 94 0f 96 f7 b6 25 83 ......h.......%. 00:20:33.961 00000070 14 a7 6d 60 7d 63 7d 77 cc b1 78 02 f1 ae e8 d7 ..m`}c}w..x..... 00:20:33.961 00000080 e0 9c 5b f7 7f 58 0a 3f 0c c4 b6 1d 2f 7f 48 ba ..[..X.?..../.H. 00:20:33.961 00000090 bc 66 7d 3e a3 f0 3a 21 d8 a1 fe cc 69 68 79 52 .f}>..:!....ihyR 00:20:33.961 000000a0 0a 6d 2b 2c 85 39 70 77 ab 90 69 4c 2e 7d 7e c0 .m+,.9pw..iL.}~. 00:20:33.961 000000b0 ff b5 fd 9a 2f 30 6a 26 a0 97 a7 94 9b 22 0e 53 ..../0j&.....".S 00:20:33.961 000000c0 49 5b 58 9a 43 39 37 3b ea 6e 7f 26 49 75 60 c0 I[X.C97;.n.&Iu`. 00:20:33.961 000000d0 d3 1a bd 12 55 30 8e 15 4a b7 f3 d3 5e 89 0f 75 ....U0..J...^..u 00:20:33.961 000000e0 7a f4 56 ad fc 59 7d 70 b9 4a 14 84 9f c9 86 e4 z.V..Y}p.J...... 
00:20:33.961 000000f0 c9 00 d8 42 9e 59 26 c3 64 8c 74 54 a1 01 b5 74 ...B.Y&.d.tT...t 00:20:33.961 00000100 53 d1 71 f0 71 33 09 ff 2c 1a 00 df 37 85 b1 34 S.q.q3..,...7..4 00:20:33.961 00000110 5a bb 5a 3b 70 2b 41 ef eb d8 34 d6 56 14 80 07 Z.Z;p+A...4.V... 00:20:33.961 00000120 57 7b 3a 1b 02 6f 1e 9f 8c 2a 1f a6 9f 9d 66 d0 W{:..o...*....f. 00:20:33.961 00000130 6d c5 e3 da e0 25 5a b1 71 78 40 8b 80 fb ec 13 m....%Z.qx@..... 00:20:33.961 00000140 d7 e3 0d de e3 be e6 2e c7 7a fe 86 1b d0 b7 7b .........z.....{ 00:20:33.961 00000150 ae 33 b6 ef 7a 2f 26 46 94 e4 87 e7 d1 de cd c0 .3..z/&F........ 00:20:33.961 00000160 78 25 78 65 a6 ff a9 a3 2c ac 96 91 fa f6 6c 49 x%xe....,.....lI 00:20:33.961 00000170 90 aa 41 bb 97 27 c9 5f 4b 67 05 de 39 37 55 f0 ..A..'._Kg..97U. 00:20:33.961 host pubkey: 00:20:33.961 00000000 f7 c8 14 ca d9 6c 94 27 67 55 64 41 2b a9 f7 bc .....l.'gUdA+... 00:20:33.961 00000010 8f 9a b9 24 c5 ef ab 51 61 76 4d 5e ff 87 9c fd ...$...QavM^.... 00:20:33.961 00000020 bb 84 6f 2c eb 43 c1 a3 1c 75 45 dc b7 15 8a 51 ..o,.C...uE....Q 00:20:33.961 00000030 17 fb 71 af c5 8f 33 72 29 f4 0a 84 d9 68 f3 3f ..q...3r)....h.? 00:20:33.961 00000040 4f 66 87 3a 20 b5 a2 74 4b 7b 81 0c 1a 00 75 05 Of.: ..tK{....u. 00:20:33.961 00000050 bb 05 09 34 4f f6 3a 84 13 93 d6 fd c7 2e a2 ca ...4O.:......... 00:20:33.961 00000060 d6 af e1 d6 1b b1 72 2f 05 51 9f b2 ca 1b 34 f7 ......r/.Q....4. 00:20:33.961 00000070 46 46 4b f6 21 27 f9 8b 85 6e 72 92 99 07 6a fa FFK.!'...nr...j. 00:20:33.961 00000080 3c 32 2e 8e c5 d5 13 19 49 d2 2b 7b 72 80 15 1c <2......I.+{r... 00:20:33.961 00000090 39 1d 93 5c 12 ef 13 92 07 22 3d 1e 85 85 c4 a3 9..\....."=..... 00:20:33.961 000000a0 06 81 c7 8c 64 c3 c2 04 54 21 bf ae 71 d4 ae fb ....d...T!..q... 00:20:33.961 000000b0 80 d3 0e 6f ab 27 e2 51 83 87 31 7d 18 f8 6e e2 ...o.'.Q..1}..n. 00:20:33.961 000000c0 fa 5d d4 fd d3 ec 42 ae 99 b4 e1 1b 39 9a f2 a3 .]....B.....9... 00:20:33.961 000000d0 d2 6f ab 81 e1 69 dc 06 10 ba 6a 77 c9 16 a7 86 .o...i....jw.... 00:20:33.961 000000e0 aa b8 c7 d5 a2 6c 1b 14 4a 05 70 1e d1 09 ff 99 .....l..J.p..... 00:20:33.961 000000f0 4f f6 66 c4 f3 fe 86 b8 45 35 f2 7b bd 70 c4 84 O.f.....E5.{.p.. 00:20:33.961 00000100 4f 36 aa 83 04 ae ee ef c4 36 77 e2 da 83 00 9a O6.......6w..... 00:20:33.961 00000110 d5 c4 e0 4e 6e 2b 6d 77 75 a9 86 08 f9 ac 33 b3 ...Nn+mwu.....3. 00:20:33.961 00000120 07 d6 d5 2c 35 49 38 79 d9 d8 1a df 87 9f 3a 87 ...,5I8y......:. 00:20:33.961 00000130 c3 33 13 e1 52 70 31 91 48 ca 8d 0e 10 68 38 1c .3..Rp1.H....h8. 00:20:33.961 00000140 32 33 3f 8b 91 51 0e 4d e9 41 ad 2a 55 4e c9 89 23?..Q.M.A.*UN.. 00:20:33.962 00000150 77 be a7 0b 89 56 ad 0d b3 72 eb 95 f6 7c 5f 2a w....V...r...|_* 00:20:33.962 00000160 a5 50 6c 32 88 48 6f bf 05 b7 3a 9e 4d d5 ea ae .Pl2.Ho...:.M... 00:20:33.962 00000170 79 93 16 af c3 ef be d3 82 5b 18 ad 9b 81 97 98 y........[...... 00:20:33.962 dh secret: 00:20:33.962 00000000 f5 49 40 88 04 45 1d 4f 91 a9 b9 af e6 cc fe 16 .I@..E.O........ 00:20:33.962 00000010 a7 14 17 a1 bb db 1b b6 79 ac 1c df e0 43 ea 2b ........y....C.+ 00:20:33.962 00000020 7b 98 ff 3e 31 4b 81 f7 d2 49 26 c2 26 ec 69 0c {..>1K...I&.&.i. 00:20:33.962 00000030 8f e7 57 ab f9 1a e1 2b 06 35 22 ec 73 c0 ee 02 ..W....+.5".s... 00:20:33.962 00000040 ff ed 7a 0a ed 46 28 d5 b5 11 cb e4 85 4b 79 d2 ..z..F(......Ky. 00:20:33.962 00000050 ff cc 45 8c cf 0d 9e 79 95 e6 3d 5c 10 57 57 e6 ..E....y..=\.WW. 
00:20:33.962 00000060 f2 d7 97 f2 59 94 0c 51 00 0b 59 8e 6f 97 66 41 ....Y..Q..Y.o.fA 00:20:33.962 00000070 84 e6 4f 00 76 d9 e5 3e d4 94 39 cb 89 47 a8 af ..O.v..>..9..G.. 00:20:33.962 00000080 02 7e 00 64 1b e4 dc 43 5c 78 6b bd 64 6c b9 7e .~.d...C\xk.dl.~ 00:20:33.962 00000090 e6 53 6e 30 f7 ef 91 f0 a3 53 70 1b 73 66 89 42 .Sn0.....Sp.sf.B 00:20:33.962 000000a0 e2 f7 3a 7a 8f 23 1a e5 fa 0d 94 e5 07 30 ac f2 ..:z.#.......0.. 00:20:33.962 000000b0 05 61 0c e0 9e f6 ae ff 5e a3 92 8e a1 75 1c cd .a......^....u.. 00:20:33.962 000000c0 56 7a ad 86 c6 65 80 6b b0 dc 4f 87 a7 79 c3 ca Vz...e.k..O..y.. 00:20:33.962 000000d0 00 d4 a6 35 ee 52 7a 99 c1 ad 3c 1a 14 04 94 32 ...5.Rz...<....2 00:20:33.962 000000e0 7f 71 5d 4d 8b 06 13 66 3a ca f1 a2 1d 06 0b ef .q]M...f:....... 00:20:33.962 000000f0 c3 10 e9 38 a2 c8 39 82 44 08 d6 8c ff 64 a2 50 ...8..9.D....d.P 00:20:33.962 00000100 5e 7f 23 7f 50 52 34 a6 6b 82 18 c8 d5 db 3f 30 ^.#.PR4.k.....?0 00:20:33.962 00000110 8e 38 32 aa 3c 94 0b 0b 75 89 c8 41 36 32 0a cc .82.<...u..A62.. 00:20:33.962 00000120 50 c4 06 af 08 dc 8e 6a 31 20 72 a7 58 b0 24 00 P......j1 r.X.$. 00:20:33.962 00000130 82 ed 72 8f 5a 10 4e f3 e3 b6 8d f7 0b 01 d6 8f ..r.Z.N......... 00:20:33.962 00000140 9a 27 ab 7b 3b 82 c1 72 2b d7 5f b8 6a fd 29 66 .'.{;..r+._.j.)f 00:20:33.962 00000150 65 11 6e b4 1d d4 4a 42 7e c2 9c af 50 8c 31 76 e.n...JB~...P.1v 00:20:33.962 00000160 79 c7 e1 88 90 10 4f 75 68 87 ac 32 fc 3d b9 14 y.....Ouh..2.=.. 00:20:33.962 00000170 c0 22 88 d7 84 dd eb 21 f7 22 58 00 fe 6f 41 0d .".....!."X..oA. 00:20:33.962 [2024-09-27 15:25:27.128318] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=3, dhgroup=2, seq=3428451811, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.962 [2024-09-27 15:25:27.133570] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.962 [2024-09-27 15:25:27.133606] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.962 [2024-09-27 15:25:27.133623] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.962 [2024-09-27 15:25:27.133648] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.962 [2024-09-27 15:25:27.133658] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.962 [2024-09-27 15:25:27.240717] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.962 [2024-09-27 15:25:27.240735] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.962 [2024-09-27 15:25:27.240743] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.962 [2024-09-27 15:25:27.240752] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.962 [2024-09-27 15:25:27.240809] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] 
auth state: await-challenge 00:20:33.962 ctrlr pubkey: 00:20:33.962 00000000 29 40 68 10 33 4f 0b 2e 8e 63 8f 4f 8a 1e f7 b0 )@h.3O...c.O.... 00:20:33.962 00000010 aa 2b 01 28 ef b2 3b 70 d2 d4 97 16 a1 b9 67 c1 .+.(..;p......g. 00:20:33.962 00000020 32 5a 69 1a ce 95 c1 b7 49 de 58 de 91 7f b3 bf 2Zi.....I.X..... 00:20:33.962 00000030 78 0c 4b 30 63 54 28 b9 6c 17 fb ef 98 d3 a3 41 x.K0cT(.l......A 00:20:33.962 00000040 9f 36 3b 3a c4 37 ab dd 44 d8 0e 17 b0 0c 35 f9 .6;:.7..D.....5. 00:20:33.962 00000050 72 36 58 26 19 70 fa b4 70 ce 0c b1 02 62 f2 cd r6X&.p..p....b.. 00:20:33.962 00000060 b8 91 85 08 f3 a0 68 bf fe 94 0f 96 f7 b6 25 83 ......h.......%. 00:20:33.962 00000070 14 a7 6d 60 7d 63 7d 77 cc b1 78 02 f1 ae e8 d7 ..m`}c}w..x..... 00:20:33.962 00000080 e0 9c 5b f7 7f 58 0a 3f 0c c4 b6 1d 2f 7f 48 ba ..[..X.?..../.H. 00:20:33.962 00000090 bc 66 7d 3e a3 f0 3a 21 d8 a1 fe cc 69 68 79 52 .f}>..:!....ihyR 00:20:33.962 000000a0 0a 6d 2b 2c 85 39 70 77 ab 90 69 4c 2e 7d 7e c0 .m+,.9pw..iL.}~. 00:20:33.962 000000b0 ff b5 fd 9a 2f 30 6a 26 a0 97 a7 94 9b 22 0e 53 ..../0j&.....".S 00:20:33.962 000000c0 49 5b 58 9a 43 39 37 3b ea 6e 7f 26 49 75 60 c0 I[X.C97;.n.&Iu`. 00:20:33.962 000000d0 d3 1a bd 12 55 30 8e 15 4a b7 f3 d3 5e 89 0f 75 ....U0..J...^..u 00:20:33.962 000000e0 7a f4 56 ad fc 59 7d 70 b9 4a 14 84 9f c9 86 e4 z.V..Y}p.J...... 00:20:33.962 000000f0 c9 00 d8 42 9e 59 26 c3 64 8c 74 54 a1 01 b5 74 ...B.Y&.d.tT...t 00:20:33.962 00000100 53 d1 71 f0 71 33 09 ff 2c 1a 00 df 37 85 b1 34 S.q.q3..,...7..4 00:20:33.962 00000110 5a bb 5a 3b 70 2b 41 ef eb d8 34 d6 56 14 80 07 Z.Z;p+A...4.V... 00:20:33.962 00000120 57 7b 3a 1b 02 6f 1e 9f 8c 2a 1f a6 9f 9d 66 d0 W{:..o...*....f. 00:20:33.962 00000130 6d c5 e3 da e0 25 5a b1 71 78 40 8b 80 fb ec 13 m....%Z.qx@..... 00:20:33.962 00000140 d7 e3 0d de e3 be e6 2e c7 7a fe 86 1b d0 b7 7b .........z.....{ 00:20:33.962 00000150 ae 33 b6 ef 7a 2f 26 46 94 e4 87 e7 d1 de cd c0 .3..z/&F........ 00:20:33.962 00000160 78 25 78 65 a6 ff a9 a3 2c ac 96 91 fa f6 6c 49 x%xe....,.....lI 00:20:33.962 00000170 90 aa 41 bb 97 27 c9 5f 4b 67 05 de 39 37 55 f0 ..A..'._Kg..97U. 00:20:33.962 host pubkey: 00:20:33.962 00000000 80 e1 16 2b 7c 0d d9 71 03 71 14 b5 12 9f 7f 4d ...+|..q.q.....M 00:20:33.962 00000010 8c fd 15 99 ca 11 0d d9 a7 2a 00 d3 8f fc 45 33 .........*....E3 00:20:33.962 00000020 fe f1 b9 a7 1d 36 59 23 a6 ba 23 87 0a bd 9d 0c .....6Y#..#..... 00:20:33.962 00000030 13 96 21 e2 fa 2b 4d 6f 88 04 cb bc 98 f4 cb 97 ..!..+Mo........ 00:20:33.962 00000040 5c 83 04 2b 1a 8e b5 22 b8 8c 6f 2d 21 53 e9 5b \..+..."..o-!S.[ 00:20:33.962 00000050 05 ed 4f 2c 82 36 90 af fc 0b 9e 80 86 a1 6b 45 ..O,.6........kE 00:20:33.962 00000060 24 5f 4d d3 23 0a 81 c0 9d 83 ea 67 19 fa 0e c3 $_M.#......g.... 00:20:33.962 00000070 42 ad 02 3d af 58 48 4a eb 3c 5b 62 03 94 d5 17 B..=.XHJ.<[b.... 00:20:33.962 00000080 fb da c6 f2 fc 64 7a 5a 80 4c 21 33 01 8f bc 1b .....dzZ.L!3.... 00:20:33.962 00000090 be ec 5e b9 dd e0 1d 84 d8 3f 9a 59 29 56 7a 83 ..^......?.Y)Vz. 00:20:33.962 000000a0 ef d8 34 c6 3d 24 3d 84 54 ff fb 50 63 bb 57 50 ..4.=$=.T..Pc.WP 00:20:33.962 000000b0 a9 8d 0e 41 63 c1 91 cb 09 d0 11 e0 d6 3d 57 3a ...Ac........=W: 00:20:33.962 000000c0 d0 a8 fd eb 2d 6a 1e b7 26 6a 16 04 b0 e6 89 17 ....-j..&j...... 
00:20:33.962 000000d0 0c e8 ea 04 ca f7 60 47 7d 6f f0 06 ca 5b 7d 67 ......`G}o...[}g 00:20:33.962 000000e0 ab 24 e2 95 5c ba 45 e7 c7 5d 6b 45 6d 71 23 43 .$..\.E..]kEmq#C 00:20:33.962 000000f0 a2 eb f9 8b 63 9d 9f 26 ce 5e 16 7c fd 1e 0e 72 ....c..&.^.|...r 00:20:33.962 00000100 a0 cc 91 27 9a b7 50 e2 fa 22 9d 68 53 0e b6 64 ...'..P..".hS..d 00:20:33.962 00000110 47 35 6e 1b 07 91 00 41 b8 a5 7f b5 15 f7 89 08 G5n....A........ 00:20:33.962 00000120 1a 20 af e1 32 58 e6 d0 71 49 ce e9 c6 94 f6 92 . ..2X..qI...... 00:20:33.962 00000130 34 da 08 7b ac 1d 01 48 82 30 7d df b9 8c 73 57 4..{...H.0}...sW 00:20:33.962 00000140 86 b9 44 52 39 2f 34 24 83 a6 00 6e a3 15 3c cd ..DR9/4$...n..<. 00:20:33.962 00000150 bb 0b f0 83 04 b4 16 3b 7e d9 99 a2 49 fb 33 6e .......;~...I.3n 00:20:33.962 00000160 57 46 63 f2 ca ed 32 62 bb e1 53 63 2e 49 98 77 WFc...2b..Sc.I.w 00:20:33.962 00000170 41 49 d3 9a 5f dc 97 f8 d4 5a 44 ab 08 ca 4b 08 AI.._....ZD...K. 00:20:33.962 dh secret: 00:20:33.962 00000000 21 86 5c 7a 3d 6d 80 5d 88 ae 0a 78 4b e3 33 ed !.\z=m.]...xK.3. 00:20:33.962 00000010 5f 0d f3 a3 c4 22 79 8c f8 b1 fc bf 4f 0e 10 25 _...."y.....O..% 00:20:33.962 00000020 b5 ba cf 0e 17 68 52 66 52 3f 55 97 74 fe d1 53 .....hRfR?U.t..S 00:20:33.962 00000030 45 2d eb fb 4e 0a 2c 30 3a e3 79 06 ae 22 06 14 E-..N.,0:.y..".. 00:20:33.962 00000040 79 43 30 ce 3c de 01 46 fe 58 8f d7 34 13 57 df yC0.<..F.X..4.W. 00:20:33.962 00000050 88 5d 43 4c 56 ab 2b 2a dc f5 af 0c 01 f8 bf 60 .]CLV.+*.......` 00:20:33.962 00000060 e0 94 a2 6d bb ba cb ba 59 c0 02 b5 9a 99 54 50 ...m....Y.....TP 00:20:33.962 00000070 78 8b db 9e a7 48 f3 72 65 d0 41 f4 bd 17 7a 60 x....H.re.A...z` 00:20:33.962 00000080 20 9c 5b fc 18 22 84 5b 08 2f 3a 85 2d f3 37 48 .[..".[./:.-.7H 00:20:33.962 00000090 5d 1a e9 b1 5a 86 1d e6 6b dc 42 4e 34 da 9e 54 ]...Z...k.BN4..T 00:20:33.962 000000a0 64 a6 d4 fc d7 33 f2 3b 17 ea 35 68 12 16 a6 c0 d....3.;..5h.... 00:20:33.962 000000b0 2a 65 42 cd 9c a3 a3 55 fe 2a 87 1e 3a b6 85 98 *eB....U.*..:... 00:20:33.962 000000c0 bf 58 e9 9d a1 08 c2 b1 70 11 90 50 41 27 c2 8c .X......p..PA'.. 00:20:33.962 000000d0 c3 87 3d 0f 99 f8 45 91 af 55 b5 6c 1d 7b f8 2f ..=...E..U.l.{./ 00:20:33.962 000000e0 d0 47 57 41 8d 8e b2 30 08 c8 fe 65 31 8e 8f 75 .GWA...0...e1..u 00:20:33.962 000000f0 33 84 9e 07 95 08 2a 38 b3 96 50 a2 e0 a8 81 98 3.....*8..P..... 00:20:33.962 00000100 e2 ba 8a 11 35 71 c4 f7 87 ac 4b fc 0c 37 c2 a6 ....5q....K..7.. 00:20:33.962 00000110 a0 0b 4f 4b f8 ac 81 10 b7 6b fe 5e 57 69 0f 43 ..OK.....k.^Wi.C 00:20:33.962 00000120 93 c1 e3 a5 47 97 86 b3 86 01 a8 51 35 0a 7c ae ....G......Q5.|. 00:20:33.962 00000130 aa 75 42 58 c1 da 8a 05 a0 9c b1 f5 aa d9 45 8b .uBX..........E. 00:20:33.962 00000140 b3 25 a5 91 49 49 a6 3d 97 21 49 df f7 17 ed a2 .%..II.=.!I..... 00:20:33.962 00000150 76 67 42 d9 ec eb 6d 11 1d dc 76 39 30 36 97 e1 vgB...m...v906.. 00:20:33.962 00000160 24 9b e7 4c 3e 91 03 4d 3a 2a 0c 01 75 3f ac 37 $..L>..M:*..u?.7 00:20:33.962 00000170 48 fb 59 ba 15 3e b8 dc cf 41 1c f4 33 5e 56 bb H.Y..>...A..3^V. 
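Each authentication round in this trace is bracketed by the same nvme_auth_set_state sequence per qpair: negotiate, await-negotiate, await-challenge, await-reply, await-success1, then either directly to done or through await-success2 to done for the rounds where a second success message is exchanged (apparently the rounds that also authenticate the controller direction). Below is a minimal sketch that scrapes those transitions per qpair and checks each round follows that order; it assumes only the log format visible here, the file name is illustrative, and the ORDER list is read off this trace rather than taken from the SPDK sources or the NVMe specification.

# Hypothetical log scraper for the "auth state:" debug lines in this trace.
import re
from collections import defaultdict

STATE_RE = re.compile(r"\[(?P<pair>[^\]]+):(?P<qid>\d+)\] auth state: (?P<state>[\w-]+)")

# Order observed in this log; "await-success2" is optional, since some rounds
# go straight from await-success1 to done.
ORDER = ["negotiate", "await-negotiate", "await-challenge",
         "await-reply", "await-success1", "await-success2", "done"]

def transitions(lines):
    """Collect the observed auth states, grouped by qpair id."""
    seen = defaultdict(list)
    for line in lines:
        m = STATE_RE.search(line)
        if m:
            seen[m.group("qid")].append(m.group("state"))
    return seen

def rounds(states):
    """Split one qpair's state list into handshake rounds at each 'negotiate'."""
    out, cur = [], []
    for s in states:
        if s == "negotiate" and cur:
            out.append(cur)
            cur = []
        cur.append(s)
    if cur:
        out.append(cur)
    return out

def ordered(round_states):
    """True if the states of one round appear in ORDER (skipped states allowed)."""
    idx = [ORDER.index(s) for s in round_states if s in ORDER]
    return all(a < b for a, b in zip(idx, idx[1:]))

if __name__ == "__main__":
    with open("nvmf-phy-autotest.log") as f:        # path is illustrative
        per_qpair = transitions(f)
    for qid, states in sorted(per_qpair.items()):
        rs = rounds(states)
        print(f"qpair {qid}: {len(rs)} rounds, ordered={all(ordered(r) for r in rs)}")

On this trace, both qpairs (the ":0]" and ":1]" suffixes in the bracketed identifiers) should report every round in order, consistent with the repeated "authentication completed successfully" messages.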
00:20:33.962 [2024-09-27 15:25:27.248333] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=3, dhgroup=2, seq=3428451812, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.962 [2024-09-27 15:25:27.248456] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.962 [2024-09-27 15:25:27.264859] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.962 [2024-09-27 15:25:27.264923] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.962 [2024-09-27 15:25:27.264934] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.963 [2024-09-27 15:25:27.264973] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.963 [2024-09-27 15:25:27.423034] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.963 [2024-09-27 15:25:27.423060] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.963 [2024-09-27 15:25:27.423068] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 2 (ffdhe3072) 00:20:33.963 [2024-09-27 15:25:27.423115] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.963 [2024-09-27 15:25:27.423139] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.963 ctrlr pubkey: 00:20:33.963 00000000 8f 86 98 34 de af 9f 41 bf de ad a6 77 6c cd 55 ...4...A....wl.U 00:20:33.963 00000010 40 ff 14 a7 d1 0a 3f 56 40 c1 b3 41 51 14 95 cd @.....?V@..AQ... 00:20:33.963 00000020 fd 02 c9 49 11 b0 52 7f ce 25 f5 02 75 76 f9 49 ...I..R..%..uv.I 00:20:33.963 00000030 3b 1d 05 63 4c 2c 30 04 d0 6f 3b 34 9a 83 4b c5 ;..cL,0..o;4..K. 00:20:33.963 00000040 3e 4d 11 1a 9a be d8 24 6a 5a 37 98 38 d3 3d b6 >M.....$jZ7.8.=. 00:20:33.963 00000050 6e 6e eb 64 b2 77 52 37 a5 75 8f af 01 cf 59 b9 nn.d.wR7.u....Y. 00:20:33.963 00000060 d9 7b cf cc 06 93 b8 ad 2c 88 a7 9e 84 7d aa f4 .{......,....}.. 00:20:33.963 00000070 5b 63 33 82 fb f8 05 2c 88 52 57 49 e2 b0 cc 2f [c3....,.RWI.../ 00:20:33.963 00000080 63 75 a9 40 f6 22 ba 6f 58 28 98 34 bf aa a0 d1 cu.@.".oX(.4.... 00:20:33.963 00000090 cc 70 5b 33 bb 8d 1c 8d 6b 63 98 48 96 51 b5 14 .p[3....kc.H.Q.. 00:20:33.963 000000a0 ec a5 33 4d 01 94 68 36 52 a1 5c c2 9b 23 4f fb ..3M..h6R.\..#O. 00:20:33.963 000000b0 c6 3d 5a 76 84 d4 4c d1 60 8e be ea 71 9d b1 31 .=Zv..L.`...q..1 00:20:33.963 000000c0 71 80 f9 b9 b2 f8 8b 39 cd bb 96 cc 2a af c7 0a q......9....*... 00:20:33.963 000000d0 5d a9 0e 55 05 29 b4 8d 78 cd 72 b6 10 4c 4e af ]..U.)..x.r..LN. 00:20:33.963 000000e0 e7 78 4a 60 f7 bf 0c 6d 9c 5f bb a2 66 71 fc 83 .xJ`...m._..fq.. 00:20:33.963 000000f0 33 5a 93 f6 fd 1f e3 ad 26 64 a4 d6 d7 cb f2 9f 3Z......&d...... 
00:20:33.963 00000100 be 31 fc 78 e3 a9 a9 a3 b5 a0 bf a2 30 20 88 71 .1.x........0 .q 00:20:33.963 00000110 79 27 0c 2c 1f 2a 05 c7 d8 2d aa 55 02 99 70 ff y'.,.*...-.U..p. 00:20:33.963 00000120 c9 25 9a 1d 92 4c 24 e0 7e cc b3 33 71 de 21 53 .%...L$.~..3q.!S 00:20:33.963 00000130 27 0f f8 a2 e4 c7 5e 03 8a f0 38 65 f4 f6 94 65 '.....^...8e...e 00:20:33.963 00000140 72 c5 4f 5f 3f 1c 97 5a dd b0 be 3f c8 72 73 f8 r.O_?..Z...?.rs. 00:20:33.963 00000150 43 48 b9 95 c0 7f 83 78 b9 fe fa 2a 71 02 48 fc CH.....x...*q.H. 00:20:33.963 00000160 42 17 3d 94 38 ad 90 ea 77 28 00 11 8e 94 f2 f6 B.=.8...w(...... 00:20:33.963 00000170 b1 b6 12 6e b8 91 cc 70 ff bf f0 34 5a ee 5f cb ...n...p...4Z._. 00:20:33.963 host pubkey: 00:20:33.963 00000000 f5 a4 21 64 51 92 ea ab dc 78 6b 70 4a a3 1d 55 ..!dQ....xkpJ..U 00:20:33.963 00000010 e7 80 de ef 15 05 e3 14 c6 cc 2b 12 ad ae 4a 83 ..........+...J. 00:20:33.963 00000020 4b 70 88 5e 00 3e 7e 19 b9 97 f8 ee de e3 5b a1 Kp.^.>~.......[. 00:20:33.963 00000030 2e 54 9b d1 13 c8 8c d7 6b 3a e3 21 0b 68 94 5e .T......k:.!.h.^ 00:20:33.963 00000040 e3 ce 13 77 77 eb a3 43 b1 19 29 72 c7 5a bc af ...ww..C..)r.Z.. 00:20:33.963 00000050 5e b5 e0 b9 97 e7 f7 06 6a 81 b5 cd 84 ee 46 d6 ^.......j.....F. 00:20:33.963 00000060 11 dc aa 9a 94 85 8a 48 6e 27 51 6f 16 32 bb 54 .......Hn'Qo.2.T 00:20:33.963 00000070 87 e7 b8 43 d1 6f da 98 49 b9 bc df 04 e1 91 c8 ...C.o..I....... 00:20:33.963 00000080 be 2e 8c c9 16 36 23 d5 08 22 3f 38 60 aa 23 d5 .....6#.."?8`.#. 00:20:33.963 00000090 35 a0 b8 1f 06 40 90 c2 b8 ec 5b 13 66 91 0d 7d 5....@....[.f..} 00:20:33.963 000000a0 8d 63 5e 35 e6 e0 21 1e ac 50 d9 8e a9 72 00 53 .c^5..!..P...r.S 00:20:33.963 000000b0 d5 5c 81 69 39 86 18 d7 e3 47 74 65 a3 dd 7c eb .\.i9....Gte..|. 00:20:33.963 000000c0 97 25 ed aa 62 d9 80 61 1f 6d 80 50 57 3f 03 44 .%..b..a.m.PW?.D 00:20:33.963 000000d0 14 b5 63 80 ae 29 41 57 54 db f8 66 1a a5 5a ab ..c..)AWT..f..Z. 00:20:33.963 000000e0 e4 e9 bb b6 2e 61 0a c6 09 bf 39 e0 bb 11 e4 2b .....a....9....+ 00:20:33.963 000000f0 1a 0f c4 15 f1 27 ce a2 00 3c 5c a9 fd 7d 1c b0 .....'...<\..}.. 00:20:33.963 00000100 d2 95 bb 48 48 66 59 3d 68 4b 2f b9 47 a3 11 26 ...HHfY=hK/.G..& 00:20:33.963 00000110 e4 69 87 37 0b 2e 89 e6 62 bc 74 e5 ad 34 91 b7 .i.7....b.t..4.. 00:20:33.963 00000120 76 cd d2 63 e2 a0 6b 8e 20 29 bb 5e f3 87 35 6b v..c..k. ).^..5k 00:20:33.963 00000130 8b f5 e6 87 0a b4 22 5d cb 5d 7e 8f 1c 04 cf c0 ......"].]~..... 00:20:33.963 00000140 2c a3 a0 3e 1f 29 ce e8 f0 83 e1 50 d5 0f 4b 0b ,..>.).....P..K. 00:20:33.963 00000150 61 af b5 92 30 77 8e 54 44 cb 5f 56 9d ec b6 69 a...0w.TD._V...i 00:20:33.963 00000160 30 e8 a5 8a 9d ec 0b 8c 99 66 00 14 7c d6 fd 5a 0........f..|..Z 00:20:33.963 00000170 92 5b d5 05 3f 6f 9c 24 92 2e 8a 73 93 3f 96 f9 .[..?o.$...s.?.. 00:20:33.963 dh secret: 00:20:33.963 00000000 5c c3 a2 c1 73 1b 69 a9 02 3b fd 3b 8e e7 b2 59 \...s.i..;.;...Y 00:20:33.963 00000010 c1 94 14 38 8c de 78 86 48 53 a5 4c 42 2f 33 58 ...8..x.HS.LB/3X 00:20:33.963 00000020 a9 e7 d1 ea b7 9d 08 97 50 c8 c6 b2 df 25 d2 69 ........P....%.i 00:20:33.963 00000030 30 44 c6 55 a1 f4 85 d7 85 8b c6 7e 93 31 ea 88 0D.U.......~.1.. 00:20:33.963 00000040 91 56 7d b7 71 aa 8e 2f 9f c6 33 91 5e da a1 ce .V}.q../..3.^... 00:20:33.963 00000050 95 8f e1 96 88 91 1a 87 65 ff 42 a4 e0 31 20 51 ........e.B..1 Q 00:20:33.963 00000060 81 de fd f2 f3 45 f6 cc e8 66 70 9b a9 0f 4f 4d .....E...fp...OM 00:20:33.963 00000070 5f 76 ae cd 10 be 5c 4f b2 de 9b d5 cc be 25 a1 _v....\O......%. 
00:20:33.963 00000080 62 d8 63 23 b8 18 05 51 80 a6 e6 21 ba ce 85 8f b.c#...Q...!.... 00:20:33.963 00000090 cc 2e f6 81 32 53 03 e4 4e 00 94 01 19 6c e3 df ....2S..N....l.. 00:20:33.963 000000a0 d4 b4 b7 37 65 5d 5e 9a f4 a6 f1 20 9f 8c 34 e9 ...7e]^.... ..4. 00:20:33.963 000000b0 c5 ca 9f 57 7b 9e 22 fb 42 ae 34 28 ed 68 26 01 ...W{.".B.4(.h&. 00:20:33.963 000000c0 55 b8 a5 1a f5 4a 08 32 7b 3b fc 90 56 cd f5 5f U....J.2{;..V.._ 00:20:33.963 000000d0 50 7a 7d ac d4 52 a1 58 07 d9 21 85 33 9a 0b 48 Pz}..R.X..!.3..H 00:20:33.963 000000e0 1d 04 12 91 39 42 f2 ba af 14 a2 d9 b9 ab e5 eb ....9B.......... 00:20:33.963 000000f0 0a 3b 38 d4 ee ad 22 92 34 59 f3 91 b7 fe 05 a0 .;8...".4Y...... 00:20:33.963 00000100 fd 65 64 a3 dc 38 57 da fb 0a 62 47 2e b9 12 1f .ed..8W...bG.... 00:20:33.963 00000110 a2 0c c6 a7 5b 34 a8 59 f8 da a5 3e 0a 0c 2f 15 ....[4.Y...>../. 00:20:33.963 00000120 33 cb d2 94 38 12 eb d7 87 41 9d 80 96 3a ec 40 3...8....A...:.@ 00:20:33.963 00000130 9a 8d 0c 32 e5 e3 b4 b1 48 b8 f1 97 c5 72 af fd ...2....H....r.. 00:20:33.963 00000140 8d 1d 47 70 a6 bc e4 ab d8 9d 18 b6 ae b7 fb 49 ..Gp...........I 00:20:33.963 00000150 9c fb c9 44 be ed 89 8b ed 53 c2 f4 71 0f 23 fa ...D.....S..q.#. 00:20:33.963 00000160 12 cc e8 b7 0d d9 a6 f1 0b 1a 24 d1 50 84 0b 7f ..........$.P... 00:20:33.963 00000170 b7 6d e6 ab 9c 1b 9b f3 c1 5b cb f9 1f fe 24 99 .m.......[....$. 00:20:33.963 [2024-09-27 15:25:27.430714] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=3, dhgroup=2, seq=3428451813, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.963 [2024-09-27 15:25:27.436311] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.963 [2024-09-27 15:25:27.436339] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.963 [2024-09-27 15:25:27.436362] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.963 [2024-09-27 15:25:27.436369] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.963 [2024-09-27 15:25:27.542652] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.963 [2024-09-27 15:25:27.542675] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.963 [2024-09-27 15:25:27.542688] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 2 (ffdhe3072) 00:20:33.963 [2024-09-27 15:25:27.542698] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.963 [2024-09-27 15:25:27.542755] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.963 ctrlr pubkey: 00:20:33.963 00000000 8f 86 98 34 de af 9f 41 bf de ad a6 77 6c cd 55 ...4...A....wl.U 00:20:33.963 00000010 40 ff 14 a7 d1 0a 3f 56 40 c1 b3 41 51 14 95 cd @.....?V@..AQ... 
00:20:33.963 00000020 fd 02 c9 49 11 b0 52 7f ce 25 f5 02 75 76 f9 49 ...I..R..%..uv.I 00:20:33.963 00000030 3b 1d 05 63 4c 2c 30 04 d0 6f 3b 34 9a 83 4b c5 ;..cL,0..o;4..K. 00:20:33.963 00000040 3e 4d 11 1a 9a be d8 24 6a 5a 37 98 38 d3 3d b6 >M.....$jZ7.8.=. 00:20:33.963 00000050 6e 6e eb 64 b2 77 52 37 a5 75 8f af 01 cf 59 b9 nn.d.wR7.u....Y. 00:20:33.963 00000060 d9 7b cf cc 06 93 b8 ad 2c 88 a7 9e 84 7d aa f4 .{......,....}.. 00:20:33.963 00000070 5b 63 33 82 fb f8 05 2c 88 52 57 49 e2 b0 cc 2f [c3....,.RWI.../ 00:20:33.963 00000080 63 75 a9 40 f6 22 ba 6f 58 28 98 34 bf aa a0 d1 cu.@.".oX(.4.... 00:20:33.963 00000090 cc 70 5b 33 bb 8d 1c 8d 6b 63 98 48 96 51 b5 14 .p[3....kc.H.Q.. 00:20:33.963 000000a0 ec a5 33 4d 01 94 68 36 52 a1 5c c2 9b 23 4f fb ..3M..h6R.\..#O. 00:20:33.963 000000b0 c6 3d 5a 76 84 d4 4c d1 60 8e be ea 71 9d b1 31 .=Zv..L.`...q..1 00:20:33.963 000000c0 71 80 f9 b9 b2 f8 8b 39 cd bb 96 cc 2a af c7 0a q......9....*... 00:20:33.963 000000d0 5d a9 0e 55 05 29 b4 8d 78 cd 72 b6 10 4c 4e af ]..U.)..x.r..LN. 00:20:33.963 000000e0 e7 78 4a 60 f7 bf 0c 6d 9c 5f bb a2 66 71 fc 83 .xJ`...m._..fq.. 00:20:33.963 000000f0 33 5a 93 f6 fd 1f e3 ad 26 64 a4 d6 d7 cb f2 9f 3Z......&d...... 00:20:33.963 00000100 be 31 fc 78 e3 a9 a9 a3 b5 a0 bf a2 30 20 88 71 .1.x........0 .q 00:20:33.963 00000110 79 27 0c 2c 1f 2a 05 c7 d8 2d aa 55 02 99 70 ff y'.,.*...-.U..p. 00:20:33.963 00000120 c9 25 9a 1d 92 4c 24 e0 7e cc b3 33 71 de 21 53 .%...L$.~..3q.!S 00:20:33.963 00000130 27 0f f8 a2 e4 c7 5e 03 8a f0 38 65 f4 f6 94 65 '.....^...8e...e 00:20:33.963 00000140 72 c5 4f 5f 3f 1c 97 5a dd b0 be 3f c8 72 73 f8 r.O_?..Z...?.rs. 00:20:33.963 00000150 43 48 b9 95 c0 7f 83 78 b9 fe fa 2a 71 02 48 fc CH.....x...*q.H. 00:20:33.963 00000160 42 17 3d 94 38 ad 90 ea 77 28 00 11 8e 94 f2 f6 B.=.8...w(...... 00:20:33.963 00000170 b1 b6 12 6e b8 91 cc 70 ff bf f0 34 5a ee 5f cb ...n...p...4Z._. 00:20:33.963 host pubkey: 00:20:33.963 00000000 e0 2e 58 09 5f ab 8a d1 1b c2 0e 14 02 df b8 35 ..X._..........5 00:20:33.963 00000010 a4 00 50 0f f2 7e 4b 96 fe 01 19 29 32 e4 8c 8b ..P..~K....)2... 00:20:33.963 00000020 20 ea 51 a8 a9 f2 e1 4b ad cd 86 5a ab 82 b5 31 .Q....K...Z...1 00:20:33.963 00000030 ce 4e bd 58 cb 34 25 5a 67 1d 16 6d fb 6c 0a e4 .N.X.4%Zg..m.l.. 00:20:33.964 00000040 09 11 81 b2 88 1b 1e e5 88 0c b8 75 09 94 2c 70 ...........u..,p 00:20:33.964 00000050 6d 1d a5 a4 52 f8 3a ad f2 ad 2b 2b d3 e4 ea ca m...R.:...++.... 00:20:33.964 00000060 af d5 f9 8d a2 1f 82 88 1a cf 72 89 be 2a ee eb ..........r..*.. 00:20:33.964 00000070 a7 32 72 34 77 de 2c 03 1b fa 60 c4 a8 28 19 f5 .2r4w.,...`..(.. 00:20:33.964 00000080 a6 b4 52 6a 34 f8 60 99 05 67 12 af 53 e8 c2 a5 ..Rj4.`..g..S... 00:20:33.964 00000090 ad 3a f9 28 97 e4 24 2b ea c4 eb 44 69 3b f6 6d .:.(..$+...Di;.m 00:20:33.964 000000a0 4b da ae db 2f 95 80 8f 66 24 5b 5e 57 41 b0 da K.../...f$[^WA.. 00:20:33.964 000000b0 ee d0 f7 78 46 6a 31 eb eb 54 9f 0a 58 a9 a8 74 ...xFj1..T..X..t 00:20:33.964 000000c0 f0 32 27 8e c0 fc ea 75 e0 bd e7 35 06 c3 72 03 .2'....u...5..r. 00:20:33.964 000000d0 9c bc 67 db 98 3e 7d fe 39 ed 37 ec 01 96 f6 8d ..g..>}.9.7..... 00:20:33.964 000000e0 9d a3 8a 55 90 03 8e 2c e5 1a fa 15 39 67 27 1e ...U...,....9g'. 00:20:33.964 000000f0 23 e2 81 a8 fb 40 61 41 6b 03 56 92 6c a2 47 14 #....@aAk.V.l.G. 00:20:33.964 00000100 cb da da 0d 84 0d dd a2 ae 36 6d 23 47 4f 39 e3 .........6m#GO9. 00:20:33.964 00000110 68 d8 2f 5f 12 ca 1a bc cf d0 a5 9c 1c 4f 42 c9 h./_.........OB. 
00:20:33.964 00000120 3d 4d 60 06 a5 4f 80 b5 d4 7f 90 7e 00 d3 98 f2 =M`..O.....~.... 00:20:33.964 00000130 19 ca 79 12 4e 4a 4d 9e b9 e3 3e 05 14 94 eb 12 ..y.NJM...>..... 00:20:33.964 00000140 94 22 38 47 da e3 55 7f 8d b6 fd e3 0c 2f e6 61 ."8G..U....../.a 00:20:33.964 00000150 1c fa 32 2a 7a 0b f8 93 4c 35 85 a3 13 90 3a a8 ..2*z...L5....:. 00:20:33.964 00000160 71 91 8a e5 24 00 34 1c a6 00 ca a9 33 d7 03 87 q...$.4.....3... 00:20:33.964 00000170 db 24 26 1e 8c 47 e3 49 ef dc f4 33 ef 28 49 60 .$&..G.I...3.(I` 00:20:33.964 dh secret: 00:20:33.964 00000000 5f 69 a4 aa 95 22 90 24 50 bf 1e 56 82 18 2e b9 _i...".$P..V.... 00:20:33.964 00000010 44 55 fa dd b4 ae 38 8d 1a 93 49 20 7a 97 a5 79 DU....8...I z..y 00:20:33.964 00000020 3d 0a 44 d6 b6 d1 df 83 55 1b ce 47 b6 7c 10 04 =.D.....U..G.|.. 00:20:33.964 00000030 93 ea 64 a3 60 6b 07 7b e4 6b 95 30 1f c8 11 8f ..d.`k.{.k.0.... 00:20:33.964 00000040 90 29 19 43 6e ff a2 46 48 9f 0d d8 98 ed 6c 79 .).Cn..FH.....ly 00:20:33.964 00000050 e3 db 94 eb c7 5d 0e ae 09 23 55 1d 99 53 ad 96 .....]...#U..S.. 00:20:33.964 00000060 37 be 65 5d ef af 70 0c 69 b4 e9 1a e9 80 9c 3f 7.e]..p.i......? 00:20:33.964 00000070 2d 73 6d 9e 57 83 a9 74 a9 c8 19 80 e0 02 86 a9 -sm.W..t........ 00:20:33.964 00000080 f6 b5 5f c5 39 5c 7c dd 68 7c 6c 30 b1 16 6a cb .._.9\|.h|l0..j. 00:20:33.964 00000090 9e f6 0d 3a 1c 10 b5 4c 1c 3d 50 83 d5 44 98 56 ...:...L.=P..D.V 00:20:33.964 000000a0 09 9f 8a bd 30 41 91 1f 26 07 18 1d b9 ce 75 3d ....0A..&.....u= 00:20:33.964 000000b0 69 5e 17 e2 37 d4 f5 7e 89 44 08 6c df c6 1d d7 i^..7..~.D.l.... 00:20:33.964 000000c0 13 23 2f 66 49 20 51 86 f1 78 34 fa 0a 79 93 f6 .#/fI Q..x4..y.. 00:20:33.964 000000d0 57 5b 60 5a 0a 5c ea 50 14 49 36 eb 97 e2 e0 e5 W[`Z.\.P.I6..... 00:20:33.964 000000e0 32 cc 50 18 55 f4 b2 c7 ec a2 79 c0 60 ec 1b f7 2.P.U.....y.`... 00:20:33.964 000000f0 60 d8 06 3f 14 5e d8 bc 73 0a 83 48 6b f2 29 e2 `..?.^..s..Hk.). 00:20:33.964 00000100 f4 13 ba 1c 16 34 4f 9d 8e 02 20 48 1f 3b 9a 02 .....4O... H.;.. 00:20:33.964 00000110 42 f0 a9 ec 18 01 f0 2e 9e 4c 56 6d f1 b8 ec d5 B........LVm.... 00:20:33.964 00000120 28 d0 56 9b 2f 44 f4 87 df 5c 8a 62 74 c5 fa 8b (.V./D...\.bt... 00:20:33.964 00000130 28 3e 53 de 65 62 96 c3 50 c1 50 86 fb 8a 9c c1 (>S.eb..P.P..... 00:20:33.964 00000140 fd 10 83 a4 ea 24 23 8f 1f 9a 28 d1 23 33 90 84 .....$#...(.#3.. 00:20:33.964 00000150 97 7e f6 3c f0 d9 d4 4f 03 8d a4 bb 06 40 d1 d7 .~.<...O.....@.. 00:20:33.964 00000160 9b a0 96 6b 53 2e b6 74 e1 a0 fe 51 46 91 22 9c ...kS..t...QF.". 
00:20:33.964 00000170 2c 21 84 b1 bd b2 07 a5 9d a1 d8 97 d8 1d 58 3a ,!............X: 00:20:33.964 [2024-09-27 15:25:27.550412] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=3, dhgroup=2, seq=3428451814, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.964 [2024-09-27 15:25:27.550501] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.964 [2024-09-27 15:25:27.566465] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.964 [2024-09-27 15:25:27.566499] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.964 [2024-09-27 15:25:27.566505] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.964 [2024-09-27 15:25:27.732405] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.964 [2024-09-27 15:25:27.732428] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.964 [2024-09-27 15:25:27.732436] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.964 [2024-09-27 15:25:27.732485] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.964 [2024-09-27 15:25:27.732509] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.964 ctrlr pubkey: 00:20:33.964 00000000 c2 e7 bd 5a 02 1e 52 06 fa fd 4e a5 7d f6 58 46 ...Z..R...N.}.XF 00:20:33.964 00000010 39 ff 3a 92 8d 64 19 44 09 20 9c fa 46 c1 af f1 9.:..d.D. ..F... 00:20:33.964 00000020 ac e4 a5 95 7b 48 8f 1a f0 9b 57 97 87 5a 4e 5c ....{H....W..ZN\ 00:20:33.964 00000030 48 a8 48 fe 04 a2 f1 d0 9d ba 17 9f b2 82 ad e2 H.H............. 00:20:33.964 00000040 c4 b2 db e8 be a9 c8 9b ee 80 4d 25 ec 2a 3b e9 ..........M%.*;. 00:20:33.964 00000050 bd cd ad 73 be d6 07 7f c9 96 a7 e8 5f c3 a0 8e ...s........_... 00:20:33.964 00000060 d4 45 36 c5 64 59 88 74 ef 46 50 2e f8 19 e9 e6 .E6.dY.t.FP..... 00:20:33.964 00000070 4b 93 25 df 71 5e 11 10 39 51 d0 dc a0 51 f8 59 K.%.q^..9Q...Q.Y 00:20:33.964 00000080 d1 a6 87 09 d8 1f 9a 17 a7 ac ca b2 c9 0d 6c 61 ..............la 00:20:33.964 00000090 3f 8b 6b 49 d0 2a 49 45 ae 72 d1 33 62 a3 e1 16 ?.kI.*IE.r.3b... 00:20:33.964 000000a0 a1 06 3f 57 45 2e 8e cd 1b 75 3a e0 7e a3 ac 56 ..?WE....u:.~..V 00:20:33.964 000000b0 b4 ae 0e 0b 24 c2 50 6f 25 0a 8b 53 bd 2b 12 95 ....$.Po%..S.+.. 00:20:33.964 000000c0 6b d8 8f 87 5f 29 ab 4d ee 8d 64 a8 7b dd 0a 72 k..._).M..d.{..r 00:20:33.964 000000d0 45 76 bc 0b f8 89 44 77 7b 91 e2 29 e4 7c ed 67 Ev....Dw{..).|.g 00:20:33.964 000000e0 d8 94 b7 3a 90 be 7d a0 31 fd eb 59 de 01 69 16 ...:..}.1..Y..i. 00:20:33.964 000000f0 9e fd 68 44 9e 7f 19 ca e7 d8 ac e0 6b 87 10 5c ..hD........k..\ 00:20:33.964 00000100 bb fc 8b 27 56 74 eb 2c 60 10 5f b3 f4 23 a5 43 ...'Vt.,`._..#.C 00:20:33.964 00000110 15 61 5d 45 ef 26 b7 27 f5 5c 9a a7 1d 72 06 9e .a]E.&.'.\...r.. 
00:20:33.964 00000120 7c aa 6f 6f 80 a9 20 c1 26 eb 3f a7 81 ac d9 a9 |.oo.. .&.?..... 00:20:33.964 00000130 86 54 de f6 25 52 2a c0 fa e0 1a a5 f7 59 fd 43 .T..%R*......Y.C 00:20:33.964 00000140 a1 81 47 5c 23 c6 bd 27 f8 21 33 26 0a d6 38 41 ..G\#..'.!3&..8A 00:20:33.964 00000150 03 5f 42 04 1b 8c 86 97 9a 20 0f b9 b9 64 d8 d1 ._B...... ...d.. 00:20:33.964 00000160 43 f5 67 c8 91 e5 f2 41 36 be 55 a7 66 ff b6 63 C.g....A6.U.f..c 00:20:33.964 00000170 12 a9 9d 84 b1 0b 52 df 5c ce 32 62 4e c5 6b 2e ......R.\.2bN.k. 00:20:33.964 00000180 30 ac e7 71 db 11 c4 1f f7 b4 3c f1 59 24 55 82 0..q......<.Y$U. 00:20:33.964 00000190 1b 68 a8 79 b7 69 0d 91 37 2e 1f 52 ba 22 58 41 .h.y.i..7..R."XA 00:20:33.964 000001a0 87 41 1f 02 dd 64 8f 77 ab f4 15 1e ab a3 c9 48 .A...d.w.......H 00:20:33.964 000001b0 e7 e5 e1 ba b9 e1 e5 bc 8c d2 55 92 31 b5 e2 a0 ..........U.1... 00:20:33.964 000001c0 de c4 b6 9a 99 b5 cc 48 92 48 1d bf 95 e4 55 bc .......H.H....U. 00:20:33.964 000001d0 29 a6 86 4d a5 32 3d b2 7e 50 12 d8 3b 5e 00 bd )..M.2=.~P..;^.. 00:20:33.964 000001e0 3b a6 79 db 58 69 97 fe 00 2c 25 89 cb 77 ef 85 ;.y.Xi...,%..w.. 00:20:33.964 000001f0 47 6e 49 3d c4 a3 84 7e 91 fb 29 1c f9 ca b2 95 GnI=...~..)..... 00:20:33.964 host pubkey: 00:20:33.964 00000000 12 4f e1 b3 6d 55 57 89 98 86 24 4d 1c 6d 87 93 .O..mUW...$M.m.. 00:20:33.964 00000010 1f 6a 1b 2e 31 cb 70 56 ae 77 ec 0d 3d 1f 41 f3 .j..1.pV.w..=.A. 00:20:33.964 00000020 a8 65 a0 49 00 bb df 48 b1 f7 e6 68 38 ca 2c 33 .e.I...H...h8.,3 00:20:33.964 00000030 1d bc dc f5 eb 14 71 59 e5 01 af c7 1a 0c 2e 59 ......qY.......Y 00:20:33.964 00000040 4b c8 a4 72 f1 65 b6 01 58 ef 60 29 d5 d6 93 94 K..r.e..X.`).... 00:20:33.964 00000050 0f 3c 14 7e 2f 14 a5 4f d3 6c f0 b4 fa fb 38 54 .<.~/..O.l....8T 00:20:33.964 00000060 61 d6 d7 9d bc 0d 3e 6e 4d 2a 88 44 f3 d8 8e 65 a.....>nM*.D...e 00:20:33.964 00000070 f9 ae 33 a1 1f b8 b2 c6 60 aa a6 c5 31 16 00 20 ..3.....`...1.. 00:20:33.964 00000080 1a ff c8 4b 36 0a d9 d6 86 e7 86 10 86 2e 10 43 ...K6..........C 00:20:33.964 00000090 79 b4 e5 87 41 27 d2 94 b3 da f5 d3 72 03 2b 82 y...A'......r.+. 00:20:33.964 000000a0 67 9a 76 a9 3f 76 7c 3b 0d 29 b2 f1 03 c9 37 96 g.v.?v|;.)....7. 00:20:33.964 000000b0 b0 92 16 d3 ec d3 f2 68 81 ad 6f 09 aa 74 6b f0 .......h..o..tk. 00:20:33.964 000000c0 9a 55 28 56 e7 15 87 1a fd 33 4e 5b 96 00 93 9e .U(V.....3N[.... 00:20:33.964 000000d0 65 e5 b5 7f c1 de 2f 74 c8 15 56 2a 2c 11 97 f5 e...../t..V*,... 00:20:33.964 000000e0 8c b4 05 46 fd c8 16 06 82 9a 06 25 ba 26 35 34 ...F.......%.&54 00:20:33.964 000000f0 0a 71 80 c2 eb d1 08 f1 a3 75 20 5f 84 78 11 d4 .q.......u _.x.. 00:20:33.964 00000100 be 2a e6 8f 5e f5 76 4a e3 ba 12 fc f6 30 ed 7b .*..^.vJ.....0.{ 00:20:33.964 00000110 82 1f 4e 81 e0 c7 25 10 b6 ea bd a4 74 ec 31 39 ..N...%.....t.19 00:20:33.964 00000120 3b 07 5e 6f a6 43 15 ca 3c 3e 73 2d 4a 49 85 f0 ;.^o.C..<>s-JI.. 00:20:33.964 00000130 23 f6 eb 72 62 b0 90 cf 16 91 cd 2f 4e 63 73 7e #..rb....../Ncs~ 00:20:33.964 00000140 f6 ab 30 a8 84 a9 eb 82 60 ae f3 02 65 07 7e 83 ..0.....`...e.~. 00:20:33.965 00000150 55 49 51 74 79 5f 9e 97 c6 77 81 71 55 d3 99 ea UIQty_...w.qU... 00:20:33.965 00000160 75 19 40 78 9a fd 59 47 15 b9 2a 34 ee 82 a0 74 u.@x..YG..*4...t 00:20:33.965 00000170 4d fa 6a 85 4a 90 d9 ac 9d 94 d9 d7 95 ad 46 3f M.j.J.........F? 00:20:33.965 00000180 ef fc 53 01 ed 89 a2 14 64 bb 82 a9 22 10 e4 50 ..S.....d..."..P 00:20:33.965 00000190 9b 17 87 26 cd 88 3f 0a 1d 6b 68 a2 9c 25 da d5 ...&..?..kh..%.. 
00:20:33.965 000001a0 c2 c9 6a e0 77 4d ac 3c d1 91 3a 2e 1b 66 a8 ef ..j.wM.<..:..f.. 00:20:33.965 000001b0 b3 ac fa a9 42 65 75 99 38 e6 3b 8f 24 69 05 77 ....Beu.8.;.$i.w 00:20:33.965 000001c0 ba fd 41 31 cb f9 de 93 eb 59 52 0d a1 08 eb 2a ..A1.....YR....* 00:20:33.965 000001d0 60 2f d2 aa 8d 47 2f e8 27 b5 f6 ce 3d aa 96 c3 `/...G/.'...=... 00:20:33.965 000001e0 e5 6f c2 b1 26 6e 5c e2 5d dc b8 06 22 f9 7b 69 .o..&n\.]...".{i 00:20:33.965 000001f0 f8 33 13 68 75 77 1c 6c df b9 f3 50 e9 88 9c 31 .3.huw.l...P...1 00:20:33.965 dh secret: 00:20:33.965 00000000 01 fc 27 26 e2 60 95 e1 63 6f 5d 4e 77 1b b7 59 ..'&.`..co]Nw..Y 00:20:33.965 00000010 85 0e e4 49 2d 27 7b 71 19 04 97 b0 06 ee e2 a2 ...I-'{q........ 00:20:33.965 00000020 c7 48 37 ae f0 36 95 80 20 24 9f 71 20 cf 3a 28 .H7..6.. $.q .:( 00:20:33.965 00000030 98 d8 2d 68 81 da 90 3b 05 30 31 41 b9 bb 36 80 ..-h...;.01A..6. 00:20:33.965 00000040 9e 34 0c f6 b5 c8 2d 6a 2e b9 d8 3c a3 2d 1f 21 .4....-j...<.-.! 00:20:33.965 00000050 e4 b9 8c 43 78 ec 8f dd 56 18 73 52 37 b2 6b 46 ...Cx...V.sR7.kF 00:20:33.965 00000060 29 dd 27 ff 3b c4 9d ff ea 76 f2 af 64 bf 6a a9 ).'.;....v..d.j. 00:20:33.965 00000070 b2 fd 9d 34 e9 7e af 3d 61 64 40 aa fd ac 78 35 ...4.~.=ad@...x5 00:20:33.965 00000080 19 36 25 fb 9e 05 e6 6f 09 79 ef b4 bb 81 94 2f .6%....o.y...../ 00:20:33.965 00000090 f5 02 aa c0 09 25 4d 12 1e 7f 40 93 44 05 21 a0 .....%M...@.D.!. 00:20:33.965 000000a0 e6 d1 dc 90 63 62 bc ae 04 ad 83 8d 5e c9 54 0b ....cb......^.T. 00:20:33.965 000000b0 b0 53 cd 3c 42 24 ea b5 ec 72 9e 72 54 22 59 85 .S....G.Y.f. 00:20:33.965 00000100 37 78 5a e6 ea 30 66 3c 92 29 c3 e0 dc d8 3e ea 7xZ..0f<.)....>. 00:20:33.965 00000110 9e a4 3b 51 da 3e d3 24 8a 52 b9 9c 83 47 5a 16 ..;Q.>.$.R...GZ. 00:20:33.965 00000120 43 d3 f7 bb 1c ec 13 6b bf f3 32 1f 0d 5c c4 b0 C......k..2..\.. 00:20:33.965 00000130 cd e6 86 00 69 a3 db aa 2c ab 1c b2 85 ed 94 4f ....i...,......O 00:20:33.965 00000140 43 44 7d 00 63 9c 65 06 8f ff e9 d7 82 5e 78 20 CD}.c.e......^x 00:20:33.965 00000150 99 52 b7 7d d1 83 e6 94 dd 52 1e 51 94 41 a7 17 .R.}.....R.Q.A.. 00:20:33.965 00000160 b9 2e ef 2c f9 73 65 63 75 59 27 51 b6 12 ff 76 ...,.secuY'Q...v 00:20:33.965 00000170 d9 0f 61 42 75 9b cb 8c 8c ac c6 bf 27 29 e7 2a ..aBu.......').* 00:20:33.965 00000180 4d 68 89 3e ea 6e 88 ff 5c 51 25 96 81 ef 89 96 Mh.>.n..\Q%..... 00:20:33.965 00000190 7c 3b a5 4d 65 4b 25 36 11 b8 6f ac b0 79 09 c1 |;.MeK%6..o..y.. 00:20:33.965 000001a0 a3 a6 6e 84 4d 50 b5 df 2a 04 ae 0b a2 ab 68 e7 ..n.MP..*.....h. 00:20:33.965 000001b0 6f ef da 20 84 18 60 8e 28 4c c1 5d de 06 11 2a o.. ..`.(L.]...* 00:20:33.965 000001c0 54 ea e0 63 09 99 12 a0 fc 51 26 23 1f 8a 73 49 T..c.....Q&#..sI 00:20:33.965 000001d0 2d fc 95 6f df 95 06 95 78 59 85 61 a3 d6 a9 b3 -..o....xY.a.... 00:20:33.965 000001e0 c3 68 b5 28 81 7b 91 06 ab 49 0a 59 73 60 88 b6 .h.(.{...I.Ys`.. 
00:20:33.965 000001f0 fa 0b 75 71 6f 76 24 9f dc eb 76 d4 bb 45 37 22 ..uqov$...v..E7" 00:20:33.965 [2024-09-27 15:25:27.749034] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=3, dhgroup=3, seq=3428451815, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.965 [2024-09-27 15:25:27.765960] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.965 [2024-09-27 15:25:27.766002] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.965 [2024-09-27 15:25:27.766021] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.965 [2024-09-27 15:25:27.766043] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.965 [2024-09-27 15:25:27.766058] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.965 [2024-09-27 15:25:27.872395] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.965 [2024-09-27 15:25:27.872420] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.965 [2024-09-27 15:25:27.872428] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.965 [2024-09-27 15:25:27.872438] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.965 [2024-09-27 15:25:27.872495] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.965 ctrlr pubkey: 00:20:33.965 00000000 c2 e7 bd 5a 02 1e 52 06 fa fd 4e a5 7d f6 58 46 ...Z..R...N.}.XF 00:20:33.965 00000010 39 ff 3a 92 8d 64 19 44 09 20 9c fa 46 c1 af f1 9.:..d.D. ..F... 00:20:33.965 00000020 ac e4 a5 95 7b 48 8f 1a f0 9b 57 97 87 5a 4e 5c ....{H....W..ZN\ 00:20:33.965 00000030 48 a8 48 fe 04 a2 f1 d0 9d ba 17 9f b2 82 ad e2 H.H............. 00:20:33.965 00000040 c4 b2 db e8 be a9 c8 9b ee 80 4d 25 ec 2a 3b e9 ..........M%.*;. 00:20:33.965 00000050 bd cd ad 73 be d6 07 7f c9 96 a7 e8 5f c3 a0 8e ...s........_... 00:20:33.965 00000060 d4 45 36 c5 64 59 88 74 ef 46 50 2e f8 19 e9 e6 .E6.dY.t.FP..... 00:20:33.965 00000070 4b 93 25 df 71 5e 11 10 39 51 d0 dc a0 51 f8 59 K.%.q^..9Q...Q.Y 00:20:33.965 00000080 d1 a6 87 09 d8 1f 9a 17 a7 ac ca b2 c9 0d 6c 61 ..............la 00:20:33.965 00000090 3f 8b 6b 49 d0 2a 49 45 ae 72 d1 33 62 a3 e1 16 ?.kI.*IE.r.3b... 00:20:33.965 000000a0 a1 06 3f 57 45 2e 8e cd 1b 75 3a e0 7e a3 ac 56 ..?WE....u:.~..V 00:20:33.965 000000b0 b4 ae 0e 0b 24 c2 50 6f 25 0a 8b 53 bd 2b 12 95 ....$.Po%..S.+.. 00:20:33.965 000000c0 6b d8 8f 87 5f 29 ab 4d ee 8d 64 a8 7b dd 0a 72 k..._).M..d.{..r 00:20:33.965 000000d0 45 76 bc 0b f8 89 44 77 7b 91 e2 29 e4 7c ed 67 Ev....Dw{..).|.g 00:20:33.965 000000e0 d8 94 b7 3a 90 be 7d a0 31 fd eb 59 de 01 69 16 ...:..}.1..Y..i. 
00:20:33.965 000000f0 9e fd 68 44 9e 7f 19 ca e7 d8 ac e0 6b 87 10 5c ..hD........k..\ 00:20:33.965 00000100 bb fc 8b 27 56 74 eb 2c 60 10 5f b3 f4 23 a5 43 ...'Vt.,`._..#.C 00:20:33.965 00000110 15 61 5d 45 ef 26 b7 27 f5 5c 9a a7 1d 72 06 9e .a]E.&.'.\...r.. 00:20:33.965 00000120 7c aa 6f 6f 80 a9 20 c1 26 eb 3f a7 81 ac d9 a9 |.oo.. .&.?..... 00:20:33.965 00000130 86 54 de f6 25 52 2a c0 fa e0 1a a5 f7 59 fd 43 .T..%R*......Y.C 00:20:33.965 00000140 a1 81 47 5c 23 c6 bd 27 f8 21 33 26 0a d6 38 41 ..G\#..'.!3&..8A 00:20:33.965 00000150 03 5f 42 04 1b 8c 86 97 9a 20 0f b9 b9 64 d8 d1 ._B...... ...d.. 00:20:33.965 00000160 43 f5 67 c8 91 e5 f2 41 36 be 55 a7 66 ff b6 63 C.g....A6.U.f..c 00:20:33.965 00000170 12 a9 9d 84 b1 0b 52 df 5c ce 32 62 4e c5 6b 2e ......R.\.2bN.k. 00:20:33.965 00000180 30 ac e7 71 db 11 c4 1f f7 b4 3c f1 59 24 55 82 0..q......<.Y$U. 00:20:33.965 00000190 1b 68 a8 79 b7 69 0d 91 37 2e 1f 52 ba 22 58 41 .h.y.i..7..R."XA 00:20:33.965 000001a0 87 41 1f 02 dd 64 8f 77 ab f4 15 1e ab a3 c9 48 .A...d.w.......H 00:20:33.965 000001b0 e7 e5 e1 ba b9 e1 e5 bc 8c d2 55 92 31 b5 e2 a0 ..........U.1... 00:20:33.965 000001c0 de c4 b6 9a 99 b5 cc 48 92 48 1d bf 95 e4 55 bc .......H.H....U. 00:20:33.965 000001d0 29 a6 86 4d a5 32 3d b2 7e 50 12 d8 3b 5e 00 bd )..M.2=.~P..;^.. 00:20:33.965 000001e0 3b a6 79 db 58 69 97 fe 00 2c 25 89 cb 77 ef 85 ;.y.Xi...,%..w.. 00:20:33.965 000001f0 47 6e 49 3d c4 a3 84 7e 91 fb 29 1c f9 ca b2 95 GnI=...~..)..... 00:20:33.965 host pubkey: 00:20:33.965 00000000 5e 9f 75 e7 0d 3c f0 e4 11 29 a2 15 a8 2b 0a a1 ^.u..<...)...+.. 00:20:33.965 00000010 98 21 d9 c6 56 60 1c cb 55 4c 83 2d ab ea 28 a7 .!..V`..UL.-..(. 00:20:33.965 00000020 6e 03 fe ac f0 82 b9 78 27 63 db 64 cf f9 99 aa n......x'c.d.... 00:20:33.965 00000030 c6 8d a5 84 21 a7 3b 0d 8f eb ca 01 7d 2d 92 21 ....!.;.....}-.! 00:20:33.965 00000040 af b2 4d 73 27 55 f3 37 4d d7 cb 6c e7 68 00 dc ..Ms'U.7M..l.h.. 00:20:33.965 00000050 a7 16 50 83 70 48 99 d2 f5 57 0c 9c 9e 60 2e 9a ..P.pH...W...`.. 00:20:33.965 00000060 99 fc e4 91 f7 70 6b 24 23 57 d7 23 53 45 40 46 .....pk$#W.#SE@F 00:20:33.965 00000070 84 93 e6 77 21 56 20 06 57 28 98 6f e5 fc c4 01 ...w!V .W(.o.... 00:20:33.965 00000080 e3 b8 9e c6 a4 b5 1b d4 bd 40 65 7e e9 21 85 80 .........@e~.!.. 00:20:33.965 00000090 3c 67 1e a9 9a c0 d4 dd de 05 3e 2d 6b 63 ac 84 -kc.. 00:20:33.965 000000a0 e2 fd 6d 95 ab 31 3f 30 e5 11 9c c8 30 55 7a ef ..m..1?0....0Uz. 00:20:33.965 000000b0 f6 36 00 eb 33 23 5e 8d be 35 1f b8 82 32 66 1a .6..3#^..5...2f. 00:20:33.965 000000c0 74 9c 89 76 2e 16 fd 13 75 e9 5e 76 e5 e5 d5 a2 t..v....u.^v.... 00:20:33.965 000000d0 6c 40 91 f8 4b c6 7d 5d 96 2b 4b 0c 81 f9 6a f1 l@..K.}].+K...j. 00:20:33.965 000000e0 ab c1 c4 59 75 c1 b7 b6 17 73 ce 0a 26 a8 50 26 ...Yu....s..&.P& 00:20:33.965 000000f0 e8 f3 a8 8e a8 d4 8e 3f 2d c9 6e 84 3c d4 6b 6f .......?-.n.<.ko 00:20:33.965 00000100 6d 58 4f 26 4e 42 f4 6e bd 4d 0c 81 b5 ca e8 97 mXO&NB.n.M...... 00:20:33.965 00000110 a9 55 19 79 2e 3b 90 d9 03 a7 e7 ee 07 3e f0 6a .U.y.;.......>.j 00:20:33.965 00000120 b7 3f 90 1c 83 1e 86 0e 89 5c 55 70 a0 04 a0 17 .?.......\Up.... 00:20:33.965 00000130 db ff 8a 29 96 78 73 00 3c 8f c6 c7 a5 62 2e f9 ...).xs.<....b.. 00:20:33.965 00000140 fe 5c 7a 86 9d e3 94 d6 c1 c7 f9 de af f6 41 70 .\z...........Ap 00:20:33.965 00000150 92 0a 40 4c fc 85 b6 d6 cd 5e fe 32 af b6 59 98 ..@L.....^.2..Y. 
00:20:33.965 00000160 d3 17 17 0f e0 b2 5e 4f 23 0e 54 30 b8 b4 a6 6f ......^O#.T0...o 00:20:33.965 00000170 ee 08 d1 ef 45 9e 84 54 51 7f 4d f3 7d 49 c9 06 ....E..TQ.M.}I.. 00:20:33.965 00000180 a6 74 78 a8 18 bc e7 b1 c2 05 21 1e 36 36 bd af .tx.......!.66.. 00:20:33.965 00000190 a1 d5 90 c1 43 1c 8e 3d 66 44 db b0 b1 44 51 fd ....C..=fD...DQ. 00:20:33.965 000001a0 59 e8 fe d2 df 49 5d 27 3b a0 13 e2 d5 26 47 51 Y....I]';....&GQ 00:20:33.965 000001b0 59 b6 bb 95 76 bf 97 67 08 25 09 d0 ad 31 79 b7 Y...v..g.%...1y. 00:20:33.965 000001c0 06 97 27 05 91 0b dc ff cd 57 90 fb 3a 03 28 3b ..'......W..:.(; 00:20:33.965 000001d0 7f 67 28 e1 11 37 87 7e 4b 41 56 17 24 d3 e4 93 .g(..7.~KAV.$... 00:20:33.965 000001e0 fe bb 41 19 f3 a9 03 ae dc 67 cc fe f4 2f f0 38 ..A......g.../.8 00:20:33.965 000001f0 cb aa 94 6c df cc 00 70 94 a0 1e c9 46 ea 1b ae ...l...p....F... 00:20:33.965 dh secret: 00:20:33.965 00000000 00 3b fa ae d0 8e 65 01 f3 9f 43 6a f7 33 13 73 .;....e...Cj.3.s 00:20:33.965 00000010 77 8c 74 1d fd 83 a6 a9 7e 88 dd 3d 8e 5a 0e ef w.t.....~..=.Z.. 00:20:33.966 00000020 9c eb af c8 f3 cc 4c f9 d4 9d 4c 4a a1 fb ea 9e ......L...LJ.... 00:20:33.966 00000030 38 5d 52 8c 94 51 e0 ed 32 d8 5a 1f 31 43 4a db 8]R..Q..2.Z.1CJ. 00:20:33.966 00000040 f2 25 29 0c 7a 15 d9 84 12 93 59 6a f0 33 70 10 .%).z.....Yj.3p. 00:20:33.966 00000050 d4 9d 63 57 e8 7a 08 e8 00 e8 00 96 aa cb 0f 28 ..cW.z.........( 00:20:33.966 00000060 60 34 0c dc 3c d8 f1 54 e9 64 78 a7 2d 9b 09 f0 `4..<..T.dx.-... 00:20:33.966 00000070 2b 43 a6 ed cf 05 76 e9 c3 8c 6b 6e bb 4b 97 70 +C....v...kn.K.p 00:20:33.966 00000080 d0 e1 25 32 9c 75 d3 87 e8 64 b2 d8 eb f0 95 7f ..%2.u...d...... 00:20:33.966 00000090 24 3c 19 74 e7 12 34 92 4c 61 5d d1 0d 55 aa b3 $<.t..4.La]..U.. 00:20:33.966 000000a0 08 ca ed 23 9c d5 fb 10 0b 05 b9 20 a3 79 2a ea ...#....... .y*. 00:20:33.966 000000b0 6a d2 48 27 b2 ee 0f 7c db ad 24 a5 f8 fa 64 2f j.H'...|..$...d/ 00:20:33.966 000000c0 f3 9b fc 3f 54 e5 88 da 73 11 ba c8 ab c8 75 54 ...?T...s.....uT 00:20:33.966 000000d0 6b aa 3e cf 98 63 6f c3 33 8e 50 66 c1 e6 6f fe k.>..co.3.Pf..o. 00:20:33.966 000000e0 14 5f 3e ca 1b 59 e2 12 15 b2 e2 cd 70 df 03 72 ._>..Y......p..r 00:20:33.966 000000f0 da b7 a2 76 98 ed b4 4e 95 c2 5f 83 d0 54 0a 94 ...v...N.._..T.. 00:20:33.966 00000100 79 de 2e d9 55 2c af f1 cb 5c 6f 8b 81 b0 e9 b5 y...U,...\o..... 00:20:33.966 00000110 e5 11 43 71 00 97 43 54 ca d3 15 37 37 75 0a 45 ..Cq..CT...77u.E 00:20:33.966 00000120 a0 57 9c da 7e ff 9d 88 52 ef 2f 5d c2 81 82 d2 .W..~...R./].... 00:20:33.966 00000130 c3 1c 50 27 8d a7 3a cf f5 73 9e 64 fc d3 ec 8c ..P'..:..s.d.... 00:20:33.966 00000140 3f 14 77 db e9 04 fb 3e bf b5 18 c5 11 f5 b4 39 ?.w....>.......9 00:20:33.966 00000150 a0 89 e1 8e e9 5c 1f dd a9 53 28 d1 3f 35 a8 9e .....\...S(.?5.. 00:20:33.966 00000160 65 13 55 aa 17 c3 b1 40 09 1b 9b 74 dc 29 07 42 e.U....@...t.).B 00:20:33.966 00000170 e2 0b 91 b4 f8 93 b3 df 4c 92 6a 58 0a d9 85 fa ........L.jX.... 00:20:33.966 00000180 ce 73 1a ae e6 07 99 fa 22 37 64 81 68 8e ad 8c .s......"7d.h... 00:20:33.966 00000190 ef 61 63 23 55 cb e0 01 29 be cb 54 8c b5 c7 64 .ac#U...)..T...d 00:20:33.966 000001a0 1d d8 e2 60 da f8 a8 12 c6 29 78 71 11 c3 8d 6e ...`.....)xq...n 00:20:33.966 000001b0 56 a8 2c 9a dc 15 d3 f7 41 bd d7 84 1a 49 78 10 V.,.....A....Ix. 00:20:33.966 000001c0 f2 08 0b 6a 52 fe 57 83 7e f1 0b 08 d2 50 b8 10 ...jR.W.~....P.. 00:20:33.966 000001d0 a4 19 44 3c 6d 3d 6f 78 8e b2 cf ca 30 49 f5 4d ..D...... 
00:20:33.966 00000150 80 b1 05 1e c7 d7 53 19 6b 5e d7 5a 0c cf cb 5c ......S.k^.Z...\ 00:20:33.966 00000160 98 58 a7 a8 8a 7d ab 55 61 2f 53 84 81 a3 f1 cc .X...}.Ua/S..... 00:20:33.966 00000170 d4 2a 4b c9 cc 76 13 04 05 42 fc d4 77 b3 f4 ff .*K..v...B..w... 00:20:33.966 00000180 b3 ec d6 b4 31 cf 5a b7 bb 5b 07 73 ca 0a 22 17 ....1.Z..[.s..". 00:20:33.966 00000190 44 04 40 5d a7 ec f5 e2 96 68 99 64 c0 a5 b6 41 D.@].....h.d...A 00:20:33.966 000001a0 92 0b c1 a0 91 3b 4a ec 4c 7c 32 14 8f 28 b5 80 .....;J.L|2..(.. 00:20:33.966 000001b0 14 da a3 95 1b 07 d4 d4 8d c5 3d fa a9 82 a0 37 ..........=....7 00:20:33.966 000001c0 54 67 cc 65 d3 d0 92 b1 68 5f ae dd 5b 1e c1 47 Tg.e....h_..[..G 00:20:33.966 000001d0 ca 87 24 29 45 f1 0d df aa 37 b0 c9 db 56 db 8d ..$)E....7...V.. 00:20:33.966 000001e0 e1 7d 23 66 2b 68 52 08 1f 7f f0 43 88 80 6a 67 .}#f+hR....C..jg 00:20:33.966 000001f0 9a 87 81 d7 42 88 2f 25 84 80 b8 db 4d a9 cd ed ....B./%....M... 00:20:33.966 host pubkey: 00:20:33.966 00000000 47 80 d6 1e e4 68 28 1a 94 0f 07 b3 ec d1 06 ee G....h(......... 00:20:33.966 00000010 7c c3 2e a7 9c c7 e1 e9 c2 c0 4a de 66 84 aa 35 |.........J.f..5 00:20:33.966 00000020 87 8f d8 e9 f2 e2 89 1e 90 2b 97 ec 42 a9 54 c1 .........+..B.T. 00:20:33.966 00000030 7d 98 4c 9b 3e f7 b4 db b5 3d 47 11 91 87 d8 0b }.L.>....=G..... 00:20:33.966 00000040 23 8d 40 19 e0 62 28 df 0f d5 ac 68 6a 19 2d b5 #.@..b(....hj.-. 00:20:33.966 00000050 e6 00 1d ac c5 a6 32 1d 84 eb f9 82 b5 53 35 ae ......2......S5. 00:20:33.966 00000060 3b e8 96 2c cd 32 88 22 44 ae 2d 18 0f f9 3d e5 ;..,.2."D.-...=. 00:20:33.966 00000070 30 d6 6e a9 91 ac 14 c7 db f6 cb 02 02 4c 7f e7 0.n..........L.. 00:20:33.966 00000080 1c 8a d1 ba db ca 05 4c 40 b8 56 96 4a 53 c4 85 .......L@.V.JS.. 00:20:33.966 00000090 8f 9e fc 33 d0 54 d9 0b 76 05 47 32 4d 9a 1c 2a ...3.T..v.G2M..* 00:20:33.966 000000a0 ec 02 73 2f fa 43 e7 59 e9 0e 82 4d 79 2c 52 10 ..s/.C.Y...My,R. 00:20:33.966 000000b0 f5 4f 38 6d f7 56 6c e1 22 6f e7 17 86 19 2d 5f .O8m.Vl."o....-_ 00:20:33.966 000000c0 b8 ce 72 bf f1 50 81 00 f9 3b c9 e2 7d 14 c1 36 ..r..P...;..}..6 00:20:33.966 000000d0 80 f6 7b e3 53 1f 58 93 d6 25 de 19 bc 16 b5 83 ..{.S.X..%...... 00:20:33.966 000000e0 82 39 9e 4a 90 87 16 c2 fb 48 70 6c 96 fe e7 cc .9.J.....Hpl.... 00:20:33.966 000000f0 bf 51 22 37 92 2f 23 7b 78 9d eb e9 e8 ee af 7f .Q"7./#{x....... 00:20:33.966 00000100 36 28 cc 71 b0 63 9b 55 fd 79 8b e3 c3 56 ad c7 6(.q.c.U.y...V.. 00:20:33.966 00000110 db 86 bc fa e4 3e 76 c7 b8 61 9f 4e 2a 7d fd 00 .....>v..a.N*}.. 00:20:33.966 00000120 9f b6 7c fa 73 96 52 19 73 8b 24 67 a2 be b8 a8 ..|.s.R.s.$g.... 00:20:33.966 00000130 4b f7 50 a2 03 19 7e 05 2f ae 61 a0 dc 08 ab 2e K.P...~./.a..... 00:20:33.966 00000140 89 72 2c b0 a2 c0 3e 58 5a 2f 24 b2 22 87 bb ea .r,...>XZ/$."... 00:20:33.966 00000150 30 6f d9 29 7f 83 8f 05 8d d9 52 07 bc 57 88 04 0o.)......R..W.. 00:20:33.966 00000160 b7 66 7b 0b 0d 84 04 d2 c4 79 4d 78 84 0b 4b fe .f{......yMx..K. 00:20:33.966 00000170 71 f2 fa 1d a7 eb 97 11 02 3c d9 59 6d 79 4e e8 q........<.YmyN. 00:20:33.966 00000180 22 6e 8b ac d5 8f 5d 7b 1b 41 21 0c 1e f1 c5 22 "n....]{.A!...." 00:20:33.966 00000190 27 ce d4 19 f2 12 80 cc f1 ff 96 81 62 0f da 66 '...........b..f 00:20:33.966 000001a0 74 52 32 32 27 e1 37 eb 1c 6d 56 9b 28 26 68 5f tR22'.7..mV.(&h_ 00:20:33.966 000001b0 97 5c 7a b3 37 e8 ed cb 9c 09 ca c5 f4 54 ed 5d .\z.7........T.] 00:20:33.966 000001c0 40 b2 fa 6f 97 ea 0c f9 d7 44 22 c0 c5 10 ba ac @..o.....D"..... 
00:20:33.966 000001d0 0d 38 17 1e dd 22 b6 68 14 f0 91 ca a7 c8 4f b6 .8...".h......O. 00:20:33.966 000001e0 9b 1f 65 f3 1c 47 4b 13 db 92 20 cc f6 54 be e6 ..e..GK... ..T.. 00:20:33.966 000001f0 07 39 63 c8 26 99 f3 72 1f c6 6a cb cd b0 54 80 .9c.&..r..j...T. 00:20:33.966 dh secret: 00:20:33.966 00000000 e3 f9 bb 6d 09 3e d5 6b b5 24 4b 08 ac 9b 44 59 ...m.>.k.$K...DY 00:20:33.966 00000010 8b 56 d0 91 6e cd fd 6d 8a 4c d9 a2 8a 1a fb 88 .V..n..m.L...... 00:20:33.966 00000020 28 5e 58 09 31 65 e7 0f 33 f3 7e f8 9e f0 bb 7f (^X.1e..3.~..... 00:20:33.966 00000030 ef 31 ca fa e0 fe 74 57 3d 4c ba d6 74 8d 2f a1 .1....tW=L..t./. 00:20:33.966 00000040 3c 4a 89 65 78 98 5f 79 18 84 a7 2d 67 e9 90 fd 00:20:33.966 00000060 69 fb ab 59 d2 f4 f5 cf 16 f9 c4 9c 9a fe e9 cf i..Y............ 00:20:33.966 00000070 4b ad 82 4a d1 25 b5 59 ff 21 fe 40 1e 28 9e 16 K..J.%.Y.!.@.(.. 00:20:33.966 00000080 ea c3 e8 2e f8 a4 c2 8d 33 25 ec bc c4 95 f5 af ........3%...... 00:20:33.966 00000090 87 0a 5d fc 99 6e 5c 93 17 7b b2 a4 d9 71 cc 6c ..]..n\..{...q.l 00:20:33.966 000000a0 12 d7 5a 58 f0 0b f5 1b c7 9c 59 3f d9 2d a0 ac ..ZX......Y?.-.. 00:20:33.966 000000b0 dd c0 95 92 eb e0 b4 33 c9 d4 af 84 d8 5e 05 d1 .......3.....^.. 00:20:33.966 000000c0 18 5e d8 d5 f7 82 ff 01 93 37 e9 55 d4 41 10 c4 .^.......7.U.A.. 00:20:33.966 000000d0 e2 06 53 e5 87 9f a8 26 de 2b b5 72 97 3a 02 d3 ..S....&.+.r.:.. 00:20:33.966 000000e0 19 27 51 f6 a2 72 0c 3d dd ff e1 43 d9 d2 11 55 .'Q..r.=...C...U 00:20:33.967 000000f0 6e fe b4 ba 38 af cd a3 2e cc e6 8b bb d3 2f b9 n...8........./. 00:20:33.967 00000100 d4 04 c8 d3 4b a8 07 e8 7e b1 86 0f 81 56 ab d7 ....K...~....V.. 00:20:33.967 00000110 b2 94 48 6e 37 56 e7 df a8 5f 12 bb a7 70 01 a1 ..Hn7V..._...p.. 00:20:33.967 00000120 8f 71 02 2f 1d d9 c7 57 ef 69 e1 e7 15 16 c8 2b .q./...W.i.....+ 00:20:33.967 00000130 b8 7e e1 78 63 6c 29 62 c0 ab b8 78 40 78 33 62 .~.xcl)b...x@x3b 00:20:33.967 00000140 a2 b4 ff 0a f7 fa 8d ae 7d 27 04 d9 b6 5b d2 c8 ........}'...[.. 00:20:33.967 00000150 81 04 f5 18 50 78 04 05 95 12 38 39 73 17 9f ec ....Px....89s... 00:20:33.967 00000160 7e 5b f7 84 a3 d6 49 78 db 7d 0d 97 01 d0 43 3b ~[....Ix.}....C; 00:20:33.967 00000170 f7 44 f6 7e 2e 64 4e ac a3 7f 2c 4e 7e af 68 00 .D.~.dN...,N~.h. 00:20:33.967 00000180 a2 d0 8e 55 55 d2 93 39 6c 90 e4 6d 98 79 ed c3 ...UU..9l..m.y.. 00:20:33.967 00000190 f7 32 30 0d 23 2f 6a 33 46 a6 c0 e0 04 46 af 2b .20.#/j3F....F.+ 00:20:33.967 000001a0 2e dd 1a 24 2f 33 b7 98 d1 6e 97 33 c5 9a 67 d2 ...$/3...n.3..g. 00:20:33.967 000001b0 64 ca 83 7b 8c bb 56 64 6e 05 07 c8 5b bb c1 0b d..{..Vdn...[... 00:20:33.967 000001c0 2c d0 70 85 65 65 f6 00 3e 0a b9 39 d8 ee f7 b1 ,.p.ee..>..9.... 00:20:33.967 000001d0 e1 81 c9 cc 98 4b 39 71 1d 01 b0 d7 a4 16 22 ca .....K9q......". 00:20:33.967 000001e0 b1 08 3d 9a d2 b4 94 8b d8 31 01 30 85 90 7d 9e ..=......1.0..}. 00:20:33.967 000001f0 d4 5d c5 3f e9 d6 b8 16 31 2f dd 81 97 86 79 bb .].?....1/....y. 
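Each authentication round recorded above walks the DH-HMAC-CHAP states shown in the nvme_auth debug lines (negotiate, await-negotiate, await-challenge, await-reply, await-success1, in some rounds await-success2, then done), and the three hex dumps per round are the controller's DH public value, the host's DH public value, and the shared DH secret both sides derive before computing the sha512 HMAC response with the configured key. What follows is a minimal Python sketch of that key-agreement step only. It is not SPDK code and does not use the real RFC 7919 ffdhe3072/ffdhe4096 groups named in the log; the toy prime and all identifiers are illustrative assumptions, chosen purely to show why the two public values yield one common secret.

import secrets

# Toy parameters only; the exchanges in this log use ffdhe3072/ffdhe4096 (RFC 7919).
p = 2**127 - 1          # small Mersenne prime standing in for the real group prime
g = 2                   # illustrative generator

host_priv = secrets.randbelow(p - 2) + 2     # host's ephemeral private exponent
ctrlr_priv = secrets.randbelow(p - 2) + 2    # controller's ephemeral private exponent

host_pub = pow(g, host_priv, p)              # corresponds to the "host pubkey" dumps
ctrlr_pub = pow(g, ctrlr_priv, p)            # corresponds to the "ctrlr pubkey" dumps

# Each side raises the peer's public value to its own private exponent; both arrive
# at the same value, which plays the role of the "dh secret" dumps in the log and
# feeds the HMAC computed over the challenge with the configured DH-HMAC-CHAP key.
host_secret = pow(ctrlr_pub, host_priv, p)
ctrlr_secret = pow(host_pub, ctrlr_priv, p)
assert host_secret == ctrlr_secret

print(hex(host_secret))

Running the sketch prints a single agreed value of the same form as the "dh secret" dumps, just far shorter than the 384-byte (ffdhe3072) and 512-byte (ffdhe4096) secrets recorded in this log.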
00:20:33.967 [2024-09-27 15:25:28.098138] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=3, dhgroup=3, seq=3428451817, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.967 [2024-09-27 15:25:28.114788] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.967 [2024-09-27 15:25:28.114831] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.967 [2024-09-27 15:25:28.114848] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.967 [2024-09-27 15:25:28.114872] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.967 [2024-09-27 15:25:28.114882] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.967 [2024-09-27 15:25:28.221144] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.967 [2024-09-27 15:25:28.221162] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.967 [2024-09-27 15:25:28.221170] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.967 [2024-09-27 15:25:28.221180] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.967 [2024-09-27 15:25:28.221237] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.967 ctrlr pubkey: 00:20:33.967 00000000 80 0a 06 99 b6 71 21 84 fd ef 13 93 61 b8 2a f0 .....q!.....a.*. 00:20:33.967 00000010 67 de 9e 7d fd a4 b8 2b 7f 18 e2 db 6e af a8 54 g..}...+....n..T 00:20:33.967 00000020 6a ce cf 28 3a f3 9a e6 fe 61 07 47 d0 71 ca 19 j..(:....a.G.q.. 00:20:33.967 00000030 c6 d0 54 f7 e7 27 60 1d 19 b4 00 94 52 45 6b 7f ..T..'`.....REk. 00:20:33.967 00000040 72 25 56 4d 5c 8c 2c 53 48 e8 4b 57 b1 57 71 f2 r%VM\.,SH.KW.Wq. 00:20:33.967 00000050 03 65 40 76 e3 8b 49 68 f9 4a b5 b2 56 99 34 02 .e@v..Ih.J..V.4. 00:20:33.967 00000060 a8 1b 35 50 c9 f5 25 e0 7c b5 e7 e2 5a 49 60 8b ..5P..%.|...ZI`. 00:20:33.967 00000070 07 2b b6 0c 6f be 94 d5 44 aa 95 ac d8 0d d8 c5 .+..o...D....... 00:20:33.967 00000080 d0 3a 40 b8 08 6c 42 5b fd f8 91 f4 86 2a 2a a4 .:@..lB[.....**. 00:20:33.967 00000090 62 9d 38 58 2d e5 89 00 d1 ed 90 93 b5 03 61 78 b.8X-.........ax 00:20:33.967 000000a0 70 5e e5 07 d8 56 05 1e 82 0f 4f 49 7b aa b2 10 p^...V....OI{... 00:20:33.967 000000b0 9d df 21 b4 30 db 71 b4 30 e4 a6 2a a0 c5 38 9e ..!.0.q.0..*..8. 00:20:33.967 000000c0 b1 8d 5d cd 9e 10 a0 ae c1 66 6c 27 5b 07 a4 e6 ..]......fl'[... 00:20:33.967 000000d0 58 1a a0 ce 0c 16 18 d1 57 53 91 2c 96 a6 8c 95 X.......WS.,.... 00:20:33.967 000000e0 25 32 d4 ec 1e dd 0d 0a f5 12 8c b8 bf 3a 49 57 %2...........:IW 00:20:33.967 000000f0 22 07 3b 87 c4 05 13 93 23 14 fd f6 db 65 57 85 ".;.....#....eW. 00:20:33.967 00000100 2a 4f ef 57 77 b2 d9 00 09 3c c5 d4 21 0b e4 ef *O.Ww....<..!... 
00:20:33.967 00000110 19 0b 84 68 22 50 d6 af 9e 58 91 64 b7 4f 94 1b ...h"P...X.d.O.. 00:20:33.967 00000120 53 2f ca 83 ae eb a3 d5 57 40 be ec f5 05 2e f9 S/......W@...... 00:20:33.967 00000130 bd af bd 16 16 d9 c8 d5 78 49 88 49 93 7d 23 c3 ........xI.I.}#. 00:20:33.967 00000140 36 4a f8 98 31 ae 2e 59 cb 3e c2 fc 99 c4 05 a8 6J..1..Y.>...... 00:20:33.967 00000150 80 b1 05 1e c7 d7 53 19 6b 5e d7 5a 0c cf cb 5c ......S.k^.Z...\ 00:20:33.967 00000160 98 58 a7 a8 8a 7d ab 55 61 2f 53 84 81 a3 f1 cc .X...}.Ua/S..... 00:20:33.967 00000170 d4 2a 4b c9 cc 76 13 04 05 42 fc d4 77 b3 f4 ff .*K..v...B..w... 00:20:33.967 00000180 b3 ec d6 b4 31 cf 5a b7 bb 5b 07 73 ca 0a 22 17 ....1.Z..[.s..". 00:20:33.967 00000190 44 04 40 5d a7 ec f5 e2 96 68 99 64 c0 a5 b6 41 D.@].....h.d...A 00:20:33.967 000001a0 92 0b c1 a0 91 3b 4a ec 4c 7c 32 14 8f 28 b5 80 .....;J.L|2..(.. 00:20:33.967 000001b0 14 da a3 95 1b 07 d4 d4 8d c5 3d fa a9 82 a0 37 ..........=....7 00:20:33.967 000001c0 54 67 cc 65 d3 d0 92 b1 68 5f ae dd 5b 1e c1 47 Tg.e....h_..[..G 00:20:33.967 000001d0 ca 87 24 29 45 f1 0d df aa 37 b0 c9 db 56 db 8d ..$)E....7...V.. 00:20:33.967 000001e0 e1 7d 23 66 2b 68 52 08 1f 7f f0 43 88 80 6a 67 .}#f+hR....C..jg 00:20:33.967 000001f0 9a 87 81 d7 42 88 2f 25 84 80 b8 db 4d a9 cd ed ....B./%....M... 00:20:33.967 host pubkey: 00:20:33.967 00000000 15 f6 bc 79 d0 da 82 e6 d7 16 e8 79 d6 f0 e0 e3 ...y.......y.... 00:20:33.967 00000010 fc 7f dd 95 69 ae bc 4e aa 3d 6d 26 23 39 f0 24 ....i..N.=m .$ 00:20:33.967 00000020 08 e4 fb 9f 12 df a7 7b 64 df 99 98 ee 2b 4d 07 .......{d....+M. 00:20:33.967 00000030 2c 85 54 e9 31 1a 29 0a 8b b8 ed 0e be b2 af b1 ,.T.1.)......... 00:20:33.967 00000040 f7 ba 0c 50 5b 36 28 e6 62 06 6c d4 04 72 a0 68 ...P[6(.b.l..r.h 00:20:33.967 00000050 8a 9d 6c 5d 2f e7 8f 8d 21 97 bd 51 f9 ec 79 59 ..l]/...!..Q..yY 00:20:33.967 00000060 1f a0 36 4b e9 c9 72 9d 97 32 2d 08 78 56 6c 7d ..6K..r..2-.xVl} 00:20:33.967 00000070 60 5a ff 7b 55 d6 9a ef 91 72 f3 93 15 e5 4e f8 `Z.{U....r....N. 00:20:33.967 00000080 67 da 7e 36 ee 71 18 94 c0 2a 94 07 28 82 88 30 g.~6.q...*..(..0 00:20:33.967 00000090 55 22 7e d6 eb eb 6f ec ef 9d 0b 4a ec ad af 19 U"~...o....J.... 00:20:33.967 000000a0 e0 c6 91 61 f1 78 1c b0 06 75 1e c8 7d b7 73 62 ...a.x...u..}.sb 00:20:33.967 000000b0 e3 51 db e6 b7 43 eb 8b 05 65 52 a6 e9 77 eb 31 .Q...C...eR..w.1 00:20:33.967 000000c0 df 7b a0 2d ad 15 92 2b 63 ef af cc 53 40 aa 8e .{.-...+c...S@.. 00:20:33.967 000000d0 4f d0 e4 41 90 ed 62 69 cd 4d 77 d7 76 6d 65 6e O..A..bi.Mw.vmen 00:20:33.967 000000e0 79 0c f3 6c 58 9f ed 00 a8 ff 4d c0 23 9f 89 2f y..lX.....M.#../ 00:20:33.967 000000f0 7b 7e da f3 ae bd 8a 5d f5 05 fb 29 ec 42 5c 6b {~.....]...).B\k 00:20:33.967 00000100 65 7d 96 67 c8 35 13 c3 6c d4 85 b6 af 1d 31 cc e}.g.5..l.....1. 00:20:33.967 00000110 d3 e2 2c a7 ae 1a 82 69 d0 6b 48 98 7b 22 9e 49 ..,....i.kH.{".I 00:20:33.967 00000120 60 3c 6d 55 03 e9 b4 82 b3 27 45 04 a8 61 c0 9b `....".i... 00:20:33.967 00000190 3e b3 ae f2 79 ef 1c 69 f7 91 b5 6b 3b 16 2f 0f >...y..i...k;./. 00:20:33.967 000001a0 39 5e 20 b8 87 91 14 06 d7 f9 29 05 ff 70 7a 9f 9^ .......)..pz. 00:20:33.967 000001b0 18 09 b3 73 b9 5f 84 20 70 db cf 2b 6d 09 a6 7e ...s._. p..+m..~ 00:20:33.967 000001c0 b3 d2 4e ed 08 96 fe 5e ae 4c d3 ad 57 89 1c b8 ..N....^.L..W... 00:20:33.967 000001d0 f3 63 98 ad ed cc 61 8e f2 32 b2 ee af 77 43 3e .c....a..2...wC> 00:20:33.967 000001e0 69 56 65 66 d1 a9 39 07 de 85 28 9f df 14 da a4 iVef..9...(..... 
00:20:33.967 000001f0 3e b3 ba d7 78 35 00 77 90 b4 45 24 4c 04 ed 64 >...x5.w..E$L..d 00:20:33.967 dh secret: 00:20:33.967 00000000 de 4d aa 75 bf a3 f9 b7 53 e7 a1 a1 6a ef 44 39 .M.u....S...j.D9 00:20:33.967 00000010 a5 e1 57 70 78 62 49 b7 db f0 c5 99 44 09 a3 a9 ..WpxbI.....D... 00:20:33.967 00000020 92 18 ec 0b 68 ef 43 07 64 bd 0b 37 d8 e8 df d4 ....h.C.d..7.... 00:20:33.967 00000030 d6 e0 8d 59 02 c9 82 32 0f 3e bf 5a b5 c8 b2 fe ...Y...2.>.Z.... 00:20:33.967 00000040 29 1c 5d d2 e7 63 f2 d0 b3 a3 7a e7 5a 2f 0d c1 ).]..c....z.Z/.. 00:20:33.967 00000050 0b 2a fb 76 15 45 f2 34 9b db 92 b4 f4 02 f3 d0 .*.v.E.4........ 00:20:33.967 00000060 b9 09 b9 b4 f7 1d cf bf c3 fd e0 67 4f d8 43 d9 ...........gO.C. 00:20:33.967 00000070 fb cf 61 79 6c e5 8e 69 a2 c4 e8 2c 47 d5 a5 47 ..ayl..i...,G..G 00:20:33.967 00000080 c2 db 3a fc 6a ec 5b a8 fc 52 1e 26 65 42 89 42 ..:.j.[..R.&eB.B 00:20:33.967 00000090 0b c6 1c 3d 00 7b 9d 3c 47 27 a8 1b 04 31 f2 b6 ...=.{.7....... 00:20:33.968 000000a0 9a e5 69 83 a8 82 c3 7a 65 a9 80 66 07 e7 08 c1 ..i....ze..f.... 00:20:33.968 000000b0 0d 15 61 1a 8b 9c 5e 82 18 c9 65 98 7e 77 e2 42 ..a...^...e.~w.B 00:20:33.968 000000c0 b3 a4 21 84 0e 58 02 8b 15 74 25 7e 17 0c 32 6c ..!..X...t%~..2l 00:20:33.968 000000d0 c2 71 0b 45 e8 ec e1 a3 83 4c f6 d2 8c 91 4f ef .q.E.....L....O. 00:20:33.968 000000e0 c5 c5 c2 0d d1 de da 98 a0 93 5e 4b ba a7 7f a2 ..........^K.... 00:20:33.968 000000f0 9e 05 8e 43 aa ce eb 74 87 df 81 42 05 d7 f7 78 ...C...t...B...x 00:20:33.968 00000100 17 68 c7 54 63 88 34 13 fd 66 21 8a 6c 56 78 b7 .h.Tc.4..f!.lVx. 00:20:33.968 00000110 c0 7f 5d 67 c8 a8 11 c6 7b ed 11 96 cb 8e 4b ff ..]g....{.....K. 00:20:33.968 00000120 9f 9d 5e 7d 56 26 e5 bf a0 32 d7 41 91 82 2e c5 ..^}V&...2.A.... 00:20:33.968 00000130 90 7b 69 6c 38 68 ce b2 d5 46 ec 84 b7 85 13 3f .{il8h...F.....? 00:20:33.968 00000140 f3 e0 e9 29 e0 9a 8b 81 68 00 3f fc 5c b9 7f eb ...)....h.?.\... 00:20:33.968 00000150 e7 f3 2c d1 38 43 9d 41 9e 00 85 a9 c7 7d b4 b8 ..,.8C.A.....}.. 00:20:33.968 00000160 d3 14 5a 6d 8e c4 b1 1b 49 42 03 0f 7a 21 3a 40 ..Zm....IB..z!:@ 00:20:33.968 00000170 6f 9f 94 e3 ea e5 68 79 ea 6a e6 e2 0d 40 94 a1 o.....hy.j...@.. 00:20:33.968 00000180 48 06 dc fc f5 4a 55 b9 90 3b 0f ca 63 63 64 eb H....JU..;..ccd. 00:20:33.968 00000190 96 09 1a 58 68 9a e5 e0 b0 0f 81 ba 4e a3 1d f6 ...Xh.......N... 00:20:33.968 000001a0 ed 37 e0 25 ba 0a c6 27 7d 82 ee 3d 70 85 b9 fe .7.%...'}..=p... 00:20:33.968 000001b0 c4 ac ec 12 0c ee 1f 3e e9 cc 84 dd 8e 8a 94 b6 .......>........ 00:20:33.968 000001c0 24 f8 72 ca 76 63 d5 09 07 14 fb 21 5b 37 9d e9 $.r.vc.....![7.. 00:20:33.968 000001d0 9d a7 3b 68 da 32 25 84 07 8a 03 db 60 18 09 dc ..;h.2%.....`... 00:20:33.968 000001e0 2f b6 7f 6b 2e 19 d5 c8 78 ce bf a6 c3 13 c6 dc /..k....x....... 00:20:33.968 000001f0 d4 28 e6 88 c4 06 67 dd b6 59 a8 9d fc 30 88 61 .(....g..Y...0.a 00:20:33.968 host pubkey: 00:20:33.968 00000000 fb 4f be c2 c8 c5 90 24 22 23 88 32 23 87 b4 15 .O.....$"#.2#... 00:20:33.968 00000010 82 38 7b a5 55 66 5a c7 9d de 13 90 15 f7 30 5e .8{.UfZ.......0^ 00:20:33.968 00000020 c5 b4 cc a2 89 66 cc be f2 d5 6d ca 46 83 69 30 .....f....m.F.i0 00:20:33.968 00000030 cd 90 0d b2 53 4c cd 32 0c 78 39 ed 92 96 86 7f ....SL.2.x9..... 00:20:33.968 00000040 a0 db 8f 1f 18 04 ef 8c 5b 3e 9d 85 f9 58 f8 ae ........[>...X.. 00:20:33.968 00000050 e4 9f 0b 4c 07 c9 1e 4c 0e 45 30 96 ad 2c bd c5 ...L...L.E0..,.. 00:20:33.968 00000060 6b ea 5b 8b 2a 38 50 9e 8f ee a7 e0 ac 2b 2b eb k.[.*8P......++. 
00:20:33.968 00000070 42 2d 2f 32 a4 6d e0 3f d6 f1 ce 13 90 6e 84 52 B-/2.m.?.....n.R 00:20:33.968 00000080 11 61 f7 de b3 12 6e f2 18 2f a0 aa de 9b 06 0c .a....n../...... 00:20:33.968 00000090 ff 0e 58 50 a3 07 b7 ee f8 67 47 3f ca 2d 6a 3f ..XP.....gG?.-j? 00:20:33.968 000000a0 2a 98 1b 62 45 8b 32 9d 28 47 d1 6c 3a c4 0e eb *..bE.2.(G.l:... 00:20:33.968 000000b0 58 03 da db 1d 3b 5f a3 3a c0 37 58 db 65 67 bb X....;_.:.7X.eg. 00:20:33.968 000000c0 b2 d2 bc 20 12 71 53 cf 64 40 c8 08 64 90 01 6e ... .qS.d@..d..n 00:20:33.968 000000d0 42 92 f0 8b 41 2e cf 57 2f 5e 0c a4 d9 c3 2c c6 B...A..W/^....,. 00:20:33.968 000000e0 e1 b4 1c ae 8e 04 a5 4b 5a e0 fc e0 91 27 a9 a5 .......KZ....'.. 00:20:33.968 000000f0 90 d8 1a a8 70 61 b8 59 57 e3 52 bb 7c 1e 8d 83 ....pa.YW.R.|... 00:20:33.968 00000100 43 b7 c8 9b 2b 07 8f ec 29 93 cc 97 6b 67 03 96 C...+...)...kg.. 00:20:33.968 00000110 ac 20 24 65 4b 48 1f 25 24 98 fd 4c 25 09 9b fd . $eKH.%$..L%... 00:20:33.968 00000120 d2 c3 71 fd 40 b7 2d 60 ec 08 0d 8b ad 58 af 1f ..q.@.-`.....X.. 00:20:33.968 00000130 1d aa 92 25 6a a9 0d a1 b3 c3 62 df d9 bc 67 0a ...%j.....b...g. 00:20:33.968 00000140 9b 12 27 77 3c 0e 4a fb 4c 34 6f 3e 64 1d 7e c2 ..'w<.J.L4o>d.~. 00:20:33.968 00000150 b1 91 1a 2e d3 ee 6a 12 2a 18 fa 2d b8 49 a0 cb ......j.*..-.I.. 00:20:33.968 00000160 5d f8 31 4a 1b ca 9f 73 f3 5c 48 9e 3b a6 07 1e ].1J...s.\H.;... 00:20:33.968 00000170 a7 a1 e6 0c ff 68 85 09 3e 06 1f 34 03 ed 13 65 .....h..>..4...e 00:20:33.968 00000180 4a 55 b3 fe 14 7d cf 3d ca 4c b1 1b b9 6f 0c 61 JU...}.=.L...o.a 00:20:33.968 00000190 da c9 bd 19 33 61 3e be 8f 37 78 9c 23 16 db af ....3a>..7x.#... 00:20:33.968 000001a0 54 3e df ee 62 8c b5 58 ad 41 27 45 ab ae 94 c0 T>..b..X.A'E.... 00:20:33.968 000001b0 bf 07 60 b0 be 57 d7 54 42 3b 22 70 19 5e ad 89 ..`..W.TB;"p.^.. 00:20:33.968 000001c0 63 60 2c e4 92 85 02 64 61 ec 26 ba 41 0a 0f 16 c`,....da.&.A... 00:20:33.968 000001d0 97 4a e9 1d bf c4 1a a6 25 05 5a 6d e8 04 46 4e .J......%.Zm..FN 00:20:33.968 000001e0 97 0b d2 61 0f cc 70 18 c6 16 d2 8e 08 02 d2 e1 ...a..p......... 00:20:33.968 000001f0 98 e7 f9 64 13 c2 30 d8 80 57 c4 b5 59 71 c9 d8 ...d..0..W..Yq.. 00:20:33.968 dh secret: 00:20:33.968 00000000 e5 6d 80 3e 66 37 6d bb f7 49 a5 ba 75 9f 34 3b .m.>f7m..I..u.4; 00:20:33.968 00000010 73 4c 13 1b b5 63 f7 f1 20 cd da d5 a4 31 68 40 sL...c.. ....1h@ 00:20:33.968 00000020 4c bc 8e da 42 35 a4 18 75 e6 d9 76 96 e9 bf 78 L...B5..u..v...x 00:20:33.968 00000030 2f 40 01 81 d4 f6 f7 e4 70 fb 25 e7 b5 12 0b b9 /@......p.%..... 00:20:33.968 00000040 7a 89 ba 1b ef d8 e5 69 47 44 9f d4 28 78 65 d4 z......iGD..(xe. 00:20:33.968 00000050 13 c5 12 0e d0 82 9b 0f 40 b0 ab 33 b0 45 50 df ........@..3.EP. 00:20:33.968 00000060 72 82 57 d1 45 9c df ef fe 60 dd 3d ef ae 19 a4 r.W.E....`.=.... 00:20:33.968 00000070 5a 15 7e 21 8c ae d5 2d 18 4d 71 17 4f 90 1e 37 Z.~!...-.Mq.O..7 00:20:33.968 00000080 e2 07 54 8b d0 3c 99 22 8b 98 83 aa cc df cf f1 ..T..<."........ 00:20:33.968 00000090 f5 2b 62 fa 88 96 3a 82 73 84 66 59 6b c2 bd b5 .+b...:.s.fYk... 00:20:33.968 000000a0 69 e3 90 69 ad be 47 cb 59 cf 01 0a cc af ed 19 i..i..G.Y....... 00:20:33.968 000000b0 b2 c0 6c 42 13 e1 29 8d 3a 5c 5c 2f 83 71 84 5f ..lB..).:\\/.q._ 00:20:33.968 000000c0 a8 d6 1f ce ac 87 79 6c fb be 96 25 c3 f8 91 99 ......yl...%.... 00:20:33.968 000000d0 46 66 e3 af 80 31 e9 2e 59 63 3b c3 be e6 aa c6 Ff...1..Yc;..... 00:20:33.968 000000e0 b5 e6 d5 0f fc c2 04 77 40 ce 25 d2 93 82 3c 64 .......w@.%...G.xv...|. 
00:20:33.968 00000160 8a c5 99 ab 98 72 21 f0 2e 68 ab 37 76 06 61 b9 .....r!..h.7v.a. 00:20:33.968 00000170 a4 17 31 4a d4 df cf c8 51 0e 45 55 39 6b 05 d0 ..1J....Q.EU9k.. 00:20:33.968 00000180 a6 72 4d 60 c8 cf 72 d4 40 ce 5b e0 b1 81 bc c0 .rM`..r.@.[..... 00:20:33.968 00000190 d8 ff 4d 06 60 c1 19 9f f0 f7 1b 6f e7 45 da 8a ..M.`......o.E.. 00:20:33.968 000001a0 59 9a 2b 3e 76 fb 5c a8 2a 4b 94 c4 7c bd d1 c0 Y.+>v.\.*K..|... 00:20:33.968 000001b0 5a 23 17 0a 80 2a b3 9d f6 14 21 10 31 d0 5b 03 Z#...*....!.1.[. 00:20:33.968 000001c0 04 98 a1 b0 16 24 af ae 77 d2 f2 62 c0 8a 05 6c .....$..w..b...l 00:20:33.968 000001d0 1a 72 c8 26 2c 57 27 46 6a 4d 94 e1 6c 61 50 e1 .r.&,W'FjM..laP. 00:20:33.968 000001e0 da 88 48 9b 73 a2 bb 64 fb 53 8c 2c 34 04 62 a2 ..H.s..d.S.,4.b. 00:20:33.968 000001f0 2b ee 33 6b c6 59 5a 7b af 77 95 08 2d f5 ec ab +.3k.YZ{.w..-... 00:20:33.968 [2024-09-27 15:25:28.455063] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=3, dhgroup=3, seq=3428451819, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.969 [2024-09-27 15:25:28.471943] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.969 [2024-09-27 15:25:28.471982] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.969 [2024-09-27 15:25:28.471998] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.969 [2024-09-27 15:25:28.472017] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.969 [2024-09-27 15:25:28.472034] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.969 [2024-09-27 15:25:28.578345] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.969 [2024-09-27 15:25:28.578363] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.969 [2024-09-27 15:25:28.578370] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.969 [2024-09-27 15:25:28.578380] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.969 [2024-09-27 15:25:28.578437] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.969 ctrlr pubkey: 00:20:33.969 00000000 d7 36 03 19 4f 27 3a 03 d3 28 04 99 82 5c 00 4c .6..O':..(...\.L 00:20:33.969 00000010 04 90 ad 05 36 af a6 1b 30 e3 51 fb a3 9e d6 85 ....6...0.Q..... 00:20:33.969 00000020 42 44 1e c9 ec 14 a1 01 e8 e0 48 0d d2 71 0d 81 BD........H..q.. 00:20:33.969 00000030 d8 e4 fa c9 bc ca cd 05 f8 8b 6e 2e 58 98 7e 35 ..........n.X.~5 00:20:33.969 00000040 92 50 3b 35 2d e1 2f 8e e0 b6 4d 19 ce cc 06 45 .P;5-./...M....E 00:20:33.969 00000050 06 19 2a 5d 08 76 05 4b 4d 03 66 e3 5e 35 d2 60 ..*].v.KM.f.^5.` 00:20:33.969 00000060 be cd 38 2d 8a 20 e2 6a dc 48 79 d7 77 81 24 11 ..8-. .j.Hy.w.$. 
00:20:33.969 00000070 7b 6b ff 5b 9c 9d f1 80 cc f7 7b 63 fd 0b 18 37 {k.[......{c...7 00:20:33.969 00000080 35 8a 14 e5 fd 45 3f 70 76 1d 64 de 04 b6 ae 2b 5....E?pv.d....+ 00:20:33.969 00000090 2d bf d4 c0 99 8b c3 3e 37 92 c4 1b e4 8b fb 93 -......>7....... 00:20:33.969 000000a0 9a e5 69 83 a8 82 c3 7a 65 a9 80 66 07 e7 08 c1 ..i....ze..f.... 00:20:33.969 000000b0 0d 15 61 1a 8b 9c 5e 82 18 c9 65 98 7e 77 e2 42 ..a...^...e.~w.B 00:20:33.969 000000c0 b3 a4 21 84 0e 58 02 8b 15 74 25 7e 17 0c 32 6c ..!..X...t%~..2l 00:20:33.969 000000d0 c2 71 0b 45 e8 ec e1 a3 83 4c f6 d2 8c 91 4f ef .q.E.....L....O. 00:20:33.969 000000e0 c5 c5 c2 0d d1 de da 98 a0 93 5e 4b ba a7 7f a2 ..........^K.... 00:20:33.969 000000f0 9e 05 8e 43 aa ce eb 74 87 df 81 42 05 d7 f7 78 ...C...t...B...x 00:20:33.969 00000100 17 68 c7 54 63 88 34 13 fd 66 21 8a 6c 56 78 b7 .h.Tc.4..f!.lVx. 00:20:33.969 00000110 c0 7f 5d 67 c8 a8 11 c6 7b ed 11 96 cb 8e 4b ff ..]g....{.....K. 00:20:33.969 00000120 9f 9d 5e 7d 56 26 e5 bf a0 32 d7 41 91 82 2e c5 ..^}V&...2.A.... 00:20:33.969 00000130 90 7b 69 6c 38 68 ce b2 d5 46 ec 84 b7 85 13 3f .{il8h...F.....? 00:20:33.969 00000140 f3 e0 e9 29 e0 9a 8b 81 68 00 3f fc 5c b9 7f eb ...)....h.?.\... 00:20:33.969 00000150 e7 f3 2c d1 38 43 9d 41 9e 00 85 a9 c7 7d b4 b8 ..,.8C.A.....}.. 00:20:33.969 00000160 d3 14 5a 6d 8e c4 b1 1b 49 42 03 0f 7a 21 3a 40 ..Zm....IB..z!:@ 00:20:33.969 00000170 6f 9f 94 e3 ea e5 68 79 ea 6a e6 e2 0d 40 94 a1 o.....hy.j...@.. 00:20:33.969 00000180 48 06 dc fc f5 4a 55 b9 90 3b 0f ca 63 63 64 eb H....JU..;..ccd. 00:20:33.969 00000190 96 09 1a 58 68 9a e5 e0 b0 0f 81 ba 4e a3 1d f6 ...Xh.......N... 00:20:33.969 000001a0 ed 37 e0 25 ba 0a c6 27 7d 82 ee 3d 70 85 b9 fe .7.%...'}..=p... 00:20:33.969 000001b0 c4 ac ec 12 0c ee 1f 3e e9 cc 84 dd 8e 8a 94 b6 .......>........ 00:20:33.969 000001c0 24 f8 72 ca 76 63 d5 09 07 14 fb 21 5b 37 9d e9 $.r.vc.....![7.. 00:20:33.969 000001d0 9d a7 3b 68 da 32 25 84 07 8a 03 db 60 18 09 dc ..;h.2%.....`... 00:20:33.969 000001e0 2f b6 7f 6b 2e 19 d5 c8 78 ce bf a6 c3 13 c6 dc /..k....x....... 00:20:33.969 000001f0 d4 28 e6 88 c4 06 67 dd b6 59 a8 9d fc 30 88 61 .(....g..Y...0.a 00:20:33.969 host pubkey: 00:20:33.969 00000000 5a dc ec b7 78 12 31 8b 94 1c d6 41 d4 f2 0e 9f Z...x.1....A.... 00:20:33.969 00000010 a4 95 01 b0 4b ce 9e ff 03 39 d0 10 c2 01 84 a7 ....K....9...... 00:20:33.969 00000020 c6 02 f6 2d f9 b3 a6 70 83 f8 a1 7d 73 60 71 7e ...-...p...}s`q~ 00:20:33.969 00000030 e6 22 af b7 d0 b0 4a 84 be a4 b3 82 1e c1 06 ba ."....J......... 00:20:33.969 00000040 c6 c2 f1 54 a3 58 5a 4b 2b 7b 4b 22 e8 ba 81 51 ...T.XZK+{K"...Q 00:20:33.969 00000050 6a 9a b8 9b 5d 34 30 ed a1 c1 d4 0f 46 c5 98 d9 j...]40.....F... 00:20:33.969 00000060 64 41 35 5a d4 a3 02 94 06 21 1e 87 7f 82 1a 80 dA5Z.....!...... 00:20:33.969 00000070 6b 22 be da 47 c5 4b 7b 97 e1 0b 5c f1 b5 60 4e k"..G.K{...\..`N 00:20:33.969 00000080 e2 26 5c 92 bf 33 af 45 84 01 ce 65 fa 79 dd 65 .&\..3.E...e.y.e 00:20:33.969 00000090 85 21 6f cb 2a c7 fe d5 b4 9d b7 68 69 7b 79 eb .!o.*......hi{y. 00:20:33.969 000000a0 03 52 f1 bb c5 84 a8 3f e7 35 96 79 a0 2b 04 5e .R.....?.5.y.+.^ 00:20:33.969 000000b0 93 71 9f b8 f7 69 4d 13 66 b2 8a bf dc 92 69 56 .q...iM.f.....iV 00:20:33.969 000000c0 8d 34 a8 ab 53 98 15 36 88 20 9f ff 3a 34 91 13 .4..S..6. ..:4.. 00:20:33.969 000000d0 7b 69 1f b1 2d 40 13 49 4d f2 fa 54 ee 7d 87 c1 {i..-@.IM..T.}.. 00:20:33.969 000000e0 e2 53 40 9e e1 5e 46 2a 1a ec 67 01 66 87 1c be .S@..^F*..g.f... 
00:20:33.969 000000f0 f6 53 07 02 75 b5 c2 dc 01 4a fa 4b 7d d6 06 70 .S..u....J.K}..p 00:20:33.969 00000100 d8 48 61 91 c1 73 f2 ad 10 44 5b be cd 7b 9b cb .Ha..s...D[..{.. 00:20:33.969 00000110 55 65 5d 70 fa 87 9d 02 cc 0e ec eb aa 46 f8 aa Ue]p.........F.. 00:20:33.969 00000120 35 54 3e a1 a5 74 cf ea 24 61 3d a4 2f a6 31 3b 5T>..t..$a=./.1; 00:20:33.969 00000130 71 13 0d 0c fe 87 a0 ff ef de 71 d3 b5 dc fb 54 q.........q....T 00:20:33.969 00000140 a1 eb a5 02 96 19 8a 6e 46 f1 a2 a1 7b 38 cd 6c .......nF...{8.l 00:20:33.969 00000150 fc f5 41 7b 01 5f e0 65 b5 ea 5c 22 9e bb db aa ..A{._.e..\".... 00:20:33.969 00000160 16 8c a7 47 4a fd 6d f2 5f 99 ec cb 36 8a 96 ed ...GJ.m._...6... 00:20:33.969 00000170 55 12 ec 21 10 74 63 d3 6c a2 f6 de 09 a1 98 8b U..!.tc.l....... 00:20:33.969 00000180 a6 76 04 9c 63 20 98 59 2a fc 6a 07 62 f7 10 57 .v..c .Y*.j.b..W 00:20:33.969 00000190 df 6a d9 5a c9 ec 73 09 b4 ae 22 a8 1d b7 28 b2 .j.Z..s..."...(. 00:20:33.969 000001a0 0a de 91 24 e9 bb de 1f f1 be b0 f8 59 de 28 70 ...$........Y.(p 00:20:33.969 000001b0 bc 66 d2 fa 1d 48 e7 5c 2f ea 21 3e 05 90 f6 9a .f...H.\/.!>.... 00:20:33.969 000001c0 e9 25 6e 0e 5e af 54 e8 9f 59 90 d5 af 48 59 33 .%n.^.T..Y...HY3 00:20:33.969 000001d0 da f2 f2 a5 ea 08 42 ee cf c5 2d 0e b2 e8 f7 03 ......B...-..... 00:20:33.969 000001e0 92 37 7c 34 f9 66 f0 62 bc c1 5e ac 69 3d 2d eb .7|4.f.b..^.i=-. 00:20:33.969 000001f0 02 62 f8 cf 03 5b 5e bd b9 0c f0 03 51 9a fb f2 .b...[^.....Q... 00:20:33.969 dh secret: 00:20:33.969 00000000 84 13 5a 4c 59 2a 57 8d db 09 bb 3a d0 aa 4f fe ..ZLY*W....:..O. 00:20:33.969 00000010 43 fa 43 28 1f 5b e8 93 20 c1 aa 2e 5a c9 8d 36 C.C(.[.. ...Z..6 00:20:33.969 00000020 2a 5f 84 8b 33 f6 44 97 98 1a 31 79 e7 3e cd 50 *_..3.D...1y.>.P 00:20:33.969 00000030 30 c6 d7 c4 5c 34 09 8f 40 8e e9 5e a7 b0 5c 69 0...\4..@..^..\i 00:20:33.969 00000040 8f 1e 48 7f 47 e2 99 98 57 69 42 a5 f5 ec 1b e5 ..H.G...WiB..... 00:20:33.969 00000050 33 a0 d7 db b7 f8 27 14 b7 04 ac 6e 82 f6 51 18 3.....'....n..Q. 00:20:33.969 00000060 99 3e 1e 5f d4 12 57 29 18 21 48 29 86 bf 72 aa .>._..W).!H)..r. 00:20:33.969 00000070 4b e2 40 7b 66 6e e9 f2 03 b2 fe 7f 2c fc 4f 02 K.@{fn......,.O. 00:20:33.969 00000080 62 6a e7 94 e1 94 b9 c0 e1 90 ff e4 dc 62 d6 f7 bj...........b.. 00:20:33.969 00000090 c0 a5 04 84 02 98 02 da e4 8a 78 79 6a 87 c5 c0 ..........xyj... 00:20:33.969 000000a0 5c 51 8a c8 1f 76 00 be c9 c0 b2 56 e6 f4 9a e6 \Q...v.....V.... 00:20:33.969 000000b0 5b 09 fc a9 50 79 a8 b5 2b 4c f6 0b 01 98 6b fa [...Py..+L....k. 00:20:33.969 000000c0 14 dd fc fe fb de 7e 59 42 50 33 b2 d0 2c 91 5c ......~YBP3..,.\ 00:20:33.969 000000d0 1d 70 18 95 dc b8 0c ec 92 2e b1 34 25 10 da 74 .p.........4%..t 00:20:33.969 000000e0 3c a2 7f 84 40 f7 bf ff 55 05 4b a2 c7 00 96 2e <...@...U.K..... 00:20:33.969 000000f0 23 d4 26 33 96 08 6d 3f ce db cb c9 a3 da 16 4c #.&3..m?.......L 00:20:33.969 00000100 2b 5e 1b 96 6a 43 97 4f 41 74 bc 19 ce d0 31 7e +^..jC.OAt....1~ 00:20:33.969 00000110 3f 61 27 01 48 92 d3 a0 5e 42 6a 22 1c f7 94 5a ?a'.H...^Bj"...Z 00:20:33.969 00000120 ff e3 be b7 fd c7 4a 32 0d 77 5e fd 37 8d 86 aa ......J2.w^.7... 00:20:33.969 00000130 0a cc a0 f1 ca e0 4b f3 d6 e3 29 3f 9c e5 4e cd ......K...)?..N. 00:20:33.969 00000140 d3 e8 89 35 05 99 a1 fe cf 0b 99 f8 fe 30 16 b0 ...5.........0.. 00:20:33.969 00000150 e4 0a 12 ec 4d 22 7c c6 01 80 66 b3 93 07 3f 4e ....M"|...f...?N 00:20:33.969 00000160 d0 a6 ca 28 ea bb 03 1b e5 4e b0 f4 d7 c0 55 b0 ...(.....N....U. 
00:20:33.969 00000170 43 bd 28 7c db b9 71 8b 1d 0c 5e db a4 cf 06 34 C.(|..q...^....4 00:20:33.969 00000180 2b 6d 58 eb d8 2c ba f3 ff 59 97 81 cf b1 f0 e2 +mX..,...Y...... 00:20:33.969 00000190 3f ef ee f8 e3 f7 f2 53 7a 38 f2 bb 7e be 05 35 ?......Sz8..~..5 00:20:33.969 000001a0 90 be b5 e0 97 1a 36 81 ce 33 ce 04 99 fd 55 93 ......6..3....U. 00:20:33.969 000001b0 45 74 11 c5 ca 6c c0 7d bc 13 9b 5e 11 02 4d 79 Et...l.}...^..My 00:20:33.969 000001c0 4e 87 eb f1 7c 08 27 23 ab 18 f7 18 e3 30 e4 4c N...|.'#.....0.L 00:20:33.969 000001d0 6f fe aa 70 ed a4 ae e8 85 e4 1a 56 1e 3a c6 68 o..p.......V.:.h 00:20:33.969 000001e0 cc 38 28 1d 72 fb 12 8d 97 18 35 56 0a 99 59 1d .8(.r.....5V..Y. 00:20:33.969 000001f0 7e 29 28 2b 5c 9e 8b bc 83 cc ae 4d 1a 6a 39 4c ~)(+\......M.j9L 00:20:33.969 [2024-09-27 15:25:28.594729] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=3, dhgroup=3, seq=3428451820, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.969 [2024-09-27 15:25:28.594831] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.969 [2024-09-27 15:25:28.632037] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.969 [2024-09-27 15:25:28.632078] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.970 [2024-09-27 15:25:28.632088] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.970 [2024-09-27 15:25:28.632114] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.970 [2024-09-27 15:25:28.793728] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.970 [2024-09-27 15:25:28.793748] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.970 [2024-09-27 15:25:28.793755] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.970 [2024-09-27 15:25:28.793800] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.970 [2024-09-27 15:25:28.793823] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.970 ctrlr pubkey: 00:20:33.970 00000000 0c 0e 3f 7e 44 7d 14 6b 53 a8 13 35 83 bc b1 26 ..?~D}.kS..5...& 00:20:33.970 00000010 ae ff 81 e8 8e 1f 4d 29 bb e1 19 16 59 42 dd 06 ......M)....YB.. 00:20:33.970 00000020 52 18 e4 fc 2d 94 62 9d e7 25 b8 ed 5f c8 26 bf R...-.b..%.._.&. 00:20:33.970 00000030 42 a9 4c d1 02 9f 7d 96 ac 87 dc 1d d1 b9 bf bf B.L...}......... 00:20:33.970 00000040 99 00 f8 45 1b 74 c6 42 a9 1f 03 83 61 75 74 9c ...E.t.B....aut. 00:20:33.970 00000050 73 ae b9 63 45 9a d0 92 53 a5 d5 91 9d ab 24 c3 s..cE...S.....$. 
00:20:33.970 00000060 1e d4 a9 3b a4 8b 57 30 00 7a e8 bc 46 c2 9d 74 ...;..W0.z..F..t 00:20:33.970 00000070 2b 1c 70 5f d2 10 60 ae 53 c9 0a 83 d7 70 e6 3e +.p_..`.S....p.> 00:20:33.970 00000080 35 22 3e f3 8a 1d 5d 83 ed 59 dd 74 27 6f d6 3d 5">...]..Y.t'o.= 00:20:33.970 00000090 53 2e ac 15 62 94 cc 43 b2 f9 67 db 30 1e 19 9b S...b..C..g.0... 00:20:33.970 000000a0 da ee b5 aa 24 b7 a4 e5 a0 37 fb 13 bf 12 47 29 ....$....7....G) 00:20:33.970 000000b0 d4 20 16 96 ea 7a bf 14 84 26 23 27 09 71 a8 3b . ...z...&#'.q.; 00:20:33.970 000000c0 03 59 ab cf 1d c5 e1 65 90 24 6a ea 8b 3f f2 31 .Y.....e.$j..?.1 00:20:33.970 000000d0 34 aa 11 ea 15 9f 8d 34 6a 04 4b 38 0e aa 41 b9 4......4j.K8..A. 00:20:33.970 000000e0 5e ae 3b 15 2e 0d f5 d2 18 73 b2 e4 9f 4d 94 01 ^.;......s...M.. 00:20:33.970 000000f0 f3 e6 61 2f 89 fc 6f c2 01 d5 d2 b6 ff e9 06 1f ..a/..o......... 00:20:33.970 00000100 ef c7 8e 28 41 79 cd 47 7f d1 7d 74 90 ab 0b 61 ...(Ay.G..}t...a 00:20:33.970 00000110 af 19 7e d5 37 e3 fc 99 8b f8 ba 60 dc 59 eb 22 ..~.7......`.Y." 00:20:33.970 00000120 c0 9c e9 0c a2 71 c6 d8 2d ae 77 6f a6 1c 3a b0 .....q..-.wo..:. 00:20:33.970 00000130 3a fb 62 66 68 5e 64 e1 c8 e7 32 be ce f0 85 d3 :.bfh^d...2..... 00:20:33.970 00000140 c9 98 df 34 b9 ba dd a7 3c f8 be 5b 8e 24 10 3d ...4....<..[.$.= 00:20:33.970 00000150 cd ab 08 18 ae 9b e3 1d 37 a0 5d 87 53 d8 8e 6e ........7.].S..n 00:20:33.970 00000160 17 c3 fa 0b ba 5a 41 93 fc a1 06 fb 07 08 60 6d .....ZA.......`m 00:20:33.970 00000170 1e e0 26 70 03 f9 99 a7 ff e3 cb 42 34 64 d3 e3 ..&p.......B4d.. 00:20:33.970 00000180 3d 0d 95 a2 2a 74 a3 04 f4 7a 45 78 cf 0a 94 49 =...*t...zEx...I 00:20:33.970 00000190 b1 9a c2 0a e6 33 59 f8 70 18 f5 33 8c bb 9c cc .....3Y.p..3.... 00:20:33.970 000001a0 d9 e5 d2 3c dc e4 4b e1 9f a1 55 79 8b 57 e1 fb ...<..K...Uy.W.. 00:20:33.970 000001b0 d3 d7 39 56 d7 fb 00 79 0c c1 ed 88 f7 c5 f6 bb ..9V...y........ 00:20:33.970 000001c0 6b 8a 37 54 dd de 54 78 e6 b3 6f 91 fb ef 18 75 k.7T..Tx..o....u 00:20:33.970 000001d0 b4 7f 94 22 aa 8e 2e ef 30 b4 ab 5f f7 bb e6 8b ..."....0.._.... 00:20:33.970 000001e0 5c 8c 69 c1 ee 25 01 3e 30 ca 8a 1c a2 4f 87 8d \.i..%.>0....O.. 00:20:33.970 000001f0 4f 8e 14 f0 d5 04 45 c9 65 0b 6c 78 5d af df 4d O.....E.e.lx]..M 00:20:33.970 host pubkey: 00:20:33.970 00000000 93 9d d4 87 82 e5 5b c0 be cf f0 c4 1c 8c 6f f6 ......[.......o. 00:20:33.970 00000010 0e 26 0a 26 be 8b dd 62 9e 40 c8 ab 47 94 b9 2a .&.&...b.@..G..* 00:20:33.970 00000020 21 6f 68 af 21 4a f8 28 60 94 28 f6 32 41 3c 97 !oh.!J.(`.(.2A<. 00:20:33.970 00000030 47 1d b5 87 96 67 39 da 0f f8 a5 eb 87 18 ad ef G....g9......... 00:20:33.970 00000040 cb de 08 28 82 83 15 11 8e bb 31 40 10 29 5a 4a ...(......1@.)ZJ 00:20:33.970 00000050 6c 20 07 f5 c6 00 89 70 08 65 3e 5a f2 71 27 7b l .....p.e>Z.q'{ 00:20:33.970 00000060 d0 7d 05 9c 46 35 c4 53 60 48 32 91 d2 81 fd 94 .}..F5.S`H2..... 00:20:33.970 00000070 8e be ed fb 6d 1e 4a a7 f2 92 ea a1 09 d1 4e ae ....m.J.......N. 00:20:33.970 00000080 be 25 1b 92 7f 5e d4 6d 0e da 23 9f 77 7e 69 73 .%...^.m..#.w~is 00:20:33.970 00000090 71 06 fd d8 52 46 15 03 78 1f a5 8d 27 56 1b dc q...RF..x...'V.. 00:20:33.970 000000a0 8f f6 84 13 1a de 5f d1 d8 1b 6c f8 61 45 48 32 ......_...l.aEH2 00:20:33.970 000000b0 be c9 2b 53 fc a3 9a 3d 47 62 26 fe d7 02 d1 17 ..+S...=Gb&..... 00:20:33.970 000000c0 57 48 63 ac 38 cd 49 48 ea a2 aa d0 9d 7b 8a f6 WHc.8.IH.....{.. 
00:20:33.970 000000d0 5f 47 b9 e6 c7 00 40 97 f9 dc f4 82 63 e4 6c 7d _G....@.....c.l} 00:20:33.970 000000e0 7a cb fa 54 f2 2d 0a 68 a2 9d 0d 65 e5 c1 5b c0 z..T.-.h...e..[. 00:20:33.970 000000f0 d3 7d e8 13 a8 90 c7 e0 98 94 b3 e8 6c 28 97 d2 .}..........l(.. 00:20:33.970 00000100 fe 94 9d a9 42 f9 1c c3 f6 b3 ad c3 f6 23 b3 4c ....B........#.L 00:20:33.970 00000110 57 ac 4f 6e ca 8c 37 74 a6 70 ec 91 98 18 65 07 W.On..7t.p....e. 00:20:33.970 00000120 4d 61 dd a3 21 99 92 ce 9e 4c e9 6f f6 b8 3d cc Ma..!....L.o..=. 00:20:33.970 00000130 5b 1c ef 21 da f1 52 4c a6 71 ff e1 62 2e d9 9a [..!..RL.q..b... 00:20:33.970 00000140 f6 97 44 d7 28 17 99 20 3d b7 5f 99 ca 12 e4 d5 ..D.(.. =._..... 00:20:33.970 00000150 24 da 38 7d b5 fa e1 dc 1f d7 85 36 6b d3 2f af $.8}.......6k./. 00:20:33.970 00000160 bf e0 a1 67 db 90 c4 75 2f 27 df a2 c6 b3 b8 09 ...g...u/'...... 00:20:33.970 00000170 52 ed 21 22 7e aa 60 b3 db ab d6 96 f9 a8 ee 81 R.!"~.`......... 00:20:33.970 00000180 c5 28 41 df 70 ad ec 0f ab eb 4d 93 5b 8d 06 a7 .(A.p.....M.[... 00:20:33.970 00000190 3d b2 ef 87 75 66 01 4f a8 f9 be b4 e3 be 78 c0 =...uf.O......x. 00:20:33.970 000001a0 18 65 23 f0 d0 11 af ec 8c 39 3c 44 8a ad 05 aa .e#......9......L.s. 00:20:33.970 000001c0 16 ce 52 ab dc ba 85 92 4b 81 16 7a 2d e9 a4 a5 ..R.....K..z-... 00:20:33.970 000001d0 4e e7 33 c6 96 74 5b 96 c1 3e 89 b4 67 db 45 4e N.3..t[..>..g.EN 00:20:33.970 000001e0 85 ee 9d 01 5d 23 d8 34 1b b3 32 b4 d1 46 b7 97 ....]#.4..2..F.. 00:20:33.970 000001f0 40 31 3f ac 54 78 f3 c4 14 3e 58 5a b2 fd c0 f2 @1?.Tx...>XZ.... 00:20:33.970 dh secret: 00:20:33.970 00000000 69 03 26 94 bc 6a e4 76 bb 7a 7d c7 0c 3e 80 8b i.&..j.v.z}..>.. 00:20:33.970 00000010 2f fb f3 7d e5 ca 0b cb 2f ba 30 85 23 32 90 ac /..}..../.0.#2.. 00:20:33.970 00000020 c8 d8 9f e4 71 27 b9 91 3d e1 f6 19 2e b2 94 77 ....q'..=......w 00:20:33.970 00000030 d6 2f 26 2b 08 00 db b6 9e c0 e5 4b d2 75 22 01 ./&+.......K.u". 00:20:33.970 00000040 7c 42 2c 95 c0 2e 18 9b 7d 02 ea 5d c1 e0 bb 85 |B,.....}..].... 00:20:33.970 00000050 9f 13 51 2b 25 b6 8a 2c 30 6a e8 0e b9 8f 35 80 ..Q+%..,0j....5. 00:20:33.970 00000060 3f 47 ff 54 78 32 5b 79 26 41 ba 65 fa 63 06 59 ?G.Tx2[y&A.e.c.Y 00:20:33.970 00000070 69 95 ed 31 b1 1a c8 17 51 a5 30 80 0a 0e 3f 98 i..1....Q.0...?. 00:20:33.970 00000080 60 58 a4 6f af 13 b0 21 4b 31 a8 a8 02 1f 87 05 `X.o...!K1...... 00:20:33.970 00000090 a5 07 2c ca 8a 93 e2 50 b2 02 48 8b ab 83 db 35 ..,....P..H....5 00:20:33.970 000000a0 7b aa 22 24 13 32 b4 84 fb 50 cc 31 02 2e 43 76 {."$.2...P.1..Cv 00:20:33.970 000000b0 63 e9 eb 0a e3 9f 62 d0 96 8b f4 1f b4 3b 60 96 c.....b......;`. 00:20:33.970 000000c0 62 3a 07 22 96 fe 2e d3 7b 3e 10 5a 90 ff 95 a9 b:."....{>.Z.... 00:20:33.970 000000d0 57 fe e4 ab 8e d9 a3 77 e2 74 d1 25 33 2e 87 15 W......w.t.%3... 00:20:33.970 000000e0 5d 09 27 6a c2 c5 3d c5 e4 16 cd a8 fc 6b d3 f7 ].'j..=......k.. 00:20:33.970 000000f0 b6 ed 1e 88 73 ea 23 ce b9 e7 c3 d8 e6 d4 dc 9b ....s.#......... 00:20:33.970 00000100 22 85 74 ff fe db bd 22 36 0e 99 13 ff 7b ae 7b ".t...."6....{.{ 00:20:33.970 00000110 6e 6f 1e 17 2b d0 08 ce be 04 d6 d7 5d b7 af 6c no..+.......]..l 00:20:33.970 00000120 a0 b5 72 e8 46 a5 37 5b a1 71 12 64 48 28 93 69 ..r.F.7[.q.dH(.i 00:20:33.970 00000130 2a e7 ee e4 a5 b0 0a ef c7 b1 6d 99 0f 77 cb 48 *.........m..w.H 00:20:33.970 00000140 8f 5e 2e 6d f9 35 6d 87 3e cc 98 fd 2b 43 bb 4b .^.m.5m.>...+C.K 00:20:33.970 00000150 26 30 9b 0b 9f f9 e8 f9 2e e6 e8 b3 57 ea 04 c8 &0..........W... 
00:20:33.970 00000160 5b 97 9a 30 00 65 41 1a 34 36 9c f1 da 94 5e bf [..0.eA.46....^. 00:20:33.970 00000170 74 55 0a 01 8f cf f1 65 75 13 f3 b6 37 80 ba ee tU.....eu...7... 00:20:33.970 00000180 b1 df 0a 36 fd cd 1f 26 40 dc 3d 8f e0 5c 89 d7 ...6...&@.=..\.. 00:20:33.970 00000190 ed 04 57 9d 9e e4 d0 57 68 f6 a5 52 f9 45 6d b2 ..W....Wh..R.Em. 00:20:33.970 000001a0 f3 0f f0 49 b8 04 87 f2 be 52 6f 14 97 cd b4 de ...I.....Ro..... 00:20:33.970 000001b0 74 ab a5 b7 3a d7 35 20 cb be 01 d1 95 3c 73 07 t...:.5 ...... 00:20:33.970 000001d0 13 42 b8 08 37 07 c5 1e c1 ed 2a 80 0e a0 a3 48 .B..7.....*....H 00:20:33.970 000001e0 ed 74 41 17 36 0a e2 7f b6 e2 fc f7 97 85 67 64 .tA.6.........gd 00:20:33.970 000001f0 4b 8e cd 50 80 51 47 06 08 2b 1a aa c8 e5 f3 fc K..P.QG..+...... 00:20:33.970 [2024-09-27 15:25:28.809851] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=3, dhgroup=3, seq=3428451821, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.970 [2024-09-27 15:25:28.827644] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.970 [2024-09-27 15:25:28.827686] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.970 [2024-09-27 15:25:28.827705] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.970 [2024-09-27 15:25:28.827732] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.970 [2024-09-27 15:25:28.827743] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.970 [2024-09-27 15:25:28.933521] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.970 [2024-09-27 15:25:28.933539] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.970 [2024-09-27 15:25:28.933546] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 3 (ffdhe4096) 00:20:33.970 [2024-09-27 15:25:28.933556] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.970 [2024-09-27 15:25:28.933610] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.970 ctrlr pubkey: 00:20:33.970 00000000 0c 0e 3f 7e 44 7d 14 6b 53 a8 13 35 83 bc b1 26 ..?~D}.kS..5...& 00:20:33.970 00000010 ae ff 81 e8 8e 1f 4d 29 bb e1 19 16 59 42 dd 06 ......M)....YB.. 00:20:33.971 00000020 52 18 e4 fc 2d 94 62 9d e7 25 b8 ed 5f c8 26 bf R...-.b..%.._.&. 00:20:33.971 00000030 42 a9 4c d1 02 9f 7d 96 ac 87 dc 1d d1 b9 bf bf B.L...}......... 00:20:33.971 00000040 99 00 f8 45 1b 74 c6 42 a9 1f 03 83 61 75 74 9c ...E.t.B....aut. 00:20:33.971 00000050 73 ae b9 63 45 9a d0 92 53 a5 d5 91 9d ab 24 c3 s..cE...S.....$. 
00:20:33.971 00000060 1e d4 a9 3b a4 8b 57 30 00 7a e8 bc 46 c2 9d 74 ...;..W0.z..F..t 00:20:33.971 00000070 2b 1c 70 5f d2 10 60 ae 53 c9 0a 83 d7 70 e6 3e +.p_..`.S....p.> 00:20:33.971 00000080 35 22 3e f3 8a 1d 5d 83 ed 59 dd 74 27 6f d6 3d 5">...]..Y.t'o.= 00:20:33.971 00000090 53 2e ac 15 62 94 cc 43 b2 f9 67 db 30 1e 19 9b S...b..C..g.0... 00:20:33.971 000000a0 da ee b5 aa 24 b7 a4 e5 a0 37 fb 13 bf 12 47 29 ....$....7....G) 00:20:33.971 000000b0 d4 20 16 96 ea 7a bf 14 84 26 23 27 09 71 a8 3b . ...z...&#'.q.; 00:20:33.971 000000c0 03 59 ab cf 1d c5 e1 65 90 24 6a ea 8b 3f f2 31 .Y.....e.$j..?.1 00:20:33.971 000000d0 34 aa 11 ea 15 9f 8d 34 6a 04 4b 38 0e aa 41 b9 4......4j.K8..A. 00:20:33.971 000000e0 5e ae 3b 15 2e 0d f5 d2 18 73 b2 e4 9f 4d 94 01 ^.;......s...M.. 00:20:33.971 000000f0 f3 e6 61 2f 89 fc 6f c2 01 d5 d2 b6 ff e9 06 1f ..a/..o......... 00:20:33.971 00000100 ef c7 8e 28 41 79 cd 47 7f d1 7d 74 90 ab 0b 61 ...(Ay.G..}t...a 00:20:33.971 00000110 af 19 7e d5 37 e3 fc 99 8b f8 ba 60 dc 59 eb 22 ..~.7......`.Y." 00:20:33.971 00000120 c0 9c e9 0c a2 71 c6 d8 2d ae 77 6f a6 1c 3a b0 .....q..-.wo..:. 00:20:33.971 00000130 3a fb 62 66 68 5e 64 e1 c8 e7 32 be ce f0 85 d3 :.bfh^d...2..... 00:20:33.971 00000140 c9 98 df 34 b9 ba dd a7 3c f8 be 5b 8e 24 10 3d ...4....<..[.$.= 00:20:33.971 00000150 cd ab 08 18 ae 9b e3 1d 37 a0 5d 87 53 d8 8e 6e ........7.].S..n 00:20:33.971 00000160 17 c3 fa 0b ba 5a 41 93 fc a1 06 fb 07 08 60 6d .....ZA.......`m 00:20:33.971 00000170 1e e0 26 70 03 f9 99 a7 ff e3 cb 42 34 64 d3 e3 ..&p.......B4d.. 00:20:33.971 00000180 3d 0d 95 a2 2a 74 a3 04 f4 7a 45 78 cf 0a 94 49 =...*t...zEx...I 00:20:33.971 00000190 b1 9a c2 0a e6 33 59 f8 70 18 f5 33 8c bb 9c cc .....3Y.p..3.... 00:20:33.971 000001a0 d9 e5 d2 3c dc e4 4b e1 9f a1 55 79 8b 57 e1 fb ...<..K...Uy.W.. 00:20:33.971 000001b0 d3 d7 39 56 d7 fb 00 79 0c c1 ed 88 f7 c5 f6 bb ..9V...y........ 00:20:33.971 000001c0 6b 8a 37 54 dd de 54 78 e6 b3 6f 91 fb ef 18 75 k.7T..Tx..o....u 00:20:33.971 000001d0 b4 7f 94 22 aa 8e 2e ef 30 b4 ab 5f f7 bb e6 8b ..."....0.._.... 00:20:33.971 000001e0 5c 8c 69 c1 ee 25 01 3e 30 ca 8a 1c a2 4f 87 8d \.i..%.>0....O.. 00:20:33.971 000001f0 4f 8e 14 f0 d5 04 45 c9 65 0b 6c 78 5d af df 4d O.....E.e.lx]..M 00:20:33.971 host pubkey: 00:20:33.971 00000000 fb 80 bf b7 91 2c ec c8 6c 70 03 e0 79 8b 7c 42 .....,..lp..y.|B 00:20:33.971 00000010 fd 2d 30 75 d4 2b 0d 0c 5d f0 83 a3 b4 36 4c bf .-0u.+..]....6L. 00:20:33.971 00000020 fa cd 5a c1 0f 02 0e 75 54 37 b4 39 72 f3 65 36 ..Z....uT7.9r.e6 00:20:33.971 00000030 da 3f 11 d7 39 02 89 30 93 99 57 c6 60 b5 21 eb .?..9..0..W.`.!. 00:20:33.971 00000040 05 48 8b dc 57 98 2b 7f 17 cf 7b f1 ab 6f b5 ec .H..W.+...{..o.. 00:20:33.971 00000050 57 60 39 73 95 0b 6d 28 8c 86 8e 27 c9 b7 5f 05 W`9s..m(...'.._. 00:20:33.971 00000060 c5 22 95 5d 03 96 92 fd bf 41 bb 5c 1c 2a 67 51 .".].....A.\.*gQ 00:20:33.971 00000070 77 ef 0a 6b aa a2 d3 d0 15 e2 aa 01 f6 42 b3 6e w..k.........B.n 00:20:33.971 00000080 8e aa dd 84 62 c0 85 dc ee f8 81 32 9b 8b 13 26 ....b......2...& 00:20:33.971 00000090 a3 2c f7 9a f5 47 29 b1 cb 1d 8f 38 31 7a cb e1 .,...G)....81z.. 00:20:33.971 000000a0 dd 62 e4 3b 89 0c f5 48 bd 2f 0e 48 90 4b 15 d4 .b.;...H./.H.K.. 00:20:33.971 000000b0 46 e2 c1 c0 14 79 12 47 90 40 59 f6 4f 50 50 70 F....y.G.@Y.OPPp 00:20:33.971 000000c0 61 cb af 1e ce a0 20 6f ec 2b 2a 9b 66 f6 ee 4d a..... 
o.+*.f..M 00:20:33.971 000000d0 3d 20 51 dc dc a4 38 5b 4c 7d f8 b9 2e 60 94 7d = Q...8[L}...`.} 00:20:33.971 000000e0 21 ef e6 de 7b f6 b6 1b 5e 59 88 53 6a a2 31 3b !...{...^Y.Sj.1; 00:20:33.971 000000f0 a6 8f 29 27 14 40 03 12 e6 73 49 34 22 49 47 f5 ..)'.@...sI4"IG. 00:20:33.971 00000100 af e8 23 c6 0c 5f d2 48 a2 b3 ba c7 29 dc 5c 47 ..#.._.H....).\G 00:20:33.971 00000110 8b e6 65 7e c4 7d 48 cc c1 6f f1 ed 75 80 33 51 ..e~.}H..o..u.3Q 00:20:33.971 00000120 86 87 a8 56 f7 f9 4c c9 36 24 ae cf 87 d9 d9 4f ...V..L.6$.....O 00:20:33.971 00000130 85 72 23 c7 59 d7 3f 2f 68 56 d9 7a c5 8a 23 37 .r#.Y.?/hV.z..#7 00:20:33.971 00000140 bc 45 40 be 8b 47 7c c5 4c c2 ec 25 67 9f 38 58 .E@..G|.L..%g.8X 00:20:33.971 00000150 bd b1 07 92 53 b0 51 32 03 fe fe 21 16 2e a1 9a ....S.Q2...!.... 00:20:33.971 00000160 6e f3 b9 c9 24 3d 6c 2f 3d f1 0a 46 84 21 f7 6c n...$=l/=..F.!.l 00:20:33.971 00000170 4a e7 00 46 4e d1 79 9a 33 6f f1 61 82 5b ec 36 J..FN.y.3o.a.[.6 00:20:33.971 00000180 6c 6a ee d0 49 c6 37 c8 29 77 7a d9 c6 4f 28 72 lj..I.7.)wz..O(r 00:20:33.971 00000190 08 0c 79 1e 63 3f e2 0a 13 ce 7f 62 2e 30 ce 60 ..y.c?.....b.0.` 00:20:33.971 000001a0 44 70 fa 2a d1 bc fc cc 1c 19 fc 0f 87 d1 05 06 Dp.*............ 00:20:33.971 000001b0 27 1d ef 0f 1b c6 33 b5 b6 a8 b0 b3 03 08 68 18 '.....3.......h. 00:20:33.971 000001c0 40 6e 22 6e d9 76 aa 62 67 6d 88 40 38 83 ff 19 @n"n.v.bgm.@8... 00:20:33.971 000001d0 8c d4 a3 82 4f 48 93 27 44 e7 0b e7 ad a7 48 9d ....OH.'D.....H. 00:20:33.971 000001e0 41 80 81 9d 13 21 1a fe 52 f0 12 a2 b5 f2 c8 97 A....!..R....... 00:20:33.971 000001f0 23 3c 37 4f 29 73 79 85 84 02 c0 ba c0 b8 f1 75 #<7O)sy........u 00:20:33.971 dh secret: 00:20:33.971 00000000 ea 45 47 f6 d2 7e 84 70 6e 9c dc 6a 38 c5 ad e1 .EG..~.pn..j8... 00:20:33.971 00000010 6c d1 d8 32 d7 93 f3 7d 65 01 be 74 39 f2 b5 6d l..2...}e..t9..m 00:20:33.971 00000020 6a 4b 78 49 f3 0d a2 1d 7e 57 2f 71 85 75 3b fa jKxI....~W/q.u;. 00:20:33.971 00000030 d7 fb cc 88 ee 9a 96 3b de 52 8e 0d 87 79 e5 f3 .......;.R...y.. 00:20:33.971 00000040 07 7c df 5c 9c ac 89 c6 06 a0 ce cb 1b 22 6d 94 .|.\........."m. 00:20:33.971 00000050 bb b1 c4 b3 ad 84 64 b5 51 9b 89 a3 47 40 00 c6 ......d.Q...G@.. 00:20:33.971 00000060 9d b2 f5 0a e7 9b 45 be d3 00 62 64 e1 fd 6e 47 ......E...bd..nG 00:20:33.971 00000070 bb 14 e9 00 ca 2f ab e5 9a da a2 66 c2 65 13 dd ...../.....f.e.. 00:20:33.971 00000080 c1 f8 36 74 eb 39 b6 27 45 c6 ac d7 e9 89 18 e1 ..6t.9.'E....... 00:20:33.971 00000090 f5 e2 40 aa 7c ad 03 8d 90 27 ad df 3e 08 2a b2 ..@.|....'..>.*. 00:20:33.971 000000a0 65 8d bf 00 b3 be 23 82 11 cb 0a bf 85 5f 94 35 e.....#......_.5 00:20:33.971 000000b0 f1 99 6e a1 31 21 52 09 7e 19 61 7c b7 6a b3 6c ..n.1!R.~.a|.j.l 00:20:33.971 000000c0 03 fb e2 e2 f8 76 3c 38 2c 76 88 59 38 c4 dc ab .....v<8,v.Y8... 00:20:33.971 000000d0 71 80 00 8c ba 14 38 4a f2 12 66 49 27 67 f9 5d q.....8J..fI'g.] 00:20:33.971 000000e0 e4 a4 2b 45 99 78 44 46 30 85 3e f3 ea fc 99 64 ..+E.xDF0.>....d 00:20:33.971 000000f0 59 ab 65 5f f6 e5 d6 b2 45 3a 17 13 53 99 41 82 Y.e_....E:..S.A. 00:20:33.971 00000100 37 77 94 9c 43 46 68 bf 47 6d 66 71 a2 70 80 fc 7w..CFh.Gmfq.p.. 00:20:33.971 00000110 9c cd 75 67 e7 fb 21 d8 95 95 27 b1 9c 70 c6 3f ..ug..!...'..p.? 
00:20:33.971 00000120 58 e1 a0 a2 fe fd d7 47 44 1e 0e ec 9f cc b8 6f X......GD......o 00:20:33.971 00000130 dd 8a 2f 97 6d a3 9e 59 49 73 82 44 c6 33 32 73 ../.m..YIs.D.32s 00:20:33.971 00000140 98 17 8b af 54 49 15 b0 f5 24 1e 78 09 8c a6 62 ....TI...$.x...b 00:20:33.971 00000150 7e a8 35 20 d2 1f bd fb ab 41 4d 32 ce 0b 8b 3a ~.5 .....AM2...: 00:20:33.971 00000160 1a bb 67 45 9f 6a 87 3a 4a e2 a4 9e b0 38 e9 79 ..gE.j.:J....8.y 00:20:33.971 00000170 46 e2 91 f3 d2 78 ae 2c 5f 96 51 3c b8 dd f5 3e F....x.,_.Q<...> 00:20:33.971 00000180 b5 76 20 f9 ea 9e a6 c5 c5 36 07 ec 18 cc 68 92 .v ......6....h. 00:20:33.971 00000190 c2 7b e2 ee 66 e3 2f ae 56 2b 0e b6 3d 31 c5 df .{..f./.V+..=1.. 00:20:33.971 000001a0 48 13 b1 99 f6 28 b0 16 cc c6 80 87 50 b0 c0 d4 H....(......P... 00:20:33.971 000001b0 02 1c 97 84 8e 41 59 4d fb a0 27 94 25 1b 67 37 .....AYM..'.%.g7 00:20:33.971 000001c0 88 51 ea 8c 42 98 1d 63 58 fa 80 03 0e c4 4e 37 .Q..B..cX.....N7 00:20:33.971 000001d0 b1 ff fc 30 a4 b9 79 35 51 97 c8 b2 0f 10 65 aa ...0..y5Q.....e. 00:20:33.971 000001e0 b5 b1 eb 15 a1 83 d8 ef a3 5b bf 33 11 87 9e ad .........[.3.... 00:20:33.971 000001f0 06 ff 9b 72 ad b5 78 b4 b9 82 a5 b6 fa 85 e7 93 ...r..x......... 00:20:33.971 [2024-09-27 15:25:28.950208] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=3, dhgroup=3, seq=3428451822, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.971 [2024-09-27 15:25:28.950304] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.971 [2024-09-27 15:25:28.986447] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.971 [2024-09-27 15:25:28.986485] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.971 [2024-09-27 15:25:28.986496] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.971 [2024-09-27 15:25:28.986538] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.971 [2024-09-27 15:25:29.148012] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.971 [2024-09-27 15:25:29.148034] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.971 [2024-09-27 15:25:29.148041] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 3 (ffdhe4096) 00:20:33.971 [2024-09-27 15:25:29.148088] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.971 [2024-09-27 15:25:29.148112] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.971 ctrlr pubkey: 00:20:33.971 00000000 47 84 f6 86 98 05 d4 3d bd d1 b5 32 4a ed 9f 0f G......=...2J... 00:20:33.971 00000010 e6 fa d5 01 92 07 d6 e9 2c 36 97 a1 af 17 76 ee ........,6....v. 
00:20:33.971 00000020 3d 22 60 f9 f2 79 61 eb 54 32 8c f7 84 b0 47 62 ="`..ya.T2....Gb 00:20:33.971 00000030 ca 2d 48 0c 6d 44 0a db aa 42 19 dd 1f b7 0b d7 .-H.mD...B...... 00:20:33.971 00000040 64 f8 9b 28 fa e2 a4 f8 e0 ba 79 6c f4 2a 17 26 d..(......yl.*.& 00:20:33.971 00000050 0e 30 e6 2e f6 4a a5 ee dd b1 01 85 c0 22 5e 89 .0...J......."^. 00:20:33.971 00000060 60 79 f2 3e 9a ef 41 b8 e2 2c 96 25 6f d7 91 43 `y.>..A..,.%o..C 00:20:33.971 00000070 50 c7 20 d5 cb f4 1e af 62 28 24 cf 92 05 ae 2f P. .....b($..../ 00:20:33.971 00000080 25 48 92 0f 95 3c ac d6 64 f1 ab 2a c9 01 ee a1 %H...<..d..*.... 00:20:33.971 00000090 0d cd 3f c1 35 69 3c 1b 7e 76 9f 17 25 9d 0c 24 ..?.5i<.~v..%..$ 00:20:33.971 000000a0 52 70 2c 6f 19 14 e4 c0 80 e9 1e b2 7d 73 35 9c Rp,o........}s5. 00:20:33.971 000000b0 eb 39 68 59 08 ea 7b 05 19 ff 5c 9e df d4 3b db .9hY..{...\...;. 00:20:33.971 000000c0 17 c6 d9 a6 9d 71 f6 b5 4a fd 10 43 d9 81 d8 f9 .....q..J..C.... 00:20:33.971 000000d0 1e 74 30 78 51 62 52 fc eb 47 e6 3d 74 c0 a6 5f .t0xQbR..G.=t.._ 00:20:33.971 000000e0 7c 9e b1 08 a4 ef 7d 55 ce 42 2d 52 30 41 43 49 |.....}U.B-R0ACI 00:20:33.971 000000f0 0c 1e 6c 98 bd 57 65 50 7e f7 7b 5a 99 62 ba 8f ..l..WeP~.{Z.b.. 00:20:33.972 00000100 db 58 1d d8 d8 c3 58 3a 50 8f 18 39 f1 9a 6d 4b .X....X:P..9..mK 00:20:33.972 00000110 7a 1f d3 c8 41 b0 8e 33 5e 47 bf 70 2a bb 14 a2 z...A..3^G.p*... 00:20:33.972 00000120 d9 57 42 d1 6d fc c1 d8 97 b6 d9 20 30 b3 47 2f .WB.m...... 0.G/ 00:20:33.972 00000130 64 24 18 13 50 dc d9 2b ce c5 db 69 d2 57 fa 94 d$..P..+...i.W.. 00:20:33.972 00000140 15 9e e0 7a 38 a8 9c 06 be 2f 9a 3f a4 f4 73 a4 ...z8..../.?..s. 00:20:33.972 00000150 6e 05 f0 22 b8 11 10 7a 67 88 2e 55 ed 5d 5f 2b n.."...zg..U.]_+ 00:20:33.972 00000160 6b 44 78 dc 32 c6 65 d6 0c 55 c5 5c f3 7f 83 ff kDx.2.e..U.\.... 00:20:33.972 00000170 6d a2 f6 60 a4 da 73 b2 97 83 5d 04 8b 34 fc 79 m..`..s...]..4.y 00:20:33.972 00000180 65 7a 10 23 90 ac 0d e5 bb 59 a0 9c 69 87 b0 e8 ez.#.....Y..i... 00:20:33.972 00000190 01 b1 bf 47 68 5f 7f 46 63 7e af 7c 88 44 06 bd ...Gh_.Fc~.|.D.. 00:20:33.972 000001a0 ee 77 06 eb 2b 3f c5 1a f9 10 4f c6 84 d4 24 1c .w..+?....O...$. 00:20:33.972 000001b0 15 db f3 1d 4d 13 a9 12 ef 24 4e ec 39 2d 9d c6 ....M....$N.9-.. 00:20:33.972 000001c0 70 48 4e 57 6a 49 87 5a f4 e3 3a 3a 0c f2 34 19 pHNWjI.Z..::..4. 00:20:33.972 000001d0 58 8e ee 49 a5 09 61 31 23 7e 52 8a 05 73 42 92 X..I..a1#~R..sB. 00:20:33.972 000001e0 cd 4d 4a 7f 94 c0 e7 96 62 f9 55 cf 83 fd b6 4c .MJ.....b.U....L 00:20:33.972 000001f0 2e c0 e0 a8 e1 05 3b ad b8 63 39 d9 63 9b e6 ab ......;..c9.c... 00:20:33.972 host pubkey: 00:20:33.972 00000000 75 fc 55 26 d1 03 46 c9 00 c6 e6 05 88 a8 76 57 u.U&..F.......vW 00:20:33.972 00000010 b8 ba e2 ef 4d c2 0f 61 fd 70 51 1e 74 a6 0a 1f ....M..a.pQ.t... 00:20:33.972 00000020 99 68 05 f1 da ca 14 09 17 ab af 96 65 0c bf 0f .h..........e... 00:20:33.972 00000030 46 a4 f3 96 c9 b8 71 29 f4 4a e1 6c da 40 89 3d F.....q).J.l.@.= 00:20:33.972 00000040 11 9d 4e a8 c2 f0 1f e7 86 e0 8e 48 6f 32 d9 5a ..N........Ho2.Z 00:20:33.972 00000050 a0 f4 6c 38 96 3c 9a 6a 2a cf d0 cb 13 53 0f f6 ..l8.<.j*....S.. 00:20:33.972 00000060 db 08 41 d1 a8 ff 2a 16 b6 5e 8f d0 30 55 c5 87 ..A...*..^..0U.. 00:20:33.972 00000070 f2 95 86 f6 9f 6e 78 ac 00 a6 b7 09 11 5f f1 80 .....nx......_.. 00:20:33.972 00000080 ac 2d 91 53 1a 49 3a 63 36 73 28 c0 6e 74 a5 2b .-.S.I:c6s(.nt.+ 00:20:33.972 00000090 89 91 d0 95 9d ef 02 29 ec 45 64 f4 58 a9 56 97 .......).Ed.X.V. 
00:20:33.972 000000a0 80 35 17 77 95 63 77 c9 bf 6c 4e 4f ea c1 dd b7 .5.w.cw..lNO.... 00:20:33.972 000000b0 cb d6 f8 be 8f 3e 1c e2 17 57 9b ca 0b ec 15 9c .....>...W...... 00:20:33.972 000000c0 b2 96 ef 07 87 9d 50 02 96 84 3b 69 bb 9d e4 19 ......P...;i.... 00:20:33.972 000000d0 cf b4 d9 78 2b a5 30 7d e5 61 86 fa 46 2a 1c 61 ...x+.0}.a..F*.a 00:20:33.972 000000e0 4a 26 89 9e ef 71 03 9e eb 4c 1e 44 6c e7 3a 4c J&...q...L.Dl.:L 00:20:33.972 000000f0 47 c1 7e 61 66 75 cc 11 4f e2 aa cb b1 72 05 0c G.~afu..O....r.. 00:20:33.972 00000100 f6 6e bd 39 c0 93 43 64 7f df ed 6c 9d de f8 a6 .n.9..Cd...l.... 00:20:33.972 00000110 7b 43 0e 0b 86 e2 64 7d 27 67 86 ef dd 53 e1 4b {C....d}'g...S.K 00:20:33.972 00000120 f3 00 51 5c b2 59 7d 15 9c e5 04 8f 7c 1d 2c 91 ..Q\.Y}.....|.,. 00:20:33.972 00000130 5b d8 bf e8 ba b5 5c c6 88 fe a4 ac 81 8f 7e f3 [.....\.......~. 00:20:33.972 00000140 51 91 20 e4 b8 0f 8d b6 ca a0 ff 9b cc 89 26 81 Q. ...........&. 00:20:33.972 00000150 1d e5 00 31 5b 75 3b e6 4c 7f ed e8 5a 12 e8 4e ...1[u;.L...Z..N 00:20:33.972 00000160 6a 32 a6 63 26 a2 4d 28 89 63 ee e3 48 f4 16 2e j2.c&.M(.c..H... 00:20:33.972 00000170 e3 d4 57 84 be cc ad d1 f1 a9 61 b1 6e 79 04 ef ..W.......a.ny.. 00:20:33.972 00000180 11 25 c2 23 6d 16 66 60 15 6b 19 52 cd 7b 59 ac .%.#m.f`.k.R.{Y. 00:20:33.972 00000190 63 a0 68 5e df b0 e1 20 8f 5e e4 b1 6f 34 32 af c.h^... .^..o42. 00:20:33.972 000001a0 e0 8b 3e c7 64 24 15 fa 53 d1 d2 71 3f 3e cd 54 ..>.d$..S..q?>.T 00:20:33.972 000001b0 80 f0 e1 85 2e 4a c4 68 c2 fa 33 08 44 ee ba 2c .....J.h..3.D.., 00:20:33.972 000001c0 e5 2f aa e7 9b df 1d 8b f7 1c 46 7a 53 57 7b 00 ./........FzSW{. 00:20:33.972 000001d0 eb 77 9c 84 a9 fe 0c fc 9f bc b9 f7 99 96 74 b6 .w............t. 00:20:33.972 000001e0 06 96 4c 88 75 d2 91 5d ce c4 98 a3 36 62 86 81 ..L.u..]....6b.. 00:20:33.972 000001f0 6b bd 6c 25 a8 0e bb 58 d5 09 9e 61 23 a7 ea 37 k.l%...X...a#..7 00:20:33.972 dh secret: 00:20:33.972 00000000 79 f5 01 38 a0 aa 4d a1 a7 6c f2 cf 16 d0 54 01 y..8..M..l....T. 00:20:33.972 00000010 3b 52 3a 07 fb 72 86 1b 5c fb fd fb 8e b3 86 13 ;R:..r..\....... 00:20:33.972 00000020 f1 af ec 59 64 73 fe dc cf e1 da e4 79 18 62 bb ...Yds......y.b. 00:20:33.972 00000030 99 20 f8 61 a4 84 4f 44 4c b3 15 d7 c2 87 cf 4f . .a..ODL......O 00:20:33.972 00000040 a6 f5 27 72 90 f4 57 44 c0 1b da 9b c0 cb 79 c1 ..'r..WD......y. 00:20:33.972 00000050 45 48 2b 94 5f d2 1e ca a8 98 85 36 bb 39 9d 81 EH+._......6.9.. 00:20:33.972 00000060 80 34 88 bb a6 e1 d4 12 04 2c cc ff 9d d3 3a 48 .4.......,....:H 00:20:33.972 00000070 7f 62 ac 8d d8 b8 d8 93 c2 33 4f 97 bc 4b fe 79 .b.......3O..K.y 00:20:33.972 00000080 a6 04 d7 19 9c aa 76 33 ba b1 d1 6e 5d bc d5 a8 ......v3...n]... 00:20:33.972 00000090 84 57 83 be e6 87 f5 6f e7 e6 6d 77 14 58 c2 a3 .W.....o..mw.X.. 00:20:33.972 000000a0 2a 6c 61 d5 b5 ea 46 f0 2a 47 c8 0e b3 85 76 f1 *la...F.*G....v. 00:20:33.972 000000b0 07 52 ed 94 49 72 03 2f 2c d7 0b f4 a3 a7 ac d4 .R..Ir./,....... 00:20:33.972 000000c0 4d 48 b7 7b b1 63 ea 7d 22 67 95 3c ec 9a 83 3c MH.{.c.}"g.<...< 00:20:33.972 000000d0 fb 3c 79 7a 7c 7c 5d dd 02 58 06 45 45 b4 fd 9d .... 00:20:33.972 00000100 26 74 0b e5 ff 06 1d 0e 46 2c 61 f7 36 14 c3 80 &t......F,a.6... 00:20:33.972 00000110 e4 9f 9c 49 cf ff 72 64 68 f3 92 29 8f 67 c6 bd ...I..rdh..).g.. 00:20:33.972 00000120 7a c1 0a 6d c1 df 10 b9 92 6e ef 8a 57 d3 bd bd z..m.....n..W... 
00:20:33.972 00000130 7a 13 45 10 74 f6 a2 a2 ff 8f c2 fa be 9a b3 44 z.E.t..........D 00:20:33.972 00000140 a4 24 11 5b 3b db 74 cd f3 4d 27 f9 69 d0 28 4b .$.[;.t..M'.i.(K 00:20:33.972 00000150 c1 26 de 23 aa 2f ab f5 f4 f5 e4 73 c2 36 81 77 .&.#./.....s.6.w 00:20:33.972 00000160 20 ee 59 00 c5 b3 4b 51 e0 b6 fc d9 c9 09 4e 69 .Y...KQ......Ni 00:20:33.972 00000170 93 d2 3b be ce e7 dd 78 c4 98 5b b0 f7 29 c8 6f ..;....x..[..).o 00:20:33.972 00000180 38 53 5a 1d 02 95 69 fe 3e 83 3c 80 e3 48 f3 3d 8SZ...i.>.<..H.= 00:20:33.972 00000190 65 ab b0 c5 94 09 ff 3f be b3 d4 e1 2d cc 3f 9a e......?....-.?. 00:20:33.972 000001a0 1b 39 2c 87 8f 73 ad c2 65 c3 e0 25 ae f8 0c 04 .9,..s..e..%.... 00:20:33.972 000001b0 75 ca 4e b5 18 4f 25 71 6b 98 b0 b7 b7 14 14 14 u.N..O%qk....... 00:20:33.972 000001c0 85 da 78 45 62 2e d3 37 a7 ee 01 4d ad f3 bf 57 ..xEb..7...M...W 00:20:33.972 000001d0 17 5a 1b dc 84 da ff fc 0c 3c 55 b4 a8 84 d8 7d .Z.........A..,.%o..C 00:20:33.972 00000070 50 c7 20 d5 cb f4 1e af 62 28 24 cf 92 05 ae 2f P. .....b($..../ 00:20:33.972 00000080 25 48 92 0f 95 3c ac d6 64 f1 ab 2a c9 01 ee a1 %H...<..d..*.... 00:20:33.972 00000090 0d cd 3f c1 35 69 3c 1b 7e 76 9f 17 25 9d 0c 24 ..?.5i<.~v..%..$ 00:20:33.972 000000a0 52 70 2c 6f 19 14 e4 c0 80 e9 1e b2 7d 73 35 9c Rp,o........}s5. 00:20:33.972 000000b0 eb 39 68 59 08 ea 7b 05 19 ff 5c 9e df d4 3b db .9hY..{...\...;. 00:20:33.972 000000c0 17 c6 d9 a6 9d 71 f6 b5 4a fd 10 43 d9 81 d8 f9 .....q..J..C.... 00:20:33.972 000000d0 1e 74 30 78 51 62 52 fc eb 47 e6 3d 74 c0 a6 5f .t0xQbR..G.=t.._ 00:20:33.972 000000e0 7c 9e b1 08 a4 ef 7d 55 ce 42 2d 52 30 41 43 49 |.....}U.B-R0ACI 00:20:33.972 000000f0 0c 1e 6c 98 bd 57 65 50 7e f7 7b 5a 99 62 ba 8f ..l..WeP~.{Z.b.. 00:20:33.972 00000100 db 58 1d d8 d8 c3 58 3a 50 8f 18 39 f1 9a 6d 4b .X....X:P..9..mK 00:20:33.972 00000110 7a 1f d3 c8 41 b0 8e 33 5e 47 bf 70 2a bb 14 a2 z...A..3^G.p*... 00:20:33.972 00000120 d9 57 42 d1 6d fc c1 d8 97 b6 d9 20 30 b3 47 2f .WB.m...... 0.G/ 00:20:33.972 00000130 64 24 18 13 50 dc d9 2b ce c5 db 69 d2 57 fa 94 d$..P..+...i.W.. 00:20:33.972 00000140 15 9e e0 7a 38 a8 9c 06 be 2f 9a 3f a4 f4 73 a4 ...z8..../.?..s. 00:20:33.972 00000150 6e 05 f0 22 b8 11 10 7a 67 88 2e 55 ed 5d 5f 2b n.."...zg..U.]_+ 00:20:33.972 00000160 6b 44 78 dc 32 c6 65 d6 0c 55 c5 5c f3 7f 83 ff kDx.2.e..U.\.... 00:20:33.972 00000170 6d a2 f6 60 a4 da 73 b2 97 83 5d 04 8b 34 fc 79 m..`..s...]..4.y 00:20:33.972 00000180 65 7a 10 23 90 ac 0d e5 bb 59 a0 9c 69 87 b0 e8 ez.#.....Y..i... 00:20:33.972 00000190 01 b1 bf 47 68 5f 7f 46 63 7e af 7c 88 44 06 bd ...Gh_.Fc~.|.D.. 00:20:33.972 000001a0 ee 77 06 eb 2b 3f c5 1a f9 10 4f c6 84 d4 24 1c .w..+?....O...$. 00:20:33.972 000001b0 15 db f3 1d 4d 13 a9 12 ef 24 4e ec 39 2d 9d c6 ....M....$N.9-.. 00:20:33.972 000001c0 70 48 4e 57 6a 49 87 5a f4 e3 3a 3a 0c f2 34 19 pHNWjI.Z..::..4. 00:20:33.972 000001d0 58 8e ee 49 a5 09 61 31 23 7e 52 8a 05 73 42 92 X..I..a1#~R..sB. 00:20:33.973 000001e0 cd 4d 4a 7f 94 c0 e7 96 62 f9 55 cf 83 fd b6 4c .MJ.....b.U....L 00:20:33.973 000001f0 2e c0 e0 a8 e1 05 3b ad b8 63 39 d9 63 9b e6 ab ......;..c9.c... 00:20:33.973 host pubkey: 00:20:33.973 00000000 a2 a6 ec fb 2e d7 c7 22 72 b8 87 99 ef 4a fb 95 ......."r....J.. 00:20:33.973 00000010 6c 81 0d ec f4 46 a7 88 f4 1b 1e 37 80 50 2f 83 l....F.....7.P/. 00:20:33.973 00000020 d3 32 bb 06 9d b0 af 5d 8c 92 e2 11 fb 1f d4 30 .2.....].......0 00:20:33.973 00000030 0e f3 10 c5 8f 5c e1 a1 26 2f d7 15 a4 a9 bf 84 .....\..&/...... 
00:20:33.973 00000040 1e 9d 05 80 63 43 96 4b 79 82 5d 06 2d 5b fe 7b ....cC.Ky.].-[.{ 00:20:33.973 00000050 37 c4 9c a7 9f e0 f9 c7 25 18 bf 0c 35 b3 66 cf 7.......%...5.f. 00:20:33.973 00000060 1d 38 37 41 06 d8 05 11 be eb 70 e3 35 23 d3 78 .87A......p.5#.x 00:20:33.973 00000070 63 04 2e 95 ce 7c 9d 26 d5 c6 4a c7 1c c0 cd c1 c....|.&..J..... 00:20:33.973 00000080 bd 21 9c 0e 03 ae 83 24 60 3f 2e a5 7c 45 ea 95 .!.....$`?..|E.. 00:20:33.973 00000090 cd 75 a7 3f b0 5e da a9 80 68 e8 72 60 f5 1b 24 .u.?.^...h.r`..$ 00:20:33.973 000000a0 14 ca ec 6b ed 88 50 8c cc 15 4e 7b 19 35 50 72 ...k..P...N{.5Pr 00:20:33.973 000000b0 13 1f 09 d1 af af b4 f9 3c 3c 22 55 ed 7f 6f 33 ........<<"U..o3 00:20:33.973 000000c0 a3 9c ba 45 cf e1 97 06 7d 22 33 86 59 10 15 2a ...E....}"3.Y..* 00:20:33.973 000000d0 28 49 2a b4 21 6f 1c 4d 14 48 8b ce ec 90 e7 4f (I*.!o.M.H.....O 00:20:33.973 000000e0 34 36 21 a5 f0 38 fa b4 ee bf c6 bf cb 47 64 18 46!..8.......Gd. 00:20:33.973 000000f0 cd ef 9c 4a 50 6d dd e4 af c6 a9 cb e2 fd ab c7 ...JPm.......... 00:20:33.973 00000100 55 01 c5 e7 cc 90 25 51 75 de ce 27 6d 8c 73 99 U.....%Qu..'m.s. 00:20:33.973 00000110 03 32 db 79 33 bf e7 cb f9 62 94 29 cf 1e ae 84 .2.y3....b.).... 00:20:33.973 00000120 a4 68 9f 2e 16 a6 66 f5 93 37 82 2a a1 71 ce 3d .h....f..7.*.q.= 00:20:33.973 00000130 61 8e 77 a2 27 49 5b 9e bf 33 81 47 c6 51 3e c8 a.w.'I[..3.G.Q>. 00:20:33.973 00000140 f7 36 d5 af 5f ae 68 c2 8a 3f 90 2a 17 8e 46 78 .6.._.h..?.*..Fx 00:20:33.973 00000150 e6 cb 20 16 75 e0 cc 16 12 dc 1d c0 6a 8b f1 d8 .. .u.......j... 00:20:33.973 00000160 e3 86 e1 61 5e 0d cb ad f6 fb f5 bb a7 7c 35 b2 ...a^........|5. 00:20:33.973 00000170 09 42 50 1d 0a 61 61 e4 14 77 ae 36 17 7d 6e c3 .BP..aa..w.6.}n. 00:20:33.973 00000180 a9 ba f8 84 b2 6d 44 d1 cc a0 3a 77 83 a5 f4 da .....mD...:w.... 00:20:33.973 00000190 4b 57 a2 b1 c8 af 24 a6 8d 51 e4 df 4f 70 8f b1 KW....$..Q..Op.. 00:20:33.973 000001a0 74 aa 53 12 ce 5b 36 a5 71 24 70 00 3b 9f d5 87 t.S..[6.q$p.;... 00:20:33.973 000001b0 4c 2e 8f 69 51 e4 f7 63 7b 7f 45 df 1a 5e 30 42 L..iQ..c{.E..^0B 00:20:33.973 000001c0 e8 f0 1f 46 7b e9 7d fb 82 6f be 3f 2c 28 11 1a ...F{.}..o.?,(.. 00:20:33.973 000001d0 8d 1e 8a bf 48 76 df 78 2a c4 6e 29 6b 6c 9a 6a ....Hv.x*.n)kl.j 00:20:33.973 000001e0 f9 78 7e c7 07 29 02 08 a3 85 f6 2a 27 05 79 90 .x~..).....*'.y. 00:20:33.973 000001f0 03 f7 55 7c 9f f0 c0 fc a1 12 82 c1 31 12 71 14 ..U|........1.q. 00:20:33.973 dh secret: 00:20:33.973 00000000 e1 00 0d 24 f6 34 be 75 97 e0 72 5f 4f 01 70 cd ...$.4.u..r_O.p. 00:20:33.973 00000010 12 88 b0 80 d2 3f 65 e9 15 31 b0 05 ee 8e 44 14 .....?e..1....D. 00:20:33.973 00000020 fd 8c 57 c1 c6 ba 4c 7d 75 eb 45 cd 28 a8 38 72 ..W...L}u.E.(.8r 00:20:33.973 00000030 8a d5 36 1d 75 7b 19 2b bc 3d 4f 06 33 8e 92 4e ..6.u{.+.=O.3..N 00:20:33.973 00000040 e5 d9 f7 03 14 91 f4 25 15 9e 33 d3 5b d7 7c 8e .......%..3.[.|. 00:20:33.973 00000050 20 d0 ec e2 47 19 43 eb 31 dc 66 ae bc 75 60 d8 ...G.C.1.f..u`. 00:20:33.973 00000060 a4 1f e4 3f 50 24 37 3e a0 d3 9d ff b2 5c 57 58 ...?P$7>.....\WX 00:20:33.973 00000070 f3 c5 b8 b6 c9 b9 53 b4 ec 74 21 c7 71 3a 31 ed ......S..t!.q:1. 00:20:33.973 00000080 a7 0a b1 4b a3 0a c8 ba 29 66 9c ff a2 d5 f7 c1 ...K....)f...... 00:20:33.973 00000090 95 19 67 c0 f3 0b ef 16 18 92 b9 0f a9 c3 97 70 ..g............p 00:20:33.973 000000a0 87 88 e6 ac 8b d9 34 2e eb 86 36 ae bc 64 40 05 ......4...6..d@. 
00:20:33.973 000000b0 54 19 a2 b0 fc 30 b5 02 35 3f bd bb b8 f8 3d 2d T....0..5?....=- 00:20:33.973 000000c0 84 c1 c4 df e0 67 69 75 0f 5d ff 90 52 4f 48 c5 .....giu.]..ROH. 00:20:33.973 000000d0 71 0a 3f 98 ee bd 1a 45 f2 19 ed af 2a 26 5b 6f q.?....E....*&[o 00:20:33.973 000000e0 33 3b fd 3e 5a 53 d0 67 dd 3d f1 f4 5c 37 30 92 3;.>ZS.g.=..\70. 00:20:33.973 000000f0 38 95 0c f7 45 76 ab 57 db c1 01 8a 8d 87 79 69 8...Ev.W......yi 00:20:33.973 00000100 41 ab 96 64 1f 80 60 bd c7 2a a8 d0 90 4b da 99 A..d..`..*...K.. 00:20:33.973 00000110 ce ec 55 c1 1e 99 35 62 30 54 82 e9 e3 c1 8c 7b ..U...5b0T.....{ 00:20:33.973 00000120 8b 74 ba b4 f7 7d fb 78 cb 11 5b 0d 2c 01 01 61 .t...}.x..[.,..a 00:20:33.973 00000130 e8 a1 68 aa ed 88 92 2a e8 01 d2 6a cd 11 28 42 ..h....*...j..(B 00:20:33.973 00000140 37 40 c8 18 0e 38 8c 15 a1 c5 cd d8 28 cb 4c 76 7@...8......(.Lv 00:20:33.973 00000150 19 38 47 34 d0 0a 87 1a db 4e f4 6b af c4 f3 1f .8G4.....N.k.... 00:20:33.973 00000160 a7 40 19 02 a2 f9 6b 46 26 cd c9 64 ac 15 fe 47 .@....kF&..d...G 00:20:33.973 00000170 b9 74 8f 2f 17 8d 12 4d 31 15 c0 a3 72 51 06 aa .t./...M1...rQ.. 00:20:33.973 00000180 37 71 61 ba 94 81 11 fc 4b 84 37 2f 11 37 8f ca 7qa.....K.7/.7.. 00:20:33.973 00000190 3c 9b 62 5f cc 53 47 41 85 02 14 d8 3b f3 3c 14 <.b_.SGA....;.<. 00:20:33.973 000001a0 11 ca 03 ee fb d3 38 be 9d c3 79 b8 a2 ef b7 84 ......8...y..... 00:20:33.973 000001b0 5a 02 54 eb c1 63 86 69 be 53 19 e8 94 60 00 4f Z.T..c.i.S...`.O 00:20:33.973 000001c0 d5 90 3c 2b 47 7a 89 49 fb 52 b5 3c b1 da b6 8c ..<+Gz.I.R.<.... 00:20:33.973 000001d0 c6 d4 a5 93 47 8c e1 01 d5 24 60 5a 62 3d 95 dd ....G....$`Zb=.. 00:20:33.973 000001e0 f4 67 1b bd 17 a1 76 f8 c7 be 68 81 7e b0 bb 56 .g....v...h.~..V 00:20:33.973 000001f0 89 d6 4f 9e 2b 79 d4 94 60 37 6c 99 7a 05 b0 ff ..O.+y..`7l.z... 
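The "ctrlr pubkey", "host pubkey" and "dh secret" dumps in this trace are the finite-field Diffie-Hellman exchange that DH-HMAC-CHAP performs before the challenge/response is checked, and the auth state lines follow the negotiate/challenge/reply/success sequence for each connection. A minimal sketch of that key agreement is below; it is not SPDK code, and it assumes a toy 64-bit modulus for brevity rather than the ffdhe4096 group (dhgroup 3) negotiated in this run, whose RFC 7919 prime is 4096 bits wide and is why each dumped secret is 512 bytes long.

    # Sketch only -- not SPDK's implementation. Illustrates the DH step behind
    # the "ctrlr pubkey" / "host pubkey" / "dh secret" dumps above.
    import secrets

    p = 0xFFFFFFFFFFFFFFC5   # toy 64-bit modulus (assumption), NOT the ffdhe4096 prime
    g = 2                    # generator used by the RFC 7919 ffdhe groups

    host_priv  = secrets.randbelow(p - 2) + 1   # host's ephemeral exponent
    ctrlr_priv = secrets.randbelow(p - 2) + 1   # controller's ephemeral exponent

    host_pub  = pow(g, host_priv, p)    # host's DH value, carried in its reply
    ctrlr_pub = pow(g, ctrlr_priv, p)   # controller's DH value, sent with its challenge

    # Both ends arrive at the same value; the log prints it as "dh secret"
    # before computing the HMAC response for the configured key (key0..key4).
    shared_host  = pow(ctrlr_pub, host_priv, p)
    shared_ctrlr = pow(host_pub, ctrlr_priv, p)
    assert shared_host == shared_ctrlr

With the real ffdhe4096 parameters the same arithmetic produces the 512-byte shared secret seen in the dumps; the subsequent nvme_auth_send_reply lines then record which key, hash (3 = sha512) and dhgroup were used for the response.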
00:20:33.973 [2024-09-27 15:25:29.302070] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=3, dhgroup=3, seq=3428451824, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.973 [2024-09-27 15:25:29.302143] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.973 [2024-09-27 15:25:29.337743] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.973 [2024-09-27 15:25:29.337774] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.973 [2024-09-27 15:25:29.337781] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.973 [2024-09-27 15:25:29.521752] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.973 [2024-09-27 15:25:29.521772] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.973 [2024-09-27 15:25:29.521779] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.973 [2024-09-27 15:25:29.521828] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.973 [2024-09-27 15:25:29.521851] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.973 ctrlr pubkey: 00:20:33.973 00000000 a7 1e ec 24 1c 7e 33 14 39 a9 85 e2 d8 5c 6d 9d ...$.~3.9....\m. 00:20:33.973 00000010 b2 9f 57 8e 5a ce 56 3c 7b 00 a8 6a f8 01 5b 18 ..W.Z.V<{..j..[. 00:20:33.973 00000020 0d 50 e5 23 82 f5 12 4a 7c 84 4d 0c 95 f8 6f 08 .P.#...J|.M...o. 00:20:33.973 00000030 cd f5 a5 b5 f7 c3 38 11 5a 01 4c 53 28 74 a1 e9 ......8.Z.LS(t.. 00:20:33.973 00000040 bf 84 c2 19 d1 12 6c 5c 78 ba 0e ab ea 83 00 80 ......l\x....... 00:20:33.973 00000050 7a a8 bd 2b 88 40 66 6f f7 04 ec 16 c0 37 e3 c6 z..+.@fo.....7.. 00:20:33.973 00000060 97 c1 42 2a b0 67 2b 95 21 26 da 73 a2 94 a5 2a ..B*.g+.!&.s...* 00:20:33.973 00000070 3c 16 f7 eb 69 bc 7e d1 77 dd 12 2e cf 09 bc b3 <...i.~.w....... 00:20:33.973 00000080 1f 0a a9 37 ae 6b f8 d5 16 ee c4 71 98 de 5c 6b ...7.k.....q..\k 00:20:33.973 00000090 fd 50 2b 9d 0b eb 88 c9 5f d1 b2 20 76 fb 3c 2e .P+....._.. v.<. 00:20:33.973 000000a0 41 c0 bf 5b 3f ae 3a ef 93 77 47 77 31 96 be 56 A..[?.:..wGw1..V 00:20:33.973 000000b0 52 c8 76 59 50 32 36 04 f1 34 38 ee 9c 0b 4d 86 R.vYP26..48...M. 00:20:33.973 000000c0 04 de 38 6f f9 45 9c da 23 64 78 e6 9c 92 04 ac ..8o.E..#dx..... 00:20:33.973 000000d0 88 c9 11 76 c8 e9 a1 5d 4b 42 9e 6d 7d 02 b1 95 ...v...]KB.m}... 00:20:33.973 000000e0 fc 02 9c c5 fb cc a8 41 a8 6c 29 a9 9b 8e 9c f9 .......A.l)..... 00:20:33.973 000000f0 26 49 90 42 c8 2f 70 11 27 b0 c3 d0 a7 cf 26 42 &I.B./p.'.....&B 00:20:33.973 00000100 d5 d4 a6 ee d3 15 5e 79 40 b6 60 d9 22 6a 4b 53 ......^y@.`."jKS 00:20:33.973 00000110 c1 11 46 de d5 e7 69 a7 f6 30 f4 55 b0 5b 82 b5 ..F...i..0.U.[.. 00:20:33.973 00000120 36 03 f1 de 3d 20 c6 d3 b1 1d 17 fd 87 e0 ef 09 6...= .......... 
00:20:33.973 00000130 dc f6 16 ed c8 2b 5e de c9 c0 11 d3 a3 6b a2 0b .....+^......k.. 00:20:33.973 00000140 f9 c0 68 50 f6 2c 77 f0 be 56 0f 57 69 62 4d 2a ..hP.,w..V.WibM* 00:20:33.973 00000150 ef f2 4c e9 b1 04 11 8e 86 a6 38 22 d0 89 05 c9 ..L.......8".... 00:20:33.973 00000160 2f 79 df f8 9e 7b 8a a8 22 0d 14 f4 3a f9 1b 1b /y...{.."...:... 00:20:33.973 00000170 c7 e3 5f 0e c5 d0 2f 59 be c2 79 12 dd 67 fe 23 .._.../Y..y..g.# 00:20:33.973 00000180 47 ca f5 d1 c5 c3 d2 c7 f9 dc 9a bc 45 39 8e 49 G...........E9.I 00:20:33.973 00000190 b4 14 9b d5 39 57 87 ca d2 a9 79 25 ce f6 1b 44 ....9W....y%...D 00:20:33.973 000001a0 c4 a8 f9 9d a6 43 87 31 ca 48 17 7b ad df c5 1a .....C.1.H.{.... 00:20:33.973 000001b0 79 24 59 dd 89 13 9d 84 39 cc f4 9b 9f f6 d3 a4 y$Y.....9....... 00:20:33.973 000001c0 be 4a 25 2a 8d 9f b6 01 df 83 39 f5 3d 7e 8d d8 .J%*......9.=~.. 00:20:33.973 000001d0 a5 7b 34 94 cd cc d7 ff 47 98 e8 af 47 b0 5a 41 .{4.....G...G.ZA 00:20:33.973 000001e0 db 80 ab 5b f5 15 2b 0a 95 ab 4e db db a5 ee dc ...[..+...N..... 00:20:33.974 000001f0 79 b5 a9 c6 cc fa d4 93 2e 9e a3 97 cc 8e 58 10 y.............X. 00:20:33.974 00000200 58 26 19 16 16 49 e1 52 f9 61 d5 69 b6 a8 df c1 X&...I.R.a.i.... 00:20:33.974 00000210 8f 81 0b 07 5a 54 f2 9d 86 5e 32 10 91 af a8 4a ....ZT...^2....J 00:20:33.974 00000220 22 fd a4 16 b5 14 17 01 85 47 0e 47 c0 b9 ab 28 "........G.G...( 00:20:33.974 00000230 0e 1c a1 ba a9 8d d8 72 9e b1 70 b4 be ca e0 00 .......r..p..... 00:20:33.974 00000240 4e 68 2b 35 a3 69 d0 19 fa 4c fe 6d ba 20 77 86 Nh+5.i...L.m. w. 00:20:33.974 00000250 16 9a 0c 23 a7 98 2b 86 06 8a 27 99 ab 61 a6 92 ...#..+...'..a.. 00:20:33.974 00000260 e0 87 22 0c d0 fb 52 75 e0 67 12 da a2 59 fe 66 .."...Ru.g...Y.f 00:20:33.974 00000270 59 76 4c 8d e2 49 3a 23 05 f6 02 c7 36 f5 c9 73 YvL..I:#....6..s 00:20:33.974 00000280 71 82 34 a5 b3 6e 3b 28 06 5a 5e 68 df 8a d3 75 q.4..n;(.Z^h...u 00:20:33.974 00000290 fc db 7f b9 6b 8a 3a 43 cf c3 72 07 a9 6a f7 e8 ....k.:C..r..j.. 00:20:33.974 000002a0 90 9f 6f 6b a5 c4 a1 d0 80 61 29 c0 37 d8 a6 fa ..ok.....a).7... 00:20:33.974 000002b0 30 8e 98 18 19 7b 44 bc df 65 c8 7f 4c 80 5c 27 0....{D..e..L.\' 00:20:33.974 000002c0 05 f8 49 c6 0e 2e e9 d1 70 fc 0a c9 7a cd a8 a4 ..I.....p...z... 00:20:33.974 000002d0 3c d5 58 b5 ed df 34 62 fb 81 cc 99 8a a5 56 e4 <.X...4b......V. 00:20:33.974 000002e0 cf 21 2a 9f 27 dc 49 ec 80 c2 da ef a7 1d f1 55 .!*.'.I........U 00:20:33.974 000002f0 d1 68 1c f6 a8 ed 21 69 8e cc e3 bd aa db 32 13 .h....!i......2. 00:20:33.974 host pubkey: 00:20:33.974 00000000 87 75 77 4e 02 21 38 2a 6a ee 03 bf 41 26 db 8c .uwN.!8*j...A&.. 00:20:33.974 00000010 87 e4 76 15 6e 68 6e 90 ee b6 96 32 0a 0e 61 25 ..v.nhn....2..a% 00:20:33.974 00000020 2c 99 cf 53 75 d3 ff b0 9d 6c da 39 82 bb 70 7d ,..Su....l.9..p} 00:20:33.974 00000030 a6 e2 c0 ab 80 94 87 26 a2 04 18 fe ca 49 82 c8 .......&.....I.. 00:20:33.974 00000040 cf de 52 ce da 10 b0 b5 3c a9 db f6 f1 c9 36 8f ..R.....<.....6. 00:20:33.974 00000050 b2 78 6c dc 7c c6 23 32 bb 6f 9e 7e dc 57 fb 49 .xl.|.#2.o.~.W.I 00:20:33.974 00000060 24 49 7f 4b 78 1b 95 fc 4a 5c 1f 1e 37 53 2d 24 $I.Kx...J\..7S-$ 00:20:33.974 00000070 4c b6 f0 21 6b 92 cd 5b c1 5a ed bc c6 5a 9c 64 L..!k..[.Z...Z.d 00:20:33.974 00000080 fa 8c 04 97 d3 ac a8 c1 17 6d b4 aa 7d 60 61 61 .........m..}`aa 00:20:33.974 00000090 ef 06 2e e4 3a b1 b4 7c 33 98 da d1 c3 85 ff 54 ....:..|3......T 00:20:33.974 000000a0 b2 11 45 c6 34 9d 85 20 93 bc 3f c8 b1 59 2e 48 ..E.4.. 
..?..Y.H 00:20:33.974 000000b0 3c fb f7 15 64 2e 42 c0 8b 4a a5 5d 1f 6a e2 a4 <...d.B..J.].j.. 00:20:33.974 000000c0 f5 15 67 e2 f6 87 9c 90 80 e6 1f 43 b0 a3 d8 9f ..g........C.... 00:20:33.974 000000d0 8e 74 9a fc ef 0d 4e aa 76 35 48 55 9c 70 d9 7b .t....N.v5HU.p.{ 00:20:33.974 000000e0 57 cb 99 fc fc 53 40 62 8f ed 63 1a 0a 5f 25 28 W....S@b..c.._%( 00:20:33.974 000000f0 a9 61 57 a5 a1 d3 ff c3 30 a5 46 be f4 9b ae 67 .aW.....0.F....g 00:20:33.974 00000100 b8 8f f6 9d 45 aa 78 71 de 25 22 1c e9 77 79 00 ....E.xq.%"..wy. 00:20:33.974 00000110 cb c9 3f 58 19 c9 1f ab 16 4b 8f 19 9a 39 4d 48 ..?X.....K...9MH 00:20:33.974 00000120 87 12 5a 05 b7 df 3f fa 7c 6a ab e0 d8 fe 80 f2 ..Z...?.|j...... 00:20:33.974 00000130 d0 cd df 4a 6b 92 22 00 a2 e5 80 d0 46 20 1b 59 ...Jk.".....F .Y 00:20:33.974 00000140 40 ff 81 60 59 9d c2 df 72 65 03 9f 7f 43 47 42 @..`Y...re...CGB 00:20:33.974 00000150 e6 11 b5 88 ad 7c 6f 15 d5 03 6d 9f 1e 5d 36 c3 .....|o...m..]6. 00:20:33.974 00000160 0b 7f 93 53 98 ba 1f bd 4e a0 a5 6e f3 fc 13 d8 ...S....N..n.... 00:20:33.974 00000170 1e 1d db ae a2 42 e9 80 7a ca 59 8a 6c fb 5a 9b .....B..z.Y.l.Z. 00:20:33.974 00000180 da df c5 78 92 30 ff c3 b5 f4 1b 87 66 ab 86 36 ...x.0......f..6 00:20:33.974 00000190 5e 39 1c c5 10 78 ee cc 13 ac 1e d2 af 2b c0 50 ^9...x.......+.P 00:20:33.974 000001a0 57 4f 80 6e b1 61 b1 a1 4e 61 05 ef d7 69 dc 84 WO.n.a..Na...i.. 00:20:33.974 000001b0 08 b0 41 b6 ec f6 2f dd 6c dd b4 e1 a8 40 d8 3b ..A.../.l....@.; 00:20:33.974 000001c0 42 1e 79 9f 55 76 fc 26 53 c9 d7 b8 39 44 8e 48 B.y.Uv.&S...9D.H 00:20:33.974 000001d0 ed 12 80 e2 5c fc e1 8d ec 25 23 46 9c cb 74 ca ....\....%#F..t. 00:20:33.974 000001e0 05 ea 10 04 b3 d3 92 51 42 b3 fd 88 ca 13 f4 95 .......QB....... 00:20:33.974 000001f0 d2 a3 46 a3 67 f5 d8 e9 35 3d b3 39 8e 9b 00 42 ..F.g...5=.9...B 00:20:33.974 00000200 b8 c7 1c b8 61 d7 32 16 74 eb 05 cf 72 10 0a 2c ....a.2.t...r.., 00:20:33.974 00000210 a2 7c 78 26 72 bb ee 75 7b 79 87 8f d5 d0 19 41 .|x&r..u{y.....A 00:20:33.974 00000220 8d 2e 97 ed e6 51 38 a6 04 4e 67 b6 c2 d1 d6 50 .....Q8..Ng....P 00:20:33.974 00000230 98 6a ad 3a 8e 47 a4 6a bf 23 fa 51 60 c8 d3 c8 .j.:.G.j.#.Q`... 00:20:33.974 00000240 8e 9a 90 d0 2f dc ef 40 eb 81 1f 3f 0e d9 62 92 ..../..@...?..b. 00:20:33.974 00000250 a7 93 62 72 14 19 16 64 4b c3 fb 5a b1 f9 58 64 ..br...dK..Z..Xd 00:20:33.974 00000260 d6 9d f6 3f 30 eb e1 89 fa 74 08 6a c7 6f 56 97 ...?0....t.j.oV. 00:20:33.974 00000270 54 c1 3f a0 4b 39 fd 97 39 08 77 59 76 11 34 d2 T.?.K9..9.wYv.4. 00:20:33.974 00000280 d4 7c 91 62 10 c5 2a 91 47 db a8 1e f7 0b 96 46 .|.b..*.G......F 00:20:33.974 00000290 9e 41 4f 96 bd 3a 83 61 a0 28 a2 7a 2b c5 b3 19 .AO..:.a.(.z+... 00:20:33.974 000002a0 89 18 ad f1 e0 dd bb e6 50 9b 4e ab d8 ee a9 65 ........P.N....e 00:20:33.974 000002b0 21 4c 40 d0 e2 91 c8 42 78 ec cc f2 d4 1f e2 0c !L@....Bx....... 00:20:33.974 000002c0 24 5d dd a8 a1 94 b5 3d 21 80 2d 27 9e a9 1c 51 $].....=!.-'...Q 00:20:33.974 000002d0 50 df ee 3d 7e c4 c6 8f 9a 2c 35 91 7e 05 53 c4 P..=~....,5.~.S. 00:20:33.974 000002e0 4e ad 3c 75 4a 68 ad ea e6 58 6b 74 65 e6 d3 e7 N..}..#. 00:20:33.974 00000140 ce b4 0d 18 9e a4 67 69 c0 16 18 fd 6b 75 7c 63 ......gi....ku|c 00:20:33.974 00000150 ab 12 28 85 87 e3 60 9c ad 90 58 4f 23 2e a5 87 ..(...`...XO#... 
00:20:33.974 00000160 90 f2 3a e1 98 23 42 07 7f 0e f4 69 3c d9 bc 5c ..:..#B....i<..\ 00:20:33.974 00000170 ad 60 36 20 0b 2f ce 5c 3c e4 82 c3 5f 49 07 62 .`6 ./.\<..._I.b 00:20:33.974 00000180 a2 53 53 07 43 45 96 10 55 29 9c 8f 63 6b e7 96 .SS.CE..U)..ck.. 00:20:33.974 00000190 4d 21 20 f5 cf 6a ed e0 2c 16 54 96 52 22 f8 9f M! ..j..,.T.R".. 00:20:33.974 000001a0 c7 72 67 1a 9d 47 6f 87 1b f8 aa 84 d1 70 72 f7 .rg..Go......pr. 00:20:33.974 000001b0 49 19 9f 4f 66 3a 3d ef 08 81 12 0b 07 bb 01 28 I..Of:=........( 00:20:33.974 000001c0 5b 96 1a e0 2d 75 2b 39 41 86 d4 f8 0c 17 d8 f7 [...-u+9A....... 00:20:33.974 000001d0 4f 96 bc 7b 8f 21 19 fa 37 a4 c4 e4 49 f9 3f 5f O..{.!..7...I.?_ 00:20:33.974 000001e0 c7 28 6c af 76 be ea 90 3c 72 19 eb fe a3 38 ce .(l.v....CX... 00:20:33.974 00000250 45 03 5b e6 c8 63 87 fa 9b 03 20 6b ec 65 23 b8 E.[..c.... k.e#. 00:20:33.974 00000260 bd fa 31 c9 2e 1f 1d c0 ea ae a9 32 f1 3c 0f 65 ..1........2.<.e 00:20:33.974 00000270 57 a7 d5 b3 1b 85 ce 45 39 44 ca 64 b5 a1 83 08 W......E9D.d.... 00:20:33.974 00000280 44 c7 dc 96 60 13 1e 4b 34 ba af f5 c3 e7 a1 84 D...`..K4....... 00:20:33.974 00000290 80 21 0b f2 f6 45 3a ad 0b 90 92 81 f7 d2 84 fd .!...E:......... 00:20:33.974 000002a0 9f ba 36 fe 1c 24 be 97 6e b3 9f 3e 29 a2 bf 55 ..6..$..n..>)..U 00:20:33.974 000002b0 55 14 e7 ba f1 47 f0 01 72 64 d8 ce f2 61 1b bd U....G..rd...a.. 00:20:33.974 000002c0 f2 d9 d3 fe 68 22 d1 b0 4a 68 fc d6 33 32 93 32 ....h"..Jh..32.2 00:20:33.974 000002d0 a0 eb 18 61 5e d0 89 3e fe 28 99 72 0b 3d 71 5a ...a^..>.(.r.=qZ 00:20:33.974 000002e0 00 02 9a b5 c8 19 c0 1a 1e 96 2f 07 e0 42 57 eb ........../..BW. 00:20:33.974 000002f0 69 0e 36 c0 5f a4 62 83 a5 aa 64 80 4e 26 7d d6 i.6._.b...d.N&}. 00:20:33.974 [2024-09-27 15:25:29.569525] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=3, dhgroup=4, seq=3428451825, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.974 [2024-09-27 15:25:29.605413] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.974 [2024-09-27 15:25:29.605456] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.974 [2024-09-27 15:25:29.605472] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.974 [2024-09-27 15:25:29.605492] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.974 [2024-09-27 15:25:29.605506] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.974 [2024-09-27 15:25:29.711699] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.974 [2024-09-27 15:25:29.711717] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.974 [2024-09-27 15:25:29.711724] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.975 [2024-09-27 15:25:29.711734] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] 
auth state: await-negotiate 00:20:33.975 [2024-09-27 15:25:29.711791] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.975 ctrlr pubkey: 00:20:33.975 00000000 a7 1e ec 24 1c 7e 33 14 39 a9 85 e2 d8 5c 6d 9d ...$.~3.9....\m. 00:20:33.975 00000010 b2 9f 57 8e 5a ce 56 3c 7b 00 a8 6a f8 01 5b 18 ..W.Z.V<{..j..[. 00:20:33.975 00000020 0d 50 e5 23 82 f5 12 4a 7c 84 4d 0c 95 f8 6f 08 .P.#...J|.M...o. 00:20:33.975 00000030 cd f5 a5 b5 f7 c3 38 11 5a 01 4c 53 28 74 a1 e9 ......8.Z.LS(t.. 00:20:33.975 00000040 bf 84 c2 19 d1 12 6c 5c 78 ba 0e ab ea 83 00 80 ......l\x....... 00:20:33.975 00000050 7a a8 bd 2b 88 40 66 6f f7 04 ec 16 c0 37 e3 c6 z..+.@fo.....7.. 00:20:33.975 00000060 97 c1 42 2a b0 67 2b 95 21 26 da 73 a2 94 a5 2a ..B*.g+.!&.s...* 00:20:33.975 00000070 3c 16 f7 eb 69 bc 7e d1 77 dd 12 2e cf 09 bc b3 <...i.~.w....... 00:20:33.975 00000080 1f 0a a9 37 ae 6b f8 d5 16 ee c4 71 98 de 5c 6b ...7.k.....q..\k 00:20:33.975 00000090 fd 50 2b 9d 0b eb 88 c9 5f d1 b2 20 76 fb 3c 2e .P+....._.. v.<. 00:20:33.975 000000a0 41 c0 bf 5b 3f ae 3a ef 93 77 47 77 31 96 be 56 A..[?.:..wGw1..V 00:20:33.975 000000b0 52 c8 76 59 50 32 36 04 f1 34 38 ee 9c 0b 4d 86 R.vYP26..48...M. 00:20:33.975 000000c0 04 de 38 6f f9 45 9c da 23 64 78 e6 9c 92 04 ac ..8o.E..#dx..... 00:20:33.975 000000d0 88 c9 11 76 c8 e9 a1 5d 4b 42 9e 6d 7d 02 b1 95 ...v...]KB.m}... 00:20:33.975 000000e0 fc 02 9c c5 fb cc a8 41 a8 6c 29 a9 9b 8e 9c f9 .......A.l)..... 00:20:33.975 000000f0 26 49 90 42 c8 2f 70 11 27 b0 c3 d0 a7 cf 26 42 &I.B./p.'.....&B 00:20:33.975 00000100 d5 d4 a6 ee d3 15 5e 79 40 b6 60 d9 22 6a 4b 53 ......^y@.`."jKS 00:20:33.975 00000110 c1 11 46 de d5 e7 69 a7 f6 30 f4 55 b0 5b 82 b5 ..F...i..0.U.[.. 00:20:33.975 00000120 36 03 f1 de 3d 20 c6 d3 b1 1d 17 fd 87 e0 ef 09 6...= .......... 00:20:33.975 00000130 dc f6 16 ed c8 2b 5e de c9 c0 11 d3 a3 6b a2 0b .....+^......k.. 00:20:33.975 00000140 f9 c0 68 50 f6 2c 77 f0 be 56 0f 57 69 62 4d 2a ..hP.,w..V.WibM* 00:20:33.975 00000150 ef f2 4c e9 b1 04 11 8e 86 a6 38 22 d0 89 05 c9 ..L.......8".... 00:20:33.975 00000160 2f 79 df f8 9e 7b 8a a8 22 0d 14 f4 3a f9 1b 1b /y...{.."...:... 00:20:33.975 00000170 c7 e3 5f 0e c5 d0 2f 59 be c2 79 12 dd 67 fe 23 .._.../Y..y..g.# 00:20:33.975 00000180 47 ca f5 d1 c5 c3 d2 c7 f9 dc 9a bc 45 39 8e 49 G...........E9.I 00:20:33.975 00000190 b4 14 9b d5 39 57 87 ca d2 a9 79 25 ce f6 1b 44 ....9W....y%...D 00:20:33.975 000001a0 c4 a8 f9 9d a6 43 87 31 ca 48 17 7b ad df c5 1a .....C.1.H.{.... 00:20:33.975 000001b0 79 24 59 dd 89 13 9d 84 39 cc f4 9b 9f f6 d3 a4 y$Y.....9....... 00:20:33.975 000001c0 be 4a 25 2a 8d 9f b6 01 df 83 39 f5 3d 7e 8d d8 .J%*......9.=~.. 00:20:33.975 000001d0 a5 7b 34 94 cd cc d7 ff 47 98 e8 af 47 b0 5a 41 .{4.....G...G.ZA 00:20:33.975 000001e0 db 80 ab 5b f5 15 2b 0a 95 ab 4e db db a5 ee dc ...[..+...N..... 00:20:33.975 000001f0 79 b5 a9 c6 cc fa d4 93 2e 9e a3 97 cc 8e 58 10 y.............X. 00:20:33.975 00000200 58 26 19 16 16 49 e1 52 f9 61 d5 69 b6 a8 df c1 X&...I.R.a.i.... 00:20:33.975 00000210 8f 81 0b 07 5a 54 f2 9d 86 5e 32 10 91 af a8 4a ....ZT...^2....J 00:20:33.975 00000220 22 fd a4 16 b5 14 17 01 85 47 0e 47 c0 b9 ab 28 "........G.G...( 00:20:33.975 00000230 0e 1c a1 ba a9 8d d8 72 9e b1 70 b4 be ca e0 00 .......r..p..... 00:20:33.975 00000240 4e 68 2b 35 a3 69 d0 19 fa 4c fe 6d ba 20 77 86 Nh+5.i...L.m. w. 00:20:33.975 00000250 16 9a 0c 23 a7 98 2b 86 06 8a 27 99 ab 61 a6 92 ...#..+...'..a.. 
00:20:33.975 00000260 e0 87 22 0c d0 fb 52 75 e0 67 12 da a2 59 fe 66 .."...Ru.g...Y.f 00:20:33.975 00000270 59 76 4c 8d e2 49 3a 23 05 f6 02 c7 36 f5 c9 73 YvL..I:#....6..s 00:20:33.975 00000280 71 82 34 a5 b3 6e 3b 28 06 5a 5e 68 df 8a d3 75 q.4..n;(.Z^h...u 00:20:33.975 00000290 fc db 7f b9 6b 8a 3a 43 cf c3 72 07 a9 6a f7 e8 ....k.:C..r..j.. 00:20:33.975 000002a0 90 9f 6f 6b a5 c4 a1 d0 80 61 29 c0 37 d8 a6 fa ..ok.....a).7... 00:20:33.975 000002b0 30 8e 98 18 19 7b 44 bc df 65 c8 7f 4c 80 5c 27 0....{D..e..L.\' 00:20:33.975 000002c0 05 f8 49 c6 0e 2e e9 d1 70 fc 0a c9 7a cd a8 a4 ..I.....p...z... 00:20:33.975 000002d0 3c d5 58 b5 ed df 34 62 fb 81 cc 99 8a a5 56 e4 <.X...4b......V. 00:20:33.975 000002e0 cf 21 2a 9f 27 dc 49 ec 80 c2 da ef a7 1d f1 55 .!*.'.I........U 00:20:33.975 000002f0 d1 68 1c f6 a8 ed 21 69 8e cc e3 bd aa db 32 13 .h....!i......2. 00:20:33.975 host pubkey: 00:20:33.975 00000000 13 9e bc 48 55 f8 51 82 ee 1a 4f aa fc 41 ee c3 ...HU.Q...O..A.. 00:20:33.975 00000010 a8 c0 4c 2e 04 d8 fa e8 1a 8d 9d 4b f4 92 d8 20 ..L........K... 00:20:33.975 00000020 48 35 10 78 91 0a 3a 95 e3 9f 85 a0 ea c9 4c 78 H5.x..:.......Lx 00:20:33.975 00000030 67 3f aa 1f ff e0 0b 5e 13 29 48 89 8d 92 b4 2a g?.....^.)H....* 00:20:33.975 00000040 18 5c 1e 94 b8 c9 4a 7a e5 19 5c 07 4c 87 40 50 .\....Jz..\.L.@P 00:20:33.975 00000050 ac 43 f2 94 3c 7c c1 df d7 15 1f a9 a6 cd 26 bb .C..<|........&. 00:20:33.975 00000060 4d a9 8c 92 ba 25 32 d1 1c 17 21 b4 88 4b 90 2d M....%2...!..K.- 00:20:33.975 00000070 9f fc ff 32 28 37 0f 3f 5a 8b 1b 05 6f f1 61 51 ...2(7.?Z...o.aQ 00:20:33.975 00000080 ad 5a 51 a0 01 7f da 59 45 d5 f6 19 fa 47 30 91 .ZQ....YE....G0. 00:20:33.975 00000090 67 60 a7 32 90 18 1f 19 83 96 db 00 33 68 b4 49 g`.2........3h.I 00:20:33.975 000000a0 3d c9 ff 5e 0e a2 f6 d0 d3 c2 9f eb 38 dc f2 0e =..^........8... 00:20:33.975 000000b0 7b 39 75 b7 e9 65 2c 99 39 22 ee 75 24 7f f8 43 {9u..e,.9".u$..C 00:20:33.975 000000c0 8d 90 13 91 0c eb 7a 57 5a 34 e2 e7 3c e3 ba aa ......zWZ4..<... 00:20:33.975 000000d0 5d a7 4a f0 c7 12 cb d2 a9 c2 b2 9a 92 c6 d8 5d ].J............] 00:20:33.975 000000e0 85 a2 48 ec a8 3c 59 4e 26 72 84 21 eb e2 57 ba ..H....D.$........ 00:20:33.975 00000180 bc 64 97 f2 b5 bd a6 13 9c d4 9a 0b 31 dd 50 05 .d..........1.P. 00:20:33.975 00000190 e0 d1 de 8f a9 f7 25 73 86 fa 1b 15 56 d3 bd 98 ......%s....V... 00:20:33.975 000001a0 fd 68 e6 55 5f cd 56 b8 01 90 11 05 75 6c f2 48 .h.U_.V.....ul.H 00:20:33.975 000001b0 a2 ca ad b6 0f 25 06 c6 d6 e7 f6 8f a3 fb 17 fc .....%.......... 00:20:33.975 000001c0 72 a9 05 ac b8 e4 3e 40 fc 5c c6 78 f1 69 d3 86 r.....>@.\.x.i.. 00:20:33.975 000001d0 78 46 a8 2a 77 9c d6 34 82 b9 f5 b5 18 41 4a 59 xF.*w..4.....AJY 00:20:33.975 000001e0 a9 7f 1c 17 31 62 fa 8b 61 5b b4 38 6c ac cb e6 ....1b..a[.8l... 00:20:33.975 000001f0 ed da ba ae b9 6a 27 5c fc c7 08 b8 23 4f 0e 4f .....j'\....#O.O 00:20:33.975 00000200 35 f5 3f e3 eb f5 ac 20 ad 8d 0a 67 c9 2a 1e 55 5.?.... ...g.*.U 00:20:33.975 00000210 92 65 7f 59 ea 36 21 0d c5 84 a4 d5 73 1a 77 23 .e.Y.6!.....s.w# 00:20:33.975 00000220 ff 05 19 5d 33 11 62 7f 3b 48 86 fb 08 72 62 03 ...]3.b.;H...rb. 00:20:33.975 00000230 03 ba 62 f3 9c 26 f6 dc 5e 4e 5a b6 bb f0 08 89 ..b..&..^NZ..... 00:20:33.975 00000240 38 62 63 d8 84 cf 5e 7a 69 94 8a 18 36 97 ca 70 8bc...^zi...6..p 00:20:33.975 00000250 db bc 88 1c 94 94 59 c0 95 b4 f2 95 44 50 56 df ......Y.....DPV. 00:20:33.975 00000260 b8 26 d0 5a 12 4e 80 ff e5 b4 5f bf 42 9c ef 82 .&.Z.N...._.B... 
00:20:33.975 00000270 43 22 0c a7 a8 81 0c b0 fc 28 f2 62 d7 91 49 c0 C".......(.b..I. 00:20:33.975 00000280 9f 11 7d fa f8 63 33 9b 40 d0 dd 25 94 c1 8b ed ..}..c3.@..%.... 00:20:33.975 00000290 61 b6 24 62 f8 bc f4 bf 06 6a 3a c0 00 96 e4 a8 a.$b.....j:..... 00:20:33.975 000002a0 8b 93 1e ea ab a5 d5 df d5 30 46 86 d1 74 3c 5e .........0F..t<^ 00:20:33.975 000002b0 42 e1 65 a0 bd 83 e7 99 6c f3 f5 fa ed 24 7c 5d B.e.....l....$|] 00:20:33.975 000002c0 0d 14 dd 78 cc 8d d1 32 dc 1c 90 3d 6b e2 4b 40 ...x...2...=k.K@ 00:20:33.975 000002d0 1a b4 cd 6e ce 82 a0 d2 36 e8 d6 12 2d 80 3e 7c ...n....6...-.>| 00:20:33.975 000002e0 c6 89 79 84 7a 0b b4 1a c8 60 a4 17 cb a7 75 6f ..y.z....`....uo 00:20:33.975 000002f0 b4 5d 30 62 2d 3b 60 9d ca 33 a0 e7 7d 5a 8a f1 .]0b-;`..3..}Z.. 00:20:33.975 dh secret: 00:20:33.975 00000000 41 f8 62 df a6 b1 d5 56 59 f1 ef f6 a4 57 b9 94 A.b....VY....W.. 00:20:33.975 00000010 1e 89 41 89 da fa a2 fc d6 75 20 46 82 22 ad 04 ..A......u F.".. 00:20:33.975 00000020 85 74 7e 8b 15 3a fd af 57 0a 40 9f 29 f5 60 9e .t~..:..W.@.).`. 00:20:33.975 00000030 99 71 35 97 34 a7 17 85 c8 53 a1 c9 47 60 03 f0 .q5.4....S..G`.. 00:20:33.975 00000040 20 7c c4 03 f1 d7 fe 20 9f 75 2f 48 4b 04 00 7b |..... .u/HK..{ 00:20:33.975 00000050 62 96 73 b7 a1 e0 1f 26 43 c4 f2 f0 de f6 52 7f b.s....&C.....R. 00:20:33.975 00000060 b8 0f 71 82 06 a6 e2 21 d9 10 c8 b4 4a d7 59 d0 ..q....!....J.Y. 00:20:33.975 00000070 2a cd 7c f1 d5 55 3f cd c4 a4 f9 68 63 04 76 dd *.|..U?....hc.v. 00:20:33.975 00000080 8e f1 77 64 1f 1d 4f 17 28 35 a1 21 69 9f ce 54 ..wd..O.(5.!i..T 00:20:33.975 00000090 45 62 8c a9 ea dd ef c0 13 ed 3d 0e 4f e8 52 f0 Eb........=.O.R. 00:20:33.975 000000a0 f2 8b 48 b7 c1 87 99 ae 9f 71 99 25 00 50 32 ac ..H......q.%.P2. 00:20:33.975 000000b0 c9 2e 8b 88 1f a4 bd fa af d6 1d 1f b1 4e c8 b6 .............N.. 00:20:33.975 000000c0 6e 23 99 4f 78 6d 82 14 e0 36 92 28 5d 68 25 3d n#.Oxm...6.(]h%= 00:20:33.975 000000d0 77 c9 90 27 5f 9c b0 27 7a 83 bc 9d ec ec 5f fc w..'_..'z....._. 00:20:33.975 000000e0 51 b9 01 1d 61 79 43 9c cb 60 33 99 01 84 13 1c Q...ayC..`3..... 00:20:33.975 000000f0 d5 d8 d6 00 ba 6f b5 99 55 6c ef a3 f4 b7 a1 9b .....o..Ul...... 00:20:33.975 00000100 c8 a2 6e 76 73 93 6e 45 22 fc 6b aa 71 ff 7f 30 ..nvs.nE".k.q..0 00:20:33.975 00000110 9c c3 ef 09 c1 4d a6 71 82 a3 dc d0 f3 6a b1 7d .....M.q.....j.} 00:20:33.975 00000120 fa b1 20 84 d2 f2 b9 b8 16 e8 67 47 8c 37 80 7a .. .......gG.7.z 00:20:33.975 00000130 3d 63 fa 0c e4 6b 6d 60 93 da 65 68 38 cc 2e 79 =c...km`..eh8..y 00:20:33.975 00000140 8b ac f9 c3 10 9a 01 b3 78 be 08 9f 6a 47 a1 ae ........x...jG.. 00:20:33.975 00000150 81 41 a7 1e 91 ed 9c 51 ca 65 32 2f c2 3b ad e6 .A.....Q.e2/.;.. 00:20:33.975 00000160 ce 95 c0 10 ea 51 8a d7 e3 07 a8 7c 8b 5d 5c c5 .....Q.....|.]\. 00:20:33.975 00000170 1f 9c 00 80 ba cd bf 23 0b 90 65 76 d5 47 e7 c8 .......#..ev.G.. 00:20:33.975 00000180 5c 7b 98 e3 16 1f 2a 31 a3 11 e7 bf 8d 31 ce fb \{....*1.....1.. 00:20:33.975 00000190 40 d5 3d 31 7a 83 4d 83 61 5a 54 ca 56 6e d4 0c @.=1z.M.aZT.Vn.. 00:20:33.975 000001a0 69 39 f1 f4 eb 37 0a 23 70 43 f3 7c 05 c0 66 da i9...7.#pC.|..f. 00:20:33.975 000001b0 ad 43 de 85 f0 0a 45 0c 23 fd ea 3c 8d 32 10 03 .C....E.#..<.2.. 00:20:33.975 000001c0 10 df 69 c8 e0 82 3e 8d 89 d4 2f 62 07 9e 0d 2d ..i...>.../b...- 00:20:33.975 000001d0 08 b9 37 62 f8 7a c3 fb 93 08 c0 a9 de 0d 46 16 ..7b.z........F. 
00:20:33.976 000001e0 a0 48 9d 00 47 7b 5c 7a ba 86 dd 61 92 8c 2b 61 .H..G{\z...a..+a 00:20:33.976 000001f0 52 9a dd f5 29 45 ac 04 00 53 95 2e f9 92 07 e1 R...)E...S...... 00:20:33.976 00000200 dd 85 5b 02 6f ca ff 4a e3 ea 60 c3 be 3e 0b 78 ..[.o..J..`..>.x 00:20:33.976 00000210 9a ea a5 02 5a 5b 5a 91 cd 5f c5 41 90 62 4f e8 ....Z[Z.._.A.bO. 00:20:33.976 00000220 a7 aa f3 70 e5 50 1a 51 c7 4d 35 25 6d d8 a9 f1 ...p.P.Q.M5%m... 00:20:33.976 00000230 1a f3 87 c2 e2 a3 fe 08 ac 85 f2 9b 86 58 bc d7 .............X.. 00:20:33.976 00000240 16 13 eb f7 48 12 93 0e 4e f8 77 c2 37 41 cf 37 ....H...N.w.7A.7 00:20:33.976 00000250 51 41 e5 73 aa 4e ad 98 34 5d 63 b6 2c 01 18 08 QA.s.N..4]c.,... 00:20:33.976 00000260 6c ee 45 ab fc 66 30 6e 62 32 4a d3 9e 60 62 79 l.E..f0nb2J..`by 00:20:33.976 00000270 12 e5 23 ee 79 b1 e4 45 a3 56 5f b3 bf 37 66 38 ..#.y..E.V_..7f8 00:20:33.976 00000280 02 2b 20 eb 10 05 eb 4f b0 46 cf d2 54 70 1f 3f .+ ....O.F..Tp.? 00:20:33.976 00000290 ab 2a 31 36 92 ca 42 5b 72 d4 b7 23 c9 fd cf 99 .*16..B[r..#.... 00:20:33.976 000002a0 f0 2d 71 6c 2f 50 6a 35 93 06 5a 35 89 c4 5b da .-ql/Pj5..Z5..[. 00:20:33.976 000002b0 3b 09 d7 b5 5b 3a f9 bb a1 1c 7b 5b f2 fd ed 22 ;...[:....{[..." 00:20:33.976 000002c0 e1 8a a5 1d 0c ae 46 a0 80 dd 9d 33 a8 a7 6d 62 ......F....3..mb 00:20:33.976 000002d0 23 7f 61 08 f2 77 97 31 c6 5d 07 3d 3e 00 27 50 #.a..w.1.].=>.'P 00:20:33.976 000002e0 d6 ab c2 56 aa 02 a0 90 51 e7 ae 9e cc 0c a7 14 ...V....Q....... 00:20:33.976 000002f0 66 06 02 0b 94 0c 2c 5c 01 38 ec 8a 47 9d 77 95 f.....,\.8..G.w. 00:20:33.976 [2024-09-27 15:25:29.760349] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key0, hash=3, dhgroup=4, seq=3428451826, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.976 [2024-09-27 15:25:29.760459] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.976 [2024-09-27 15:25:29.818635] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.976 [2024-09-27 15:25:29.818678] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.976 [2024-09-27 15:25:29.818689] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.976 [2024-09-27 15:25:29.818715] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.976 [2024-09-27 15:25:30.004113] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.976 [2024-09-27 15:25:30.004134] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.976 [2024-09-27 15:25:30.004141] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.976 [2024-09-27 15:25:30.004190] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.976 [2024-09-27 15:25:30.004214] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] 
auth state: await-challenge 00:20:33.976 ctrlr pubkey: 00:20:33.976 00000000 d4 f4 ea 2e 63 ed b9 0e fa b4 a6 54 35 08 62 9d ....c......T5.b. 00:20:33.976 00000010 4e 29 19 d0 71 de 47 da 7f 2a 62 59 05 5a 45 36 N)..q.G..*bY.ZE6 00:20:33.976 00000020 8a 02 ba cb 1a 52 f3 46 f0 24 9d 28 8c bc 99 0f .....R.F.$.(.... 00:20:33.976 00000030 ea 07 d5 15 77 2d 0d 0d 89 2b 1f 19 aa 17 1c 0a ....w-...+...... 00:20:33.976 00000040 cf e5 42 b2 c0 7e fe fb b7 03 fc ea 77 62 b4 f6 ..B..~......wb.. 00:20:33.976 00000050 73 6c 96 7f 05 41 0f 20 22 57 14 73 51 7a 84 c5 sl...A. "W.sQz.. 00:20:33.976 00000060 07 5c 5e e7 b0 93 aa 9f fa b1 98 d2 ff 4b 85 d6 .\^..........K.. 00:20:33.976 00000070 09 86 52 17 61 cb e7 ec 5f 16 43 4a 09 53 b3 90 ..R.a..._.CJ.S.. 00:20:33.976 00000080 cf ae ec 1d 0a 90 2f 9a 98 7a 20 f7 3c 45 e2 91 ....../..z .>6r^4.h..... 00:20:33.976 000000f0 8b 31 93 9b 97 2a 7c 1b 0f 62 37 c7 de 08 66 48 .1...*|..b7...fH 00:20:33.976 00000100 70 78 e5 a8 23 d8 c1 02 68 ad 36 92 96 ff dd 48 px..#...h.6....H 00:20:33.976 00000110 de 26 d2 f5 3e 00 5d 1f 76 df 40 e8 ba dc d9 51 .&..>.].v.@....Q 00:20:33.976 00000120 bf d7 dd d9 5d 5d 49 cd ab 47 ed 93 f1 12 49 7f ....]]I..G....I. 00:20:33.976 00000130 0d 46 dc 1a 47 7b e6 2f ea 55 b1 eb 56 fe 93 44 .F..G{./.U..V..D 00:20:33.976 00000140 35 75 77 a5 ec a6 76 67 99 97 7c 65 c8 0a f0 a7 5uw...vg..|e.... 00:20:33.976 00000150 20 0f 1c 35 39 c0 6e a3 ed 56 b6 72 6e 06 33 ba ..59.n..V.rn.3. 00:20:33.976 00000160 d1 c8 f9 88 f8 0b 9e 2e 1b 53 e9 fa f3 4a 02 b0 .........S...J.. 00:20:33.976 00000170 31 91 83 ec 3f 26 21 c5 60 cd 34 72 24 b0 23 7e 1...?&!.`.4r$.#~ 00:20:33.976 00000180 53 a9 f1 6e 90 c5 99 bd a4 45 2c 67 47 61 5b f5 S..n.....E,gGa[. 00:20:33.976 00000190 d0 b9 25 f1 bc 61 7a 66 f1 c3 b2 22 48 70 ff 12 ..%..azf..."Hp.. 00:20:33.976 000001a0 78 a1 00 15 4d c8 6f e0 12 8c 08 c3 63 8c b8 24 x...M.o.....c..$ 00:20:33.976 000001b0 7f 1d 1f a1 15 9a ea 37 81 df 2c 15 a1 eb 78 5f .......7..,...x_ 00:20:33.976 000001c0 67 83 df 50 e5 38 9a c9 8c b6 fc f5 3a a5 39 68 g..P.8......:.9h 00:20:33.976 000001d0 ed 87 1a 97 b0 32 0b 55 a5 5d 9d 58 33 47 a3 40 .....2.U.].X3G.@ 00:20:33.976 000001e0 5d c5 12 92 95 72 0b 78 27 de df 96 6d ac 7f 82 ]....r.x'...m... 00:20:33.976 000001f0 ae de 66 f1 31 57 ad 8e 32 34 fc c4 e7 e1 ee eb ..f.1W..24...... 00:20:33.976 00000200 37 8d 3a 5f f4 c7 65 1f 51 03 ff 0f 1b ea 1c 68 7.:_..e.Q......h 00:20:33.976 00000210 b2 b6 ac 74 f3 1d 58 6b 7a dc 80 8d f2 26 7d 26 ...t..Xkz....&}& 00:20:33.976 00000220 59 d1 31 79 2a f5 b0 6e e3 65 7a b7 00 16 7b 8f Y.1y*..n.ez...{. 00:20:33.976 00000230 1a 60 85 e9 ab bc 5a 71 1f d7 55 94 98 6d d8 fd .`....Zq..U..m.. 00:20:33.976 00000240 50 20 fc d0 d6 af e2 19 0c ef ee a6 1b 0a 5d cc P ............]. 00:20:33.976 00000250 cb 9e 5a d2 f3 c0 83 30 81 8f 13 83 e4 a3 46 53 ..Z....0......FS 00:20:33.976 00000260 35 25 11 be 37 b2 c6 ad 72 95 7f 9e d2 d6 61 d6 5%..7...r.....a. 00:20:33.976 00000270 46 1d f8 c8 e5 e1 92 05 e3 4d a3 bc 53 c3 fa 6b F........M..S..k 00:20:33.976 00000280 4a 1f 3d c0 8d 7f 24 d5 ff 58 c8 d6 23 ba 64 8f J.=...$..X..#.d. 00:20:33.976 00000290 af 4b c0 1c 64 bd f8 48 f0 d2 e4 97 2c 45 94 5c .K..d..H....,E.\ 00:20:33.976 000002a0 62 32 bc b4 19 28 10 f5 86 be b8 66 89 30 37 4b b2...(.....f.07K 00:20:33.976 000002b0 ad 75 30 0a 9d 11 e3 33 7f a7 1a 26 1b 31 71 b2 .u0....3...&.1q. 00:20:33.976 000002c0 1a 0e b7 55 91 00 87 f6 4c 76 ed 5f b6 40 dd 12 ...U....Lv._.@.. 00:20:33.976 000002d0 31 7b d2 7b 8e 6c e1 38 9b 20 f2 82 7d ee 8b 9d 1{.{.l.8. ..}... 
00:20:33.976 000002e0 24 5a d8 ae e2 a9 bb 53 f1 6b 0a 3a a9 06 e4 03 $Z.....S.k.:.... 00:20:33.976 000002f0 ec cb dc 8c a1 45 5b ca 32 10 c8 1a fb c0 aa 1a .....E[.2....... 00:20:33.976 host pubkey: 00:20:33.976 00000000 2b 82 4b 47 db 31 0b d6 f3 ca 33 e1 e3 4f 0a fd +.KG.1....3..O.. 00:20:33.976 00000010 0d dd 5c 8f 31 ec 3e 70 3e 04 7d 61 8a 6b 13 80 ..\.1.>p>.}a.k.. 00:20:33.976 00000020 18 ac 40 da 31 51 94 2b bd b5 f5 83 5c fd eb bc ..@.1Q.+....\... 00:20:33.976 00000030 b7 cf 2c 17 a6 db 9b 2a fc ea 08 1a 0f 19 b2 d0 ..,....*........ 00:20:33.976 00000040 b4 1b ef f9 45 92 6c 80 32 90 51 c4 30 f9 54 5e ....E.l.2.Q.0.T^ 00:20:33.976 00000050 0d 97 00 da 4a 0f ab 95 04 ff 72 7e 11 e6 3c f5 ....J.....r~..<. 00:20:33.976 00000060 1c 97 02 92 f3 f5 d8 55 43 01 a4 3d 70 29 56 0a .......UC..=p)V. 00:20:33.976 00000070 4f a0 0a 5b 3d dd 3c 7b 28 30 54 25 f8 15 48 e8 O..[=.<{(0T%..H. 00:20:33.976 00000080 bc ad 1d d1 37 b0 97 72 b1 da 6b 19 e8 f3 ca 3a ....7..r..k....: 00:20:33.976 00000090 53 2d 2c ed 5d 81 41 e0 af c8 c3 a3 8c db 3f 12 S-,.].A.......?. 00:20:33.976 000000a0 cb cf f3 69 dc 66 ef da cf 0c 1c f3 ac b2 f4 cb ...i.f.......... 00:20:33.976 000000b0 ab e2 57 4f a1 4c a3 8c 63 1e be 2e 53 da c8 0c ..WO.L..c...S... 00:20:33.976 000000c0 56 d0 d6 83 7d 3c 57 a0 08 3d c9 4f 77 05 22 bb V...}PAs...\.!.eK 00:20:33.976 000000f0 ac a0 e1 fb 52 86 ed b2 cf fb 42 fb 3f e9 5c 0a ....R.....B.?.\. 00:20:33.976 00000100 1d 37 0c 52 54 9e f7 45 ad 20 ab 74 74 18 77 c6 .7.RT..E. .tt.w. 00:20:33.976 00000110 1b d7 b7 7f 27 0b 81 92 2c 40 0b 1e fd 58 da 9a ....'...,@...X.. 00:20:33.976 00000120 48 3c 05 57 ea d0 29 aa 88 da 89 d4 c3 0c 98 e6 H<.W..)......... 00:20:33.976 00000130 b8 e9 d2 97 fe 38 7e 55 ae 0f 32 15 90 63 94 31 .....8~U..2..c.1 00:20:33.976 00000140 df 16 f2 bf 76 15 76 cb 2b b0 d4 75 0d 8c 7b 5e ....v.v.+..u..{^ 00:20:33.976 00000150 70 4b 11 6c f7 6a 4b 41 ca a7 e3 e8 07 8e 53 0b pK.l.jKA......S. 00:20:33.976 00000160 19 6c 03 f0 62 0e f9 ed 0a 30 fa e1 de 26 7b 88 .l..b....0...&{. 00:20:33.976 00000170 11 7a df 61 4f 09 3d ac 82 c4 b9 2e ad ef b6 d5 .z.aO.=......... 00:20:33.976 00000180 fc ff 11 ea 89 fd cc 06 49 7b c0 e9 97 d8 b2 63 ........I{.....c 00:20:33.976 00000190 16 68 62 6e 1c 33 a9 dc 59 7b 93 70 c7 36 45 8e .hbn.3..Y{.p.6E. 00:20:33.976 000001a0 59 99 c3 79 0a 6b 6a 16 e9 4a dd 81 0c fa 6c 8a Y..y.kj..J....l. 00:20:33.976 000001b0 fb e1 ac f8 89 5b ac 24 3c 52 e8 bf e2 b8 92 6d .....[.$.Gj.Kz....' 00:20:33.976 00000250 a5 a7 2f 90 bc 33 c5 8c 97 7e 08 c3 fa 39 61 91 ../..3...~...9a. 00:20:33.976 00000260 2b 68 34 7b 62 2b 40 05 6e 43 86 b1 0d e7 6c f6 +h4{b+@.nC....l. 00:20:33.976 00000270 11 80 4f c4 80 62 8f 04 45 47 e6 da d7 a6 93 c9 ..O..b..EG...... 00:20:33.976 00000280 83 ef 3b f1 05 6b dc 2d 87 94 1b 6e b7 90 3c 34 ..;..k.-...n..<4 00:20:33.976 00000290 51 9e c7 66 5b f1 1d 77 ef 25 59 c6 51 eb 7c c7 Q..f[..w.%Y.Q.|. 00:20:33.976 000002a0 a5 ff d4 2a ef 7e 92 23 63 44 70 75 7d f6 9c e4 ...*.~.#cDpu}... 00:20:33.976 000002b0 2c 06 d3 bb 09 7a 23 6a 62 e9 a6 49 9a fc 4f 09 ,....z#jb..I..O. 00:20:33.977 000002c0 66 fc b5 bb 27 8e b8 05 29 38 a0 c1 9c bc 36 ee f...'...)8....6. 00:20:33.977 000002d0 74 26 f9 7e 83 68 53 88 c0 5e 67 75 02 76 4b ad t&.~.hS..^gu.vK. 00:20:33.977 000002e0 77 4d d6 c4 26 05 93 21 09 92 b0 0a bb 2b e5 aa wM..&..!.....+.. 00:20:33.977 000002f0 dc 5c af 6a 14 0f e4 c7 7b c2 87 a0 5a 09 a3 e8 .\.j....{...Z... 00:20:33.977 dh secret: 00:20:33.977 00000000 d3 fd 98 dc 5c e6 3b d0 f0 ec 34 ad ed 50 cf b3 ....\.;...4..P.. 
00:20:33.977 00000010 be a8 b1 59 34 42 19 7e 60 05 f1 2d d0 4f 0e a0 ...Y4B.~`..-.O.. 00:20:33.977 00000020 de 1b a5 58 fa 40 0e fd df 49 c3 b7 e6 0e 67 1c ...X.@...I....g. 00:20:33.977 00000030 4e ee ae 16 2d 53 1b 11 6f 80 22 14 94 13 1a 5a N...-S..o."....Z 00:20:33.977 00000040 1b bd 50 ef 61 05 b3 82 50 4f 10 1e c3 bf fc 01 ..P.a...PO...... 00:20:33.977 00000050 20 c2 e8 0a 64 6f 78 bb 30 97 da 22 6d 2d ae d9 ...dox.0.."m-.. 00:20:33.977 00000060 b7 61 24 e0 48 f1 5f c6 46 da 71 f7 8f 18 ed 69 .a$.H._.F.q....i 00:20:33.977 00000070 d1 a8 eb f6 45 7f 50 28 e1 18 11 ac 54 1e c7 41 ....E.P(....T..A 00:20:33.977 00000080 31 ba 5a f0 c2 68 a3 21 5a 55 b0 b0 bc f9 e3 dc 1.Z..h.!ZU...... 00:20:33.977 00000090 d9 68 a0 c3 da 2b 0a 0b b9 69 3c d0 7c b6 d7 25 .h...+...i<.|..% 00:20:33.977 000000a0 e8 00 a6 71 e8 78 16 57 6c 1a 0c ce 75 a2 59 05 ...q.x.Wl...u.Y. 00:20:33.977 000000b0 45 62 a2 e4 d6 1b a0 41 2c 52 52 fb 22 76 9c 7e Eb.....A,RR."v.~ 00:20:33.977 000000c0 77 95 58 45 07 8b 5a c1 36 d8 8e 02 c8 c7 ea 07 w.XE..Z.6....... 00:20:33.977 000000d0 ab 7d b7 e5 ff 57 90 00 78 2a a8 7b fd 16 c2 4f .}...W..x*.{...O 00:20:33.977 000000e0 1f 51 dc 94 a1 11 ba ad ad d6 a3 1f a4 bd 10 75 .Q.............u 00:20:33.977 000000f0 ab 7c 49 04 86 a2 6d 2e 04 36 e6 7c 5c f0 e9 82 .|I...m..6.|\... 00:20:33.977 00000100 be 24 ed 9d 04 85 fc 44 2b 8f 95 b9 26 0f 9a e8 .$.....D+...&... 00:20:33.977 00000110 9d 3e 33 7f 1c 44 a5 9d 20 cd 94 a5 7e 32 dc 3c .>3..D.. ...~2.< 00:20:33.977 00000120 25 4e d0 bd 0b 9f 03 77 46 b6 b9 34 d9 f3 93 c5 %N.....wF..4.... 00:20:33.977 00000130 bd 7d b5 11 1e 22 c4 75 da b1 fd b0 fd 76 46 b9 .}...".u.....vF. 00:20:33.977 00000140 82 32 bf 56 25 59 0c 84 25 77 32 4e b5 e4 d9 a6 .2.V%Y..%w2N.... 00:20:33.977 00000150 c6 90 c3 dd 52 01 49 b5 8c 60 13 74 ab 52 b4 0c ....R.I..`.t.R.. 00:20:33.977 00000160 ad 18 3c 13 a3 cc 0f 95 8c 0b 7d 77 78 b0 ea 77 ..<.......}wx..w 00:20:33.977 00000170 c4 1a b3 db 6c 94 d8 d5 0f d3 4f 51 78 6c 56 10 ....l.....OQxlV. 00:20:33.977 00000180 02 68 e1 36 b4 1a a3 10 69 72 34 c8 af c0 74 fc .h.6....ir4...t. 00:20:33.977 00000190 08 91 0a 42 d0 72 27 9e c5 e3 0c 1c 90 a8 5d 0f ...B.r'.......]. 00:20:33.977 000001a0 7d 10 aa e0 1f af 32 ac 80 49 b5 7d 9d 9e 38 6d }.....2..I.}..8m 00:20:33.977 000001b0 db 11 17 da 02 c3 8d 23 a5 29 8a 87 5a d3 28 71 .......#.)..Z.(q 00:20:33.977 000001c0 46 fc f8 a3 c9 fc 2b b0 3c 2e 2c 4f f2 f2 59 67 F.....+.<.,O..Yg 00:20:33.977 000001d0 dd a7 c2 20 9d 3d 88 1a 62 08 16 ea cf da 3b d5 ... .=..b.....;. 00:20:33.977 000001e0 98 4b bb e7 f3 ed 57 58 3c 00 3d c9 3b 8c bb 8d .K....WX<.=.;... 00:20:33.977 000001f0 6c 5e c8 28 3d f2 d0 4b 6b 7a 8c d8 90 78 b4 13 l^.(=..Kkz...x.. 00:20:33.977 00000200 6b 84 db 16 eb f6 d1 24 da de e7 ff d6 3c c6 fd k......$.....<.. 00:20:33.977 00000210 e2 d4 d5 50 d0 9e f3 9f 31 0d 61 65 49 ec 07 bd ...P....1.aeI... 00:20:33.977 00000220 41 15 bf 76 df c3 5d d5 7d 2e 3e bd 0a 6b 8b 19 A..v..].}.>..k.. 00:20:33.977 00000230 d8 49 58 97 bb b8 c8 53 c6 70 30 cf ef 83 6b 85 .IX....S.p0...k. 00:20:33.977 00000240 51 d2 ca fd 68 b7 cd 16 e1 79 77 cc 9d 9d 10 f1 Q...h....yw..... 00:20:33.977 00000250 10 1d bf 65 a5 4c 31 0f 30 a7 a3 16 ae df 98 8b ...e.L1.0....... 00:20:33.977 00000260 fc 2c 1c 85 e2 c5 2b bb fe f6 21 9a c3 0f 8a 72 .,....+...!....r 00:20:33.977 00000270 c7 22 b1 54 68 2c 95 e6 21 05 e4 32 ba b7 07 c9 .".Th,..!..2.... 00:20:33.977 00000280 dc 8f 0d 33 23 f6 62 53 06 d8 e3 02 f7 9e 9a a7 ...3#.bS........ 
00:20:33.977 00000290 c2 dc bc 08 f5 bb 04 e0 a8 c0 9f ee 46 2a ce 2b ............F*.+ 00:20:33.977 000002a0 c9 9f a4 cc 1c 80 af 79 d6 b7 be 4b 1d f5 0e cd .......y...K.... 00:20:33.977 000002b0 18 a2 30 f2 cb 06 ef dd 27 e1 76 a3 3e f8 04 57 ..0.....'.v.>..W 00:20:33.977 000002c0 06 51 2a 6c 59 68 3b c1 cf 0f 36 44 3f 25 40 5a .Q*lYh;...6D?%@Z 00:20:33.977 000002d0 62 c1 da c3 86 64 45 fe b4 ba 90 78 d7 71 b7 dd b....dE....x.q.. 00:20:33.977 000002e0 bd 43 b0 5d b6 8f da 89 18 75 33 e6 b4 40 54 3e .C.].....u3..@T> 00:20:33.977 000002f0 37 74 19 23 1d 79 16 cf 50 19 ea 16 a9 f6 f9 23 7t.#.y..P......# 00:20:33.977 [2024-09-27 15:25:30.057252] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=3, dhgroup=4, seq=3428451827, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.977 [2024-09-27 15:25:30.094316] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.977 [2024-09-27 15:25:30.094376] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.977 [2024-09-27 15:25:30.094391] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.977 [2024-09-27 15:25:30.094416] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.977 [2024-09-27 15:25:30.094427] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.977 [2024-09-27 15:25:30.200664] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.977 [2024-09-27 15:25:30.200683] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.977 [2024-09-27 15:25:30.200691] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.977 [2024-09-27 15:25:30.200701] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.977 [2024-09-27 15:25:30.200758] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.977 ctrlr pubkey: 00:20:33.977 00000000 d4 f4 ea 2e 63 ed b9 0e fa b4 a6 54 35 08 62 9d ....c......T5.b. 00:20:33.977 00000010 4e 29 19 d0 71 de 47 da 7f 2a 62 59 05 5a 45 36 N)..q.G..*bY.ZE6 00:20:33.977 00000020 8a 02 ba cb 1a 52 f3 46 f0 24 9d 28 8c bc 99 0f .....R.F.$.(.... 00:20:33.977 00000030 ea 07 d5 15 77 2d 0d 0d 89 2b 1f 19 aa 17 1c 0a ....w-...+...... 00:20:33.977 00000040 cf e5 42 b2 c0 7e fe fb b7 03 fc ea 77 62 b4 f6 ..B..~......wb.. 00:20:33.977 00000050 73 6c 96 7f 05 41 0f 20 22 57 14 73 51 7a 84 c5 sl...A. "W.sQz.. 00:20:33.977 00000060 07 5c 5e e7 b0 93 aa 9f fa b1 98 d2 ff 4b 85 d6 .\^..........K.. 00:20:33.977 00000070 09 86 52 17 61 cb e7 ec 5f 16 43 4a 09 53 b3 90 ..R.a..._.CJ.S.. 00:20:33.977 00000080 cf ae ec 1d 0a 90 2f 9a 98 7a 20 f7 3c 45 e2 91 ....../..z .>6r^4.h..... 
00:20:33.977 000000f0 8b 31 93 9b 97 2a 7c 1b 0f 62 37 c7 de 08 66 48 .1...*|..b7...fH 00:20:33.977 00000100 70 78 e5 a8 23 d8 c1 02 68 ad 36 92 96 ff dd 48 px..#...h.6....H 00:20:33.977 00000110 de 26 d2 f5 3e 00 5d 1f 76 df 40 e8 ba dc d9 51 .&..>.].v.@....Q 00:20:33.977 00000120 bf d7 dd d9 5d 5d 49 cd ab 47 ed 93 f1 12 49 7f ....]]I..G....I. 00:20:33.977 00000130 0d 46 dc 1a 47 7b e6 2f ea 55 b1 eb 56 fe 93 44 .F..G{./.U..V..D 00:20:33.977 00000140 35 75 77 a5 ec a6 76 67 99 97 7c 65 c8 0a f0 a7 5uw...vg..|e.... 00:20:33.977 00000150 20 0f 1c 35 39 c0 6e a3 ed 56 b6 72 6e 06 33 ba ..59.n..V.rn.3. 00:20:33.977 00000160 d1 c8 f9 88 f8 0b 9e 2e 1b 53 e9 fa f3 4a 02 b0 .........S...J.. 00:20:33.977 00000170 31 91 83 ec 3f 26 21 c5 60 cd 34 72 24 b0 23 7e 1...?&!.`.4r$.#~ 00:20:33.977 00000180 53 a9 f1 6e 90 c5 99 bd a4 45 2c 67 47 61 5b f5 S..n.....E,gGa[. 00:20:33.977 00000190 d0 b9 25 f1 bc 61 7a 66 f1 c3 b2 22 48 70 ff 12 ..%..azf..."Hp.. 00:20:33.977 000001a0 78 a1 00 15 4d c8 6f e0 12 8c 08 c3 63 8c b8 24 x...M.o.....c..$ 00:20:33.977 000001b0 7f 1d 1f a1 15 9a ea 37 81 df 2c 15 a1 eb 78 5f .......7..,...x_ 00:20:33.977 000001c0 67 83 df 50 e5 38 9a c9 8c b6 fc f5 3a a5 39 68 g..P.8......:.9h 00:20:33.977 000001d0 ed 87 1a 97 b0 32 0b 55 a5 5d 9d 58 33 47 a3 40 .....2.U.].X3G.@ 00:20:33.977 000001e0 5d c5 12 92 95 72 0b 78 27 de df 96 6d ac 7f 82 ]....r.x'...m... 00:20:33.977 000001f0 ae de 66 f1 31 57 ad 8e 32 34 fc c4 e7 e1 ee eb ..f.1W..24...... 00:20:33.977 00000200 37 8d 3a 5f f4 c7 65 1f 51 03 ff 0f 1b ea 1c 68 7.:_..e.Q......h 00:20:33.977 00000210 b2 b6 ac 74 f3 1d 58 6b 7a dc 80 8d f2 26 7d 26 ...t..Xkz....&}& 00:20:33.977 00000220 59 d1 31 79 2a f5 b0 6e e3 65 7a b7 00 16 7b 8f Y.1y*..n.ez...{. 00:20:33.977 00000230 1a 60 85 e9 ab bc 5a 71 1f d7 55 94 98 6d d8 fd .`....Zq..U..m.. 00:20:33.977 00000240 50 20 fc d0 d6 af e2 19 0c ef ee a6 1b 0a 5d cc P ............]. 00:20:33.977 00000250 cb 9e 5a d2 f3 c0 83 30 81 8f 13 83 e4 a3 46 53 ..Z....0......FS 00:20:33.977 00000260 35 25 11 be 37 b2 c6 ad 72 95 7f 9e d2 d6 61 d6 5%..7...r.....a. 00:20:33.977 00000270 46 1d f8 c8 e5 e1 92 05 e3 4d a3 bc 53 c3 fa 6b F........M..S..k 00:20:33.977 00000280 4a 1f 3d c0 8d 7f 24 d5 ff 58 c8 d6 23 ba 64 8f J.=...$..X..#.d. 00:20:33.977 00000290 af 4b c0 1c 64 bd f8 48 f0 d2 e4 97 2c 45 94 5c .K..d..H....,E.\ 00:20:33.977 000002a0 62 32 bc b4 19 28 10 f5 86 be b8 66 89 30 37 4b b2...(.....f.07K 00:20:33.977 000002b0 ad 75 30 0a 9d 11 e3 33 7f a7 1a 26 1b 31 71 b2 .u0....3...&.1q. 00:20:33.977 000002c0 1a 0e b7 55 91 00 87 f6 4c 76 ed 5f b6 40 dd 12 ...U....Lv._.@.. 00:20:33.977 000002d0 31 7b d2 7b 8e 6c e1 38 9b 20 f2 82 7d ee 8b 9d 1{.{.l.8. ..}... 00:20:33.977 000002e0 24 5a d8 ae e2 a9 bb 53 f1 6b 0a 3a a9 06 e4 03 $Z.....S.k.:.... 00:20:33.977 000002f0 ec cb dc 8c a1 45 5b ca 32 10 c8 1a fb c0 aa 1a .....E[.2....... 00:20:33.977 host pubkey: 00:20:33.977 00000000 5d 84 d0 49 fd e3 e7 38 12 f0 19 0e 68 07 aa b9 ]..I...8....h... 00:20:33.977 00000010 c2 9f 89 d1 b4 70 03 b8 c8 b3 46 4b f2 10 ec ef .....p....FK.... 00:20:33.977 00000020 15 31 0b bf 53 35 cd e7 34 2b 5f 6a 28 64 b0 78 .1..S5..4+_j(d.x 00:20:33.977 00000030 32 38 a9 db c0 33 d8 97 c8 ba e0 f1 b8 94 8a 0c 28...3.......... 00:20:33.977 00000040 3a e2 6e f4 3a 05 d1 d7 88 5b 76 24 4c b1 5a 05 :.n.:....[v$L.Z. 
00:20:33.977 00000050 1e 39 92 80 79 60 ad e9 c4 96 aa 2d 54 1c fc 33 .9..y`.....-T..3 00:20:33.977 00000060 fb 0e 47 e1 b5 f8 9a 4e ff d0 2d 2e 8a b7 0a 7e ..G....N..-....~ 00:20:33.977 00000070 2a fc b3 4c dd 99 6e b9 5f 87 4e bd e4 68 47 af *..L..n._.N..hG. 00:20:33.977 00000080 b0 81 3a c9 f8 3d 9e 95 25 d3 57 79 dd ee 95 aa ..:..=..%.Wy.... 00:20:33.977 00000090 13 b9 82 92 bc 67 dc e2 da c5 ea 61 3d a6 e9 73 .....g.....a=..s 00:20:33.978 000000a0 07 e8 b6 55 bd 7d e3 ba 2d a5 a6 2a 2e 96 d9 9b ...U.}..-..*.... 00:20:33.978 000000b0 8e 8a 98 bb 38 a3 bc c7 47 be fb 50 87 63 16 c3 ....8...G..P.c.. 00:20:33.978 000000c0 dc 78 94 eb 70 ca c5 b6 d9 df b5 ea 7d c7 e6 f6 .x..p.......}... 00:20:33.978 000000d0 db d5 ef fc 54 6c 17 10 59 d9 7e 54 f9 4e ed 09 ....Tl..Y.~T.N.. 00:20:33.978 000000e0 26 65 e3 69 a5 31 dd ef 46 c8 b2 9b 27 5d 50 8f &e.i.1..F...']P. 00:20:33.978 000000f0 2c 41 8e f9 61 a2 36 8e 0f 43 3c d5 c0 76 e9 5d ,A..a.6..C<..v.] 00:20:33.978 00000100 50 b7 e9 ea d1 e8 d2 83 95 26 86 c3 3e 73 5a 59 P........&..>sZY 00:20:33.978 00000110 79 20 30 bd 73 02 13 a2 54 30 9b a7 ff 4d 13 66 y 0.s...T0...M.f 00:20:33.978 00000120 88 73 10 39 3e 21 f0 13 72 e7 08 6b 7d 9d 91 06 .s.9>!..r..k}... 00:20:33.978 00000130 38 b3 36 58 0d 67 8a 92 7c 3d d9 f0 40 2a 2e d5 8.6X.g..|=..@*.. 00:20:33.978 00000140 04 99 ce 18 80 28 39 7f 79 49 7b 79 bb ff 35 a6 .....(9.yI{y..5. 00:20:33.978 00000150 ce 65 38 f1 62 78 90 e5 8a df 9a 8a c2 51 e7 2c .e8.bx.......Q., 00:20:33.978 00000160 9b cf 0b d7 0c e2 f0 e5 fe 79 fd 90 0f 6e 0f 5f .........y...n._ 00:20:33.978 00000170 5b 6e 0a 1f ca 47 43 ba ca d5 5e 19 3a 42 3d 9b [n...GC...^.:B=. 00:20:33.978 00000180 b1 fb cc 70 a2 ed 83 06 84 fa bb 30 5e 75 ac 5a ...p.......0^u.Z 00:20:33.978 00000190 96 56 6a c3 e0 6c bf 62 88 ee f8 b1 a7 d4 db 57 .Vj..l.b.......W 00:20:33.978 000001a0 7b b7 92 cb a7 e4 27 c0 20 7e 3e a6 9d f0 7d a7 {.....'. ~>...}. 00:20:33.978 000001b0 69 30 d2 ee cb b4 34 f6 c3 3b cf 5f d2 42 69 de i0....4..;._.Bi. 00:20:33.978 000001c0 a7 7d 65 09 00 49 5f 6e 93 2d 95 06 93 3e c7 e2 .}e..I_n.-...>.. 00:20:33.978 000001d0 b5 10 83 dc f4 7b 7b 8c 30 4d 10 93 57 0b 10 22 .....{{.0M..W.." 00:20:33.978 000001e0 86 6c 85 7e 77 4b 89 75 83 72 75 1f b2 e2 c2 3d .l.~wK.u.ru....= 00:20:33.978 000001f0 42 d0 86 e9 19 80 67 ac 07 e3 b4 6b 8f 81 ee f5 B.....g....k.... 00:20:33.978 00000200 d0 5d 0f ea d7 59 93 eb cc 7b bc 03 14 16 ed 91 .]...Y...{...... 00:20:33.978 00000210 22 dc 55 c0 97 df d7 89 23 1a 4f 5b 5d 0b 21 55 ".U.....#.O[].!U 00:20:33.978 00000220 94 03 4c 4a 12 2c 10 45 fb 21 1b b8 95 40 41 95 ..LJ.,.E.!...@A. 00:20:33.978 00000230 0b 27 c9 30 80 c2 83 3e 7c 38 18 77 3c 2e 08 f4 .'.0...>|8.w<... 00:20:33.978 00000240 fc e9 30 ec 9c 50 19 d4 48 af 0f 8a 67 0b fc 1a ..0..P..H...g... 00:20:33.978 00000250 ac 9e fb 23 31 22 14 d6 9c 14 80 be 7f 3d 3a 93 ...#1".......=:. 00:20:33.978 00000260 a5 5a 50 2c 74 79 90 e6 e9 9c bb 7a bf 3f e7 21 .ZP,ty.....z.?.! 00:20:33.978 00000270 23 35 40 15 b7 02 53 e9 f1 e6 29 7f 6b 3a c9 e0 #5@...S...).k:.. 00:20:33.978 00000280 d5 6d 27 42 92 ad 6b 66 ac e7 8d e3 48 4a 38 33 .m'B..kf....HJ83 00:20:33.978 00000290 50 a4 95 c4 0b 6d 50 10 4b 01 b8 a7 3b 0a c6 3e P....mP.K...;..> 00:20:33.978 000002a0 e8 80 e8 42 e1 3b 04 be 90 2f 5d b7 60 d5 e2 1d ...B.;.../].`... 00:20:33.978 000002b0 05 d3 71 a6 6b 8d 99 9f fd 8d e4 de 6e 83 d8 39 ..q.k.......n..9 00:20:33.978 000002c0 f6 82 5e b1 d9 18 f2 87 b2 46 9b f3 a4 d6 fb 8b ..^......F...... 
00:20:33.978 000002d0 07 e7 b1 88 f4 35 21 38 e1 1a a0 4c 02 57 71 06 .....5!8...L.Wq. 00:20:33.978 000002e0 d0 be 4c 13 36 be db 07 4c fc 23 9b bf 4f a5 c6 ..L.6...L.#..O.. 00:20:33.978 000002f0 ad c6 d9 25 66 b3 72 89 06 e0 ae 95 5c 8b c3 2e ...%f.r.....\... 00:20:33.978 dh secret: 00:20:33.978 00000000 0a 03 6b ff 7a 14 69 fd f6 49 f8 e5 f5 59 6c 7c ..k.z.i..I...Yl| 00:20:33.978 00000010 4a 58 30 77 25 37 24 8b e1 28 fb 40 11 b7 1f e3 JX0w%7$..(.@.... 00:20:33.978 00000020 48 97 70 63 72 f0 94 e0 3a cd 0e 58 41 de b7 1d H.pcr...:..XA... 00:20:33.978 00000030 df e0 8b f2 1a 14 2c 72 bc 54 df d0 9b 89 47 b3 ......,r.T....G. 00:20:33.978 00000040 5f 44 2c 03 f0 96 eb 47 66 6c 8a d6 0c 3d f1 c8 _D,....Gfl...=.. 00:20:33.978 00000050 0b a1 f2 63 4d fb b4 1e 48 5b 37 2a 61 27 a2 44 ...cM...H[7*a'.D 00:20:33.978 00000060 bc 6f b7 75 14 f2 d4 36 2e 57 e7 d0 4e ca 1d 00 .o.u...6.W..N... 00:20:33.978 00000070 c3 b5 50 81 10 99 68 7b 88 f5 3f 24 d1 2f 5c 1d ..P...h{..?$./\. 00:20:33.978 00000080 b5 62 44 2b c7 32 94 85 d1 e3 00 b4 5c a4 5e ac .bD+.2......\.^. 00:20:33.978 00000090 2b 8c 2e 29 c6 85 3b e4 e1 99 91 8e 67 b2 e3 18 +..)..;.....g... 00:20:33.978 000000a0 d6 a8 4c 22 bf 63 1a ff 5e d9 80 70 d5 0e 06 0a ..L".c..^..p.... 00:20:33.978 000000b0 ea e3 0c a4 53 f8 7f 2b 5b a0 85 56 ef 36 8f 04 ....S..+[..V.6.. 00:20:33.978 000000c0 63 0e 8e a6 a2 d5 5b e9 f8 70 e6 6f 03 9e ca 87 c.....[..p.o.... 00:20:33.978 000000d0 4d 5e a0 88 8e e2 0d bb d6 59 8d e8 1b 57 fb 59 M^.......Y...W.Y 00:20:33.978 000000e0 a3 30 94 41 0d 9c e5 2d 86 31 d1 c7 44 55 04 e5 .0.A...-.1..DU.. 00:20:33.978 000000f0 92 ad 93 67 50 70 ce 9f 6c 37 65 d0 34 c9 33 75 ...gPp..l7e.4.3u 00:20:33.978 00000100 fa c0 77 3f cb 33 10 8e 01 da 52 4a d7 5a 7e e6 ..w?.3....RJ.Z~. 00:20:33.978 00000110 24 24 12 5b 5d 56 27 de 0f 66 26 50 54 4a 35 d8 $$.[]V'..f&PTJ5. 00:20:33.978 00000120 f4 e9 96 4e 9e 5a 9b 1c 77 8e 92 63 5a b6 d9 1a ...N.Z..w..cZ... 00:20:33.978 00000130 45 5a d7 4b fb 6c 1e fa c0 52 cf c7 77 a5 d0 1e EZ.K.l...R..w... 00:20:33.978 00000140 0c b1 af a1 0f 5c 02 55 9d 84 ed 33 9e 82 b8 18 .....\.U...3.... 00:20:33.978 00000150 48 70 d1 6d 9a 0d dc dd a5 11 6e be a6 73 6d 65 Hp.m......n..sme 00:20:33.978 00000160 fd fd ac 33 42 00 c4 fe 25 cc bd 56 37 a3 ec 83 ...3B...%..V7... 00:20:33.978 00000170 2e 7b e6 51 ff 4f 5e 64 fb a2 b3 54 d0 ec 29 eb .{.Q.O^d...T..). 00:20:33.978 00000180 bd 17 ee 48 70 34 99 6e b9 34 12 a9 08 89 86 26 ...Hp4.n.4.....& 00:20:33.978 00000190 f1 e1 14 33 07 b3 3d 02 fe b2 46 37 10 16 c3 07 ...3..=...F7.... 00:20:33.978 000001a0 f9 43 f6 de 2a 9a 0d 4f 3f 0a 5c 0c 2b 80 d9 e5 .C..*..O?.\.+... 00:20:33.978 000001b0 46 4b aa 83 f5 d6 ec 40 ae a3 30 cf 6f 2c a4 86 FK.....@..0.o,.. 00:20:33.978 000001c0 f0 7a f0 d9 f2 2b 28 69 1f 1e 18 ce bb 80 15 ad .z...+(i........ 00:20:33.978 000001d0 fd e3 c7 85 c2 82 13 ee a1 85 69 8b 10 17 c0 25 ..........i....% 00:20:33.978 000001e0 51 c7 2e ef a0 76 d7 10 b3 4a cb bc e9 d5 85 11 Q....v...J...... 00:20:33.978 000001f0 8c ec f8 d3 c7 76 57 69 97 f1 bd aa 3e c7 76 d9 .....vWi....>.v. 00:20:33.978 00000200 bd 2c c7 fd 1d 70 ad 8f ce 8e e4 93 cf eb 08 4a .,...p.........J 00:20:33.978 00000210 8a 0e 32 e6 81 91 33 29 73 3c d1 8a 5d e9 24 fc ..2...3)s<..].$. 00:20:33.978 00000220 bc 33 d8 25 40 ae 6a 2f a9 bc b7 e3 ed 4a a5 ea .3.%@.j/.....J.. 
00:20:33.978 00000230 87 b0 f4 b6 e7 76 95 40 ea 2c c0 8d 95 53 27 35 .....v.@.,...S'5 00:20:33.978 00000240 1d b3 b0 d2 ff 63 ea e4 5e 24 00 bf f9 12 61 47 .....c..^$....aG 00:20:33.978 00000250 4f 73 8a bd 6f 39 ab b2 7a 16 5f 26 d4 dd 16 8b Os..o9..z._&.... 00:20:33.978 00000260 df 44 9a d4 a4 af df 2c 6b f2 9d 6c 5a 9b 59 c8 .D.....,k..lZ.Y. 00:20:33.978 00000270 a0 ac 1e 32 c8 99 f5 71 91 ee b6 bd 09 2d ce 6d ...2...q.....-.m 00:20:33.978 00000280 31 a1 46 60 c0 b6 be 98 09 22 05 a9 6c cd 5e 4f 1.F`....."..l.^O 00:20:33.978 00000290 91 fd 1a 42 6f 61 ba c8 57 80 1d 9f 5a b3 3d 2a ...Boa..W...Z.=* 00:20:33.978 000002a0 96 92 32 35 19 56 d6 f1 0d 5d 06 62 c9 f6 91 27 ..25.V...].b...' 00:20:33.978 000002b0 c7 85 4e 90 c4 8e 53 d5 16 7b 3f 61 77 e4 36 d0 ..N...S..{?aw.6. 00:20:33.978 000002c0 27 4c c4 fc 40 41 30 17 6e c2 ef 79 52 c2 03 11 'L..@A0.n..yR... 00:20:33.978 000002d0 66 37 a4 5c 78 75 24 9a 61 ab 65 93 7a d1 f7 09 f7.\xu$.a.e.z... 00:20:33.978 000002e0 91 e0 5e f7 86 ea d8 0d 9c a3 12 b5 66 6d 22 d9 ..^.........fm". 00:20:33.978 000002f0 4d 84 1c 0d 69 6e c2 09 27 80 6c 96 09 07 b1 59 M...in..'.l....Y 00:20:33.978 [2024-09-27 15:25:30.251001] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=3, dhgroup=4, seq=3428451828, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.978 [2024-09-27 15:25:30.251106] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.978 [2024-09-27 15:25:30.305394] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.978 [2024-09-27 15:25:30.305441] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.978 [2024-09-27 15:25:30.305451] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.978 [2024-09-27 15:25:30.305481] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.978 [2024-09-27 15:25:30.479597] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.978 [2024-09-27 15:25:30.479617] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.978 [2024-09-27 15:25:30.479625] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.978 [2024-09-27 15:25:30.479669] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.978 [2024-09-27 15:25:30.479692] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.978 ctrlr pubkey: 00:20:33.978 00000000 a0 ef b9 50 7f c0 7e d9 81 6f 7b 88 31 c2 8b d3 ...P..~..o{.1... 00:20:33.978 00000010 2b 68 97 21 06 90 33 cd 4d 75 00 43 3f f7 10 4b +h.!..3.Mu.C?..K 00:20:33.978 00000020 88 04 7f 3a 1c 22 de fb c6 0b 2a 29 50 41 e3 5c ...:."....*)PA.\ 00:20:33.978 00000030 c6 48 38 ca da e5 67 8f 98 11 5e 89 7b 9e 5e ff .H8...g...^.{.^. 
00:20:33.978 00000040 cf 6c aa 87 1c fe c6 4d 59 5b b1 8b 28 ae c0 42 .l.....MY[..(..B 00:20:33.978 00000050 fa 4b 7c 95 8a c5 1b 10 87 24 9b c8 06 25 ba 9a .K|......$...%.. 00:20:33.978 00000060 be e3 86 78 2f fb 49 30 67 21 95 b2 cb 70 83 8f ...x/.I0g!...p.. 00:20:33.978 00000070 e8 b5 3c 79 ea bf d9 7d 18 40 71 f3 18 81 77 bc ...6... 00:20:33.979 00000110 c0 e8 ae 0f fb e0 e5 a4 42 a3 2e 02 c1 d2 b0 f0 ........B....... 00:20:33.979 00000120 0b 0c ca df ab 17 c8 4c e7 12 0a 29 b4 00 73 29 .......L...)..s) 00:20:33.979 00000130 cc ba 7e eb cd 2c 30 d6 a8 88 4c 8e c4 78 1e b1 ..~..,0...L..x.. 00:20:33.979 00000140 ec cb 58 e3 30 ae 40 63 28 7e 70 87 7f be cb 0e ..X.0.@c(~p..... 00:20:33.979 00000150 16 4a 5b ae f8 93 97 2a 86 1e ed dd 25 03 30 04 .J[....*....%.0. 00:20:33.979 00000160 2d 3a a1 fe 3d 13 50 47 f2 91 d3 68 d2 75 be 70 -:..=.PG...h.u.p 00:20:33.979 00000170 52 ff a9 38 e1 84 18 b6 67 e3 84 3d d6 69 0d 9e R..8....g..=.i.. 00:20:33.979 00000180 c3 d4 b3 6c f3 b4 2b cc 93 f7 41 a4 8c 91 9a 07 ...l..+...A..... 00:20:33.979 00000190 de 26 67 b8 ae 87 30 97 9b cc 8d 33 8d 2d 35 e3 .&g...0....3.-5. 00:20:33.979 000001a0 76 c4 a3 74 4d 46 35 5e b7 fa c2 ad 43 2c d2 be v..tMF5^....C,.. 00:20:33.979 000001b0 5a 67 60 05 0e 06 aa 8d 0a ab 59 e4 d1 da fa 0c Zg`.......Y..... 00:20:33.979 000001c0 c5 5f cd c4 26 70 5d 72 ad 6a ab 75 bc d9 7d 88 ._..&p]r.j.u..}. 00:20:33.979 000001d0 2d ce 94 7a f8 bf c1 5a a0 7e 2c 6c 40 77 c2 6d -..z...Z.~,l@w.m 00:20:33.979 000001e0 97 0a 08 6d 6f a4 b6 52 f3 5d d5 c3 36 14 16 52 ...mo..R.]..6..R 00:20:33.979 000001f0 40 1b f3 f8 17 42 62 ef 5c 03 fb db 20 28 09 d6 @....Bb.\... (.. 00:20:33.979 00000200 c8 24 4a 3a ac 5f 58 02 a1 c9 bc d4 55 9e a0 08 .$J:._X.....U... 00:20:33.979 00000210 0e 34 a9 3a a6 c8 05 6a 35 6a e4 29 26 d6 a4 f0 .4.:...j5j.)&... 00:20:33.979 00000220 a8 56 d9 d6 94 31 63 9e 35 92 2b ae 93 c5 2a ed .V...1c.5.+...*. 00:20:33.979 00000230 8d a2 5e a1 e0 dd 99 9d bb 65 70 81 dc bb 96 3e ..^......ep....> 00:20:33.979 00000240 c9 28 71 60 47 5b 99 a0 81 c0 2f 85 b0 40 9a 4f .(q`G[..../..@.O 00:20:33.979 00000250 42 e7 49 a8 e6 99 59 94 5f 7a e2 4f 88 77 eb ef B.I...Y._z.O.w.. 00:20:33.979 00000260 df 8c 38 fb 2e c9 7f b6 77 f1 9f 07 96 ea 25 d7 ..8.....w.....%. 00:20:33.979 00000270 f9 fa 7a 64 9e b5 dc c8 08 d2 d4 1e 13 5a 82 56 ..zd.........Z.V 00:20:33.979 00000280 ae ac 89 5a 8c 23 20 39 54 dc 98 06 99 aa 39 2e ...Z.# 9T.....9. 00:20:33.979 00000290 b4 b2 7e a0 60 4f 9d e1 42 60 3a 99 ee af 8f c3 ..~.`O..B`:..... 00:20:33.979 000002a0 81 55 b8 ba 19 1f 8f cd 56 de e5 96 db 81 0b f7 .U......V....... 00:20:33.979 000002b0 56 56 94 c0 97 f7 66 c5 7c 5a b3 c2 db c0 89 d3 VV....f.|Z...... 00:20:33.979 000002c0 87 c7 64 40 2f bb 39 02 8c 8e 59 62 14 0f 9d 17 ..d@/.9...Yb.... 00:20:33.979 000002d0 13 61 72 c8 79 7e d1 42 43 54 4e 67 50 d0 02 06 .ar.y~.BCTNgP... 00:20:33.979 000002e0 8b 1b 08 b7 ef 7a ed 9f 0d 74 a2 1d 1d f2 76 f7 .....z...t....v. 00:20:33.979 000002f0 67 d3 85 4d 55 20 f5 c7 06 53 cc ab bb ba 9b 07 g..MU ...S...... 00:20:33.979 dh secret: 00:20:33.979 00000000 de 3a 2c 9b 37 ba d2 d7 24 34 37 86 6d 50 06 47 .:,.7...$47.mP.G 00:20:33.979 00000010 8d f0 1b 6a 98 b8 59 2c 37 2c 9c 5f eb 7e 59 d2 ...j..Y,7,._.~Y. 00:20:33.979 00000020 8d 2e e6 10 d6 85 0f 76 08 56 92 f1 a0 04 8e ac .......v.V...... 00:20:33.979 00000030 4d 25 22 5d a7 4d 1e eb b6 1e 29 7d e6 2e fb 08 M%"].M....)}.... 00:20:33.979 00000040 9c 0c 19 07 1e 1b 66 e2 68 05 ba 44 b4 5b ba e9 ......f.h..D.[.. 
00:20:33.979 00000050 8f d0 5b 9d e8 5c 72 8d 31 b5 ff 67 5e c1 20 5a ..[..\r.1..g^. Z 00:20:33.979 00000060 bb db d5 b1 00 65 85 51 76 b7 0b ec d3 84 d2 72 .....e.Qv......r 00:20:33.979 00000070 40 57 93 c8 58 cd 6c de 99 a2 e8 54 ef 90 95 25 @W..X.l....T...% 00:20:33.979 00000080 e0 6f ba f6 e2 19 cd e2 d9 10 41 6f c4 42 8c 94 .o........Ao.B.. 00:20:33.979 00000090 96 c8 88 a6 07 5b b3 b8 0a bf dc 3c d0 12 51 05 .....[.....<..Q. 00:20:33.979 000000a0 a5 5f 8b d6 f6 ee 0d bd 55 80 f0 30 4a ef 2c d8 ._......U..0J.,. 00:20:33.979 000000b0 b0 f0 90 21 a3 4b d0 53 54 96 92 57 8d 20 80 f2 ...!.K.ST..W. .. 00:20:33.979 000000c0 37 e7 12 6d 65 c1 b3 ee a7 a7 42 06 48 08 a8 7d 7..me.....B.H..} 00:20:33.979 000000d0 4f 5a 92 13 fb e3 50 1c c9 94 8d 86 51 04 dd dc OZ....P.....Q... 00:20:33.979 000000e0 a6 de 80 59 44 e3 4a 1a 08 1d 58 88 4d 49 1f 47 ...YD.J...X.MI.G 00:20:33.979 000000f0 53 9a 0f a6 b2 71 b6 4f 59 ba 02 ea 0c 6a 29 02 S....q.OY....j). 00:20:33.979 00000100 c4 e7 6c e7 cc fc 4b b4 0f 8f 44 0e db 1e a0 71 ..l...K...D....q 00:20:33.979 00000110 89 39 b7 ab 1d 05 f5 3d c2 48 c2 94 d7 a3 b3 66 .9.....=.H.....f 00:20:33.979 00000120 d4 cd 70 4d 00 89 5a 0f 25 71 4d 5c 3a 35 d6 50 ..pM..Z.%qM\:5.P 00:20:33.979 00000130 67 c7 aa a4 40 48 1c 7d b1 d0 8c bb 62 a7 ad cc g...@H.}....b... 00:20:33.979 00000140 18 f3 a8 ed b1 7d d6 d6 0b 8d 33 22 02 a5 fd 09 .....}....3".... 00:20:33.979 00000150 25 72 52 be d0 37 ea 88 f4 a3 93 c8 af 3c 04 27 %rR..7.......<.' 00:20:33.979 00000160 b6 88 6c df ef 6a d5 a9 e5 22 8a 4a a4 45 09 fb ..l..j...".J.E.. 00:20:33.979 00000170 33 1b 84 0c dd 79 4c ca 29 5f 3f 00 61 9c 86 9e 3....yL.)_?.a... 00:20:33.979 00000180 55 72 f5 3d ae 77 f9 3d 22 f5 b2 e7 97 bb ca e2 Ur.=.w.="....... 00:20:33.979 00000190 a1 6a d4 ba 0b 3c d7 c1 cd cb ba 5b 36 a3 e7 28 .j...<.....[6..( 00:20:33.979 000001a0 8f 54 12 28 77 02 b0 95 68 c3 3c 75 9d a4 eb b8 .T.(w...h.>r....Z3.H.. 00:20:33.979 00000270 01 22 2f d1 19 93 4c 58 fb c0 e8 83 d7 4a 88 63 ."/...LX.....J.c 00:20:33.979 00000280 a0 29 c3 1b bb 42 21 6b 1c 87 6e d8 d7 3a 1f 44 .)...B!k..n..:.D 00:20:33.979 00000290 94 db f9 d8 1c 7a d0 00 7e 2f 2c 7f 1d dd 73 e5 .....z..~/,...s. 00:20:33.979 000002a0 c7 cd 2e a1 67 22 62 67 1d c6 f1 11 5d b5 1d ad ....g"bg....]... 00:20:33.979 000002b0 38 60 1f 88 3f c0 d5 df ef 79 aa 6b 5a d4 74 e5 8`..?....y.kZ.t. 00:20:33.980 000002c0 c3 d1 85 aa 22 5f 9a 53 58 db a2 65 95 51 63 fe ...."_.SX..e.Qc. 00:20:33.980 000002d0 5c 65 7a 43 76 ca f8 0e 8e ec 52 9d 12 08 12 35 \ezCv.....R....5 00:20:33.980 000002e0 d6 e4 6f 04 cd c7 a6 69 67 a4 fb 09 a1 be 00 bd ..o....ig....... 00:20:33.980 000002f0 16 21 41 a6 6c fe db 3d 74 d9 13 7b 6d 8e b4 12 .!A.l..=t..{m... 
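Aside: the "ctrlr pubkey" / "host pubkey" / "dh secret" dumps above are the finite-field Diffie-Hellman exchange for the negotiated dhgroup 4 (ffdhe6144) reported in the negotiate records: each side raises the peer's public value to its own private exponent modulo the group prime, and both arrive at the same secret. A minimal Python sketch of that arithmetic, using a deliberately tiny toy prime and base in place of the real 6144-bit RFC 7919 ffdhe6144 group (illustration only, not the SPDK code path):

import secrets

P = 2**127 - 1   # toy Mersenne prime standing in for the 6144-bit ffdhe6144 prime
G = 3            # toy base standing in for the group generator g = 2

host_priv = secrets.randbelow(P - 2) + 2
ctrlr_priv = secrets.randbelow(P - 2) + 2

host_pub = pow(G, host_priv, P)    # would be dumped as "host pubkey"
ctrlr_pub = pow(G, ctrlr_priv, P)  # would be dumped as "ctrlr pubkey"

# Both directions yield the same value; that value is what appears as "dh secret".
assert pow(ctrlr_pub, host_priv, P) == pow(host_pub, ctrlr_priv, P)
print(hex(pow(ctrlr_pub, host_priv, P)))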
00:20:33.980 [2024-09-27 15:25:30.528588] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=3, dhgroup=4, seq=3428451829, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.980 [2024-09-27 15:25:30.564825] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.980 [2024-09-27 15:25:30.564865] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.980 [2024-09-27 15:25:30.564882] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.980 [2024-09-27 15:25:30.564902] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.980 [2024-09-27 15:25:30.564916] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.980 [2024-09-27 15:25:30.671308] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.980 [2024-09-27 15:25:30.671325] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.980 [2024-09-27 15:25:30.671332] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.980 [2024-09-27 15:25:30.671346] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.980 [2024-09-27 15:25:30.671403] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.980 ctrlr pubkey: 00:20:33.980 00000000 a0 ef b9 50 7f c0 7e d9 81 6f 7b 88 31 c2 8b d3 ...P..~..o{.1... 00:20:33.980 00000010 2b 68 97 21 06 90 33 cd 4d 75 00 43 3f f7 10 4b +h.!..3.Mu.C?..K 00:20:33.980 00000020 88 04 7f 3a 1c 22 de fb c6 0b 2a 29 50 41 e3 5c ...:."....*)PA.\ 00:20:33.980 00000030 c6 48 38 ca da e5 67 8f 98 11 5e 89 7b 9e 5e ff .H8...g...^.{.^. 00:20:33.980 00000040 cf 6c aa 87 1c fe c6 4d 59 5b b1 8b 28 ae c0 42 .l.....MY[..(..B 00:20:33.980 00000050 fa 4b 7c 95 8a c5 1b 10 87 24 9b c8 06 25 ba 9a .K|......$...%.. 00:20:33.980 00000060 be e3 86 78 2f fb 49 30 67 21 95 b2 cb 70 83 8f ...x/.I0g!...p.. 00:20:33.980 00000070 e8 b5 3c 79 ea bf d9 7d 18 40 71 f3 18 81 77 bc .....3a.mv!... 00:20:33.980 00000010 8a 3f 0c 29 19 b3 be 9c e7 35 3f 44 1a b8 dd 03 .?.).....5?D.... 00:20:33.980 00000020 70 a9 ce 90 f0 71 d9 09 83 bf 15 bb 13 f2 1d e4 p....q.......... 00:20:33.980 00000030 e6 ac 94 70 27 a7 cc 1b 10 e1 c7 00 cd da c8 47 ...p'..........G 00:20:33.980 00000040 57 6b b3 e3 b7 85 da 6e 76 b9 6f e3 0f e1 9c 46 Wk.....nv.o....F 00:20:33.980 00000050 52 de 6e fa 04 76 24 30 5e 4d 92 4f 51 58 04 9c R.n..v$0^M.OQX.. 00:20:33.980 00000060 cb 10 8c 5d 36 17 e7 2d 44 bb 0d f4 de b0 9b f0 ...]6..-D....... 00:20:33.980 00000070 28 0b 9b d0 d2 c7 f5 d4 c1 86 cb 2d e4 d7 d5 f4 (..........-.... 
00:20:33.980 00000080 9f e9 33 82 fe ad 73 b1 9c 89 13 ed 01 bb 1b 59 ..3...s........Y 00:20:33.980 00000090 cb b9 1a da 6c b0 12 3e 87 9e 0e 62 51 80 a1 4c ....l..>...bQ..L 00:20:33.980 000000a0 b0 0e 38 62 aa 95 f1 68 88 84 29 7c 98 d2 f0 55 ..8b...h..)|...U 00:20:33.980 000000b0 e9 bf 0d ff 72 dd aa 34 1a a3 fc 6b 38 01 f2 91 ....r..4...k8... 00:20:33.980 000000c0 91 f2 23 23 53 b5 aa 3f 0d c2 30 68 6f 3d 6e b3 ..##S..?..0ho=n. 00:20:33.980 000000d0 58 3d 7b 5d 37 97 c4 46 15 c1 d4 62 68 52 32 9e X={]7..F...bhR2. 00:20:33.980 000000e0 75 f3 07 39 50 4d 33 ff 0d 91 cb 01 84 b1 32 43 u..9PM3.......2C 00:20:33.980 000000f0 f4 76 e4 cb df d5 31 86 ae 8b ee 69 98 0b 20 41 .v....1....i.. A 00:20:33.980 00000100 b0 72 c0 f1 87 27 af a3 87 de e5 5b bd 83 16 b6 .r...'.....[.... 00:20:33.980 00000110 45 18 27 3b d0 e0 6c b6 69 d7 82 64 a8 2f b0 38 E.';..l.i..d./.8 00:20:33.980 00000120 85 29 79 25 85 7f 3f be f2 f4 f8 78 f4 63 0b 70 .)y%..?....x.c.p 00:20:33.980 00000130 bd 1e cb ab f1 17 a1 60 95 fd 19 3a 60 ce 4d 71 .......`...:`.Mq 00:20:33.980 00000140 a9 b4 b7 da 92 f3 f1 2e e0 ed b5 13 3d 65 16 22 ............=e." 00:20:33.980 00000150 9a 80 a1 78 cc 62 b4 cc 97 79 28 c7 4a 50 f9 86 ...x.b...y(.JP.. 00:20:33.980 00000160 82 67 87 bd 31 ad df f3 91 83 a6 a3 3c 2d 75 f4 .g..1.......<-u. 00:20:33.980 00000170 84 ca 3b 94 4e a2 bf 4d 88 c4 ee 37 0e 96 48 9c ..;.N..M...7..H. 00:20:33.980 00000180 b0 b0 fd 7c 08 2c c5 6a 02 e4 0a 77 22 e8 2f c6 ...|.,.j...w"./. 00:20:33.980 00000190 22 06 01 32 35 3e 6c d6 47 4a af 38 1a 27 6b 08 "..25>l.GJ.8.'k. 00:20:33.980 000001a0 27 5e c7 45 16 f4 47 e2 62 54 79 7b ab 9b 40 2d '^.E..G.bTy{..@- 00:20:33.980 000001b0 04 65 02 1b e6 03 18 58 51 b1 cc af 56 e6 48 a9 .e.....XQ...V.H. 00:20:33.980 000001c0 d4 20 b1 77 e2 99 fe 29 8d 1e a8 ce 26 7c 81 d6 . .w...)....&|.. 00:20:33.980 000001d0 af e4 e0 77 9a b5 2a f1 e2 95 24 6a 73 fb f1 97 ...w..*...$js... 00:20:33.980 000001e0 d7 5e 31 f8 3d eb 6c b6 f9 62 ca 17 cc b2 cf 64 .^1.=.l..b.....d 00:20:33.980 000001f0 cb cc b4 bf 9d 8e 80 6e 9e a6 82 38 74 47 e5 0d .......n...8tG.. 00:20:33.980 00000200 0f 1f a6 34 98 09 79 aa d7 6f c9 be 96 72 24 c8 ...4..y..o...r$. 00:20:33.980 00000210 60 c7 cd 1d bf 63 55 c3 32 8e d8 74 29 d5 8e 1f `....cU.2..t)... 00:20:33.980 00000220 da ce 5c 56 82 2c c1 7f 96 59 56 1d b9 6c fb c2 ..\V.,...YV..l.. 00:20:33.980 00000230 9e 05 d4 49 11 db 65 84 4d 0b ca 0f a1 74 e2 4b ...I..e.M....t.K 00:20:33.980 00000240 f6 d1 d5 94 c4 ba 42 e8 fe 12 b3 a2 e6 93 7f f8 ......B......... 00:20:33.980 00000250 71 05 58 23 05 dc 0b 0a 0b 11 6c ba 68 1d c5 fb q.X#......l.h... 00:20:33.980 00000260 72 a2 55 5e 07 3f c6 71 7c 3d c1 3e 75 0d 2c be r.U^.?.q|=.>u.,. 00:20:33.980 00000270 e5 42 f6 03 d2 fd 59 65 c8 42 75 51 b6 32 df 23 .B....Ye.BuQ.2.# 00:20:33.980 00000280 0b f7 54 e6 0f e7 d1 7d d1 f0 f1 26 cd 75 7f 8a ..T....}...&.u.. 00:20:33.980 00000290 51 13 18 9a df a0 86 89 4d 81 06 1a 07 60 3d df Q.......M....`=. 00:20:33.980 000002a0 ce b7 75 68 da e5 23 88 03 5d ae 04 3a 07 eb e3 ..uh..#..]..:... 00:20:33.980 000002b0 93 6f 76 c1 1e d6 df 6a ae 9d 86 c0 3b 57 d6 a3 .ov....j....;W.. 00:20:33.980 000002c0 f8 83 82 e1 3f a3 b0 b9 47 60 da 7d 19 a7 3d d5 ....?...G`.}..=. 00:20:33.980 000002d0 4c 32 b6 5d 78 2c cb 51 b3 1a 0a ca d7 85 c9 61 L2.]x,.Q.......a 00:20:33.980 000002e0 6d 57 2e 0a 6a 52 67 33 de 7b d2 1b 89 a7 b9 8c mW..jRg3.{...... 00:20:33.980 000002f0 d5 e1 d9 a1 2e 71 19 b1 cc 32 49 36 79 d9 43 ca .....q...2I6y.C. 
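Aside: a complete pubkey/secret dump in this trace runs from offset 00000000 through 000002f0, i.e. 0x300 = 768 bytes = 6144 bits, consistent with ffdhe6144. A minimal Python sketch (a stand-alone helper, not part of SPDK or this test; the two sample rows are copied from a dump above) that folds such offset-prefixed rows back into bytes so that length can be checked:

import re

# One row = 8-digit offset followed by up to 16 space-separated hex bytes at line start.
ROW = re.compile(r"^([0-9a-f]{8})((?: [0-9a-f]{2}){1,16})", re.MULTILINE)

def parse_hexdump(text):
    """Concatenate the hex-byte columns of every offset-prefixed row."""
    return b"".join(bytes.fromhex(row.replace(" ", "")) for _off, row in ROW.findall(text))

sample = """\
00000000 a7 1e ec 24 1c 7e 33 14 39 a9 85 e2 d8 5c 6d 9d
00000010 b2 9f 57 8e 5a ce 56 3c 7b 00 a8 6a f8 01 5b 18
"""
assert len(parse_hexdump(sample)) == 32
# A full dump (offsets 00000000..000002f0) would give 0x300 == 768 bytes == 6144 // 8.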
00:20:33.980 dh secret: 00:20:33.980 00000000 fe 35 58 36 77 3e dc 50 8e 44 7f 6e 75 28 5c 7b .5X6w>.P.D.nu(\{ 00:20:33.980 00000010 df 8b d8 e6 f8 2f 03 db 58 a3 9e 69 51 f5 6b 81 ...../..X..iQ.k. 00:20:33.980 00000020 d3 3e a3 58 1f e7 d7 c1 e0 45 11 0b 4d d8 c0 a1 .>.X.....E..M... 00:20:33.980 00000030 72 ce ce 9d 35 dd bf 90 82 91 49 e1 df b8 5f 7f r...5.....I..._. 00:20:33.980 00000040 33 4c 93 7c 0c 47 cb 42 c6 22 7b ec 4c 58 f6 f1 3L.|.G.B."{.LX.. 00:20:33.980 00000050 f9 7c ac 3a 8a 01 86 9f f6 13 55 80 35 d6 fb f9 .|.:......U.5... 00:20:33.980 00000060 7b ac a7 cc 83 76 be 1a ee 79 8c 02 e0 c2 4a 32 {....v...y....J2 00:20:33.980 00000070 8b bf 86 7e ae 6f d8 30 61 41 d4 56 60 8c d1 e2 ...~.o.0aA.V`... 00:20:33.980 00000080 64 84 8f bf 33 74 1a e6 0a e8 d4 9d 92 3e 0c bc d...3t.......>.. 00:20:33.981 00000090 c5 26 4d 81 f1 9d d0 54 69 a0 96 46 e1 91 d7 d9 .&M....Ti..F.... 00:20:33.981 000000a0 c4 2d eb 09 03 0b 14 d3 11 52 ff e6 23 e9 15 b4 .-.......R..#... 00:20:33.981 000000b0 ea 3f 79 69 7e 11 36 fa 2a a4 39 f4 5e 5f 05 1a .?yi~.6.*.9.^_.. 00:20:33.981 000000c0 05 41 e4 97 88 77 f3 c9 9f e9 db 76 47 d0 96 e5 .A...w.....vG... 00:20:33.981 000000d0 0b 2c 42 85 24 9d 57 21 6f 40 2a c4 75 97 dc 69 .,B.$.W!o@*.u..i 00:20:33.981 000000e0 10 ac 1b e7 f0 59 52 f7 8b 83 96 8c a4 f5 c1 78 .....YR........x 00:20:33.981 000000f0 d4 be 6c ff 2b b2 70 d8 e5 63 80 fd b1 09 88 0e ..l.+.p..c...... 00:20:33.981 00000100 d5 60 3b d9 9c 9b 6e 41 35 b9 33 ab 49 f1 b6 72 .`;...nA5.3.I..r 00:20:33.981 00000110 98 37 4a db 0c 35 9f 7b 43 25 70 29 9e a6 36 a4 .7J..5.{C%p)..6. 00:20:33.981 00000120 95 b7 3c 41 b0 88 99 08 42 b6 44 31 d1 c7 52 9d ..h......" 00:20:33.981 00000160 7e 09 9d 6e 56 1d 1c 11 45 21 bb c2 04 6e 59 31 ~..nV...E!...nY1 00:20:33.981 00000170 e2 0b 7f 7a f9 4e fa c5 f4 27 3a 9f ac c9 4a a8 ...z.N...':...J. 00:20:33.981 00000180 98 af 7c 94 1e 93 f1 a2 6b e6 16 6c e4 fb 78 04 ..|.....k..l..x. 00:20:33.981 00000190 4e b4 6e 43 f0 f6 d5 68 cf d2 d2 40 cb f9 06 3b N.nC...h...@...; 00:20:33.981 000001a0 99 a1 81 21 5b ff 54 03 8f 8f 65 80 06 92 67 36 ...![.T...e...g6 00:20:33.981 000001b0 9c 73 61 31 26 0b 40 8f 8d da 39 8f c9 04 f4 3f .sa1&.@...9....? 00:20:33.981 000001c0 e8 2c d3 1a 2c ca ac bf d8 b8 7a 63 54 db 99 a8 .,..,.....zcT... 00:20:33.981 000001d0 6a 7b 67 a4 d5 de 14 91 db 38 50 51 6d c9 09 19 j{g......8PQm... 00:20:33.981 000001e0 b0 51 65 03 5a 12 f3 5e 2a 38 f0 5b de b5 3b e9 .Qe.Z..^*8.[..;. 00:20:33.981 000001f0 78 eb 7f 83 74 40 03 e0 b9 78 7e 6b 24 2a 9c b8 x...t@...x~k$*.. 00:20:33.981 00000200 41 11 a9 64 a2 51 3b 28 4a 80 88 f2 8e 75 05 0a A..d.Q;(J....u.. 00:20:33.981 00000210 71 40 ee fe 95 93 49 2c 26 dc 1e 7d 89 b9 94 14 q@....I,&..}.... 00:20:33.981 00000220 b9 c3 74 0c dc 2d 5e c5 7f 66 f2 27 bc 7d 18 c6 ..t..-^..f.'.}.. 00:20:33.981 00000230 4e df 15 23 66 4e a3 3f 4e 58 6d 2d d9 1a 29 36 N..#fN.?NXm-..)6 00:20:33.981 00000240 39 c6 ea b3 1e 83 96 25 7c 4d 94 04 4c 8d 7a c7 9......%|M..L.z. 00:20:33.981 00000250 30 91 a8 ab f7 9a 4c e4 4d c4 9e d5 3e 89 74 96 0.....L.M...>.t. 00:20:33.981 00000260 b6 d4 68 49 35 12 81 4e a3 fd ad 0d d6 b6 38 05 ..hI5..N......8. 00:20:33.981 00000270 b8 07 37 19 e5 58 83 13 8b 1c df 2a ad a3 a1 34 ..7..X.....*...4 00:20:33.981 00000280 ee db 99 34 88 46 cd 48 99 1c 26 59 b9 cd 56 65 ...4.F.H..&Y..Ve 00:20:33.981 00000290 c8 77 3c 72 27 10 3d d7 69 f9 57 be 9e 87 e1 de .wslI4.y2t4c.U. 
00:20:33.981 000001d0 1e 98 3b 5f 49 2f af 66 ec 95 2c 0e 88 a6 88 7d ..;_I/.f..,....} 00:20:33.981 000001e0 9b 7c 0c 15 ec d1 c2 6e e9 8a e7 26 06 36 1a 47 .|.....n...&.6.G 00:20:33.981 000001f0 46 89 60 c8 68 98 64 f2 20 55 a6 04 c5 e6 9c 5f F.`.h.d. U....._ 00:20:33.981 00000200 35 ff d4 5e 81 50 7c 58 76 5f 0e 57 ed d5 f1 33 5..^.P|Xv_.W...3 00:20:33.981 00000210 bd 31 4a c8 7f 68 a1 1b 34 f7 75 89 76 05 07 5a .1J..h..4.u.v..Z 00:20:33.981 00000220 5f dd 86 f8 86 c9 ef 8a 57 d5 c0 96 09 a2 a8 8d _.......W....... 00:20:33.981 00000230 24 d8 b9 75 54 47 64 1a 37 5c e3 3c 23 c2 b9 71 $..uTGd.7\.<#..q 00:20:33.981 00000240 88 2c 04 9e 04 e9 b1 11 7d 84 76 18 df 0f 8b ce .,......}.v..... 00:20:33.981 00000250 ed 84 13 af 2f a9 5f b9 6f b7 7b 91 85 a6 c2 1b ..../._.o.{..... 00:20:33.981 00000260 7b 3d fe f9 0f 00 34 5b 23 75 21 cc 84 fd 70 07 {=....4[#u!...p. 00:20:33.981 00000270 86 ff 66 ed aa de fb 5d a6 b0 29 eb 58 67 87 ca ..f....]..).Xg.. 00:20:33.981 00000280 a5 1f d5 26 b4 63 92 6f f4 71 67 24 3b 7f fc 8c ...&.c.o.qg$;... 00:20:33.981 00000290 de 04 55 86 af 39 7b d6 3a f5 82 ad f8 6d 9b 5d ..U..9{.:....m.] 00:20:33.981 000002a0 41 5e 6f 33 80 e5 b1 95 7a b0 2e c8 65 ad b4 1f A^o3....z...e... 00:20:33.981 000002b0 53 9e 7f 60 26 4a e2 21 7f 75 bd fd 1b 02 7d db S..`&J.!.u....}. 00:20:33.981 000002c0 ad b9 dc 10 aa ff a7 a0 9e 1a ab 10 67 3a 5d 6b ............g:]k 00:20:33.981 000002d0 e9 b8 9b 55 06 a8 d9 ee 1c ee a7 a1 bc 2e e9 74 ...U...........t 00:20:33.981 000002e0 3b 01 bc 9d da b8 73 d2 27 62 e9 a4 c1 33 b2 8a ;.....s.'b...3.. 00:20:33.981 000002f0 0e 82 95 f1 1e a9 8c 0d 1c 44 b8 c6 a8 c4 bd 09 .........D...... 00:20:33.981 host pubkey: 00:20:33.981 00000000 9a e5 69 7e 7f e3 2f 80 ae fb fb 39 cb 46 32 4b ..i~../....9.F2K 00:20:33.981 00000010 36 d2 2e 9a 36 65 39 93 38 e1 d3 b8 5f 2b ef 86 6...6e9.8..._+.. 00:20:33.981 00000020 d6 71 1f 6f fa 36 f9 6a 37 1b 6a a3 51 d5 33 1c .q.o.6.j7.j.Q.3. 00:20:33.981 00000030 c8 a7 0d 90 be 3d 5d ae 6b b2 63 37 34 d0 db 22 .....=].k.c74.." 00:20:33.981 00000040 b2 b1 74 5b cc 0d 65 27 65 de b7 43 12 60 66 1a ..t[..e'e..C.`f. 00:20:33.981 00000050 31 18 15 9e af b9 a7 09 71 41 16 38 d9 54 0c 49 1.......qA.8.T.I 00:20:33.981 00000060 d4 ff c9 ad 1b ff 22 d7 56 c2 a5 c7 45 c1 11 e8 ......".V...E... 00:20:33.981 00000070 0c c6 60 fb 22 7b b6 2d 2d 51 74 40 9d 1d a4 f8 ..`."{.--Qt@.... 00:20:33.981 00000080 20 d1 e1 3d 6e 1e 04 bb a1 38 15 27 e9 d7 c9 14 ..=n....8.'.... 00:20:33.981 00000090 ed 6b 4f 9a 57 b8 2a 0e 85 fb c1 34 d5 81 26 b9 .kO.W.*....4..&. 00:20:33.981 000000a0 a7 af c5 32 1e 3c 95 6e d5 85 44 04 75 83 2a 60 ...2.<.n..D.u.*` 00:20:33.981 000000b0 d9 68 1c fb bd b2 91 9b 70 44 92 ce f3 c1 bf 34 .h......pD.....4 00:20:33.981 000000c0 6a 7f 60 af 3c c9 c9 94 57 d0 72 b2 6c 17 10 37 j.`.<...W.r.l..7 00:20:33.981 000000d0 14 63 7a 52 13 23 21 e4 15 1f fb a0 26 b8 55 5b .czR.#!.....&.U[ 00:20:33.981 000000e0 d4 c1 99 91 c0 6c f4 9f 06 92 41 b4 bf f0 95 40 .....l....A....@ 00:20:33.981 000000f0 bd 0a 92 42 51 9e 05 82 c1 cc d3 e1 ea b0 39 8f ...BQ.........9. 00:20:33.981 00000100 1d fc 78 5c 0c 69 33 ff d1 4c b2 9f a5 83 8b e4 ..x\.i3..L...... 00:20:33.981 00000110 2c fd 17 07 7f 42 f7 39 83 a8 96 7c 14 a4 09 3f ,....B.9...|...? 00:20:33.981 00000120 fa 2a 1a 35 11 8d 82 cc 4c 44 50 ab 74 17 5b 8a .*.5....LDP.t.[. 
00:20:33.981 00000130 31 b3 4a b5 31 23 12 1c ee 38 59 cc 16 45 93 47 1.J.1#...8Y..E.G 00:20:33.982 00000140 7e 4a 83 ea ac 25 c3 94 06 ba b5 a6 ea ca 69 47 ~J...%........iG 00:20:33.982 00000150 cb 14 e7 be 65 77 34 66 1a af af cb de 30 c7 f5 ....ew4f.....0.. 00:20:33.982 00000160 5c cf 28 88 5e cb 3e b8 b7 20 a1 d6 8d cb 64 07 \.(.^.>.. ....d. 00:20:33.982 00000170 0b 2b c4 0c bb bb 0a 8a 30 16 ba 61 41 de a0 16 .+......0..aA... 00:20:33.982 00000180 40 29 bd 4d 19 92 02 69 d0 33 4f e0 3b ce 63 c3 @).M...i.3O.;.c. 00:20:33.982 00000190 02 3d 14 30 08 a7 80 4f de 49 a1 c3 0f 05 1e 7c .=.0...O.I.....| 00:20:33.982 000001a0 10 f3 61 99 6a 9b 86 14 27 c4 38 60 67 98 9d a0 ..a.j...'.8`g... 00:20:33.982 000001b0 52 39 b3 a9 09 f9 7c a7 c1 17 ac ea a3 20 79 58 R9....|...... yX 00:20:33.982 000001c0 dd b5 6b 5d 3f 1f 63 5e b9 1f e6 80 03 d0 3b 30 ..k]?.c^......;0 00:20:33.982 000001d0 3b 8a b4 9d f6 69 71 e3 8d 4a ea 31 35 bd d7 ac ;....iq..J.15... 00:20:33.982 000001e0 4d 32 fe 13 20 40 19 df 4f e1 c5 12 55 44 10 8a M2.. @..O...UD.. 00:20:33.982 000001f0 fa 29 1a 2f 2c 79 e4 9b 3b a0 b2 5e 9c 72 93 c3 .)./,y..;..^.r.. 00:20:33.982 00000200 42 a7 29 4d eb 13 cc 02 79 a5 f5 bd 9f 8b 5a fb B.)M....y.....Z. 00:20:33.982 00000210 5e a1 e4 79 a9 45 4a 84 ee 35 48 f6 ef ae fa 9b ^..y.EJ..5H..... 00:20:33.982 00000220 1f 87 89 75 3c 52 c4 7d 1c a4 27 24 ef e8 82 04 ...u`...id... 00:20:33.982 00000290 e2 37 74 c2 fd 5a 6f 87 4d 3c 48 56 f8 c1 1c 0d .7t..Zo.M....".j 00:20:33.982 000002b0 fb 49 74 58 3c 96 ca b8 90 cc b8 8f 5d 16 e1 9e .ItX<.......]... 00:20:33.982 000002c0 76 51 3d f1 74 df 0f b7 d8 bb 76 65 3c c8 b1 2c vQ=.t.....ve<.., 00:20:33.982 000002d0 bb fb 94 51 53 fe 53 10 cc 70 f3 77 3d 81 bb f2 ...QS.S..p.w=... 00:20:33.982 000002e0 13 d6 08 76 8f f2 f6 6b 33 ee ac 25 66 fb c9 be ...v...k3..%f... 00:20:33.982 000002f0 cc 65 1d 94 f5 f8 0f 19 68 b7 40 94 33 d7 bf 3e .e......h.@.3..> 00:20:33.982 dh secret: 00:20:33.982 00000000 7f 5e 07 9a 62 47 fe 72 3d 78 fe 5f 9c d9 a2 8f .^..bG.r=x._.... 00:20:33.982 00000010 28 0f ef 5f 28 81 d6 1a b2 18 f2 b2 d8 7a bb f5 (.._(........z.. 00:20:33.982 00000020 aa ac d9 b8 f1 e6 3c 6e 6c 80 ae 83 b9 3f fe 6c ........... 00:20:33.982 000001b0 56 23 a0 2e 82 7d 7f ab 27 93 d0 c6 e7 6e 32 32 V#...}..'....n22 00:20:33.982 000001c0 6b 0c 70 ac ce 3f a5 70 a5 f2 c7 43 dd d7 c1 7d k.p..?.p...C...} 00:20:33.982 000001d0 bd 98 b0 aa e7 17 2f 7d 93 4f a3 3c 3d 04 67 1a ....../}.O.<=.g. 00:20:33.982 000001e0 0a 97 85 ed 3d 44 8b 6a 77 47 53 86 7b 69 6f e8 ....=D.jwGS.{io. 00:20:33.982 000001f0 0e 0d e4 40 51 7c 8c a1 e0 ae 36 ea 1f 6e 1b 7b ...@Q|....6..n.{ 00:20:33.982 00000200 d0 4c 7d fd a8 13 27 9c 80 e0 9b 27 b9 e3 e7 6d .L}...'....'...m 00:20:33.982 00000210 d2 19 95 25 bb 14 be ab 2b 4b a5 12 33 b2 c6 f6 ...%....+K..3... 00:20:33.982 00000220 93 e7 16 92 ad ad 4f f6 7e 74 ce 2c 62 bb 2a 38 ......O.~t.,b.*8 00:20:33.982 00000230 a9 8a 11 ea 7e ab 09 98 fa c5 ed 48 c6 57 83 50 ....~......H.W.P 00:20:33.982 00000240 7d 67 83 86 43 01 d8 79 50 2c fd ac 46 26 bb f1 }g..C..yP,..F&.. 00:20:33.982 00000250 ea 57 43 12 24 c1 0e ad 37 3b e9 59 a2 7f fd 56 .WC.$...7;.Y...V 00:20:33.982 00000260 fb 39 ac 5f 14 06 fc 90 b1 58 42 07 00 0e ac 0f .9._.....XB..... 00:20:33.982 00000270 40 57 9a a0 91 7d 90 87 a3 c1 98 6a 46 16 71 3d @W...}.....jF.q= 00:20:33.982 00000280 31 09 31 2b 09 ab ec 83 a5 f2 42 0b fa 63 d7 71 1.1+......B..c.q 00:20:33.982 00000290 3f 23 b9 8e 4b 39 ec c8 81 25 9c 51 cf 77 52 c3 ?#..K9...%.Q.wR. 
00:20:33.982 000002a0 c8 22 cd e1 96 e1 70 eb c5 c5 d0 2c 56 82 ad d1 ."....p....,V... 00:20:33.982 000002b0 07 e3 d2 e2 db a6 53 51 68 f2 da d5 5c 6e ea fd ......SQh...\n.. 00:20:33.982 000002c0 e7 08 9d 3a 65 3f 90 4c e4 7e 70 fa 65 76 85 c8 ...:e?.L.~p.ev.. 00:20:33.982 000002d0 b2 f5 cf 3e a4 0a cd 99 0c f1 c6 88 9b 12 20 81 ...>.......... . 00:20:33.982 000002e0 f6 58 b5 31 29 e5 f3 44 d4 be cb 1d f6 d6 85 12 .X.1)..D........ 00:20:33.982 000002f0 ef 79 68 42 e4 d2 a3 76 c8 6a 5c 6f d5 e0 a0 04 .yhB...v.j\o.... 00:20:33.982 [2024-09-27 15:25:31.002019] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=3, dhgroup=4, seq=3428451831, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.982 [2024-09-27 15:25:31.038105] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.982 [2024-09-27 15:25:31.038148] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.982 [2024-09-27 15:25:31.038165] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.982 [2024-09-27 15:25:31.038189] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.982 [2024-09-27 15:25:31.038200] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.982 [2024-09-27 15:25:31.143877] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.982 [2024-09-27 15:25:31.143895] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.982 [2024-09-27 15:25:31.143902] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.982 [2024-09-27 15:25:31.143911] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.982 [2024-09-27 15:25:31.143965] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.982 ctrlr pubkey: 00:20:33.982 00000000 c0 ce 57 76 d0 2b 16 3d b3 f2 82 db 9f a0 44 28 ..Wv.+.=......D( 00:20:33.982 00000010 3f a6 31 01 05 88 40 90 f5 e7 97 25 18 93 9f 67 ?.1...@....%...g 00:20:33.982 00000020 a1 1f 35 0d 48 cb 6a 30 04 6e a4 98 8f 26 04 9d ..5.H.j0.n...&.. 00:20:33.982 00000030 31 a9 0f 3c 88 32 c5 85 f3 bf 22 93 cf 5d d4 d4 1..<.2...."..].. 00:20:33.982 00000040 9e d4 6a e3 01 6a 65 fd 9b 33 ed f9 c9 93 ea ef ..j..je..3...... 00:20:33.982 00000050 bd a8 05 b4 22 3f 5a 6a 55 d2 60 59 7c 8a 45 7c ...."?ZjU.`Y|.E| 00:20:33.982 00000060 4d 99 9f d0 9f 16 31 0e c4 11 e2 e2 0e 18 61 dd M.....1.......a. 00:20:33.982 00000070 5f a0 2b 1d a0 ef 4c d8 91 7d eb 29 94 31 c0 56 _.+...L..}.).1.V 00:20:33.982 00000080 5f b5 9e 57 f0 90 a1 8f cb d3 8f d0 51 22 1d 61 _..W........Q".a 00:20:33.982 00000090 e4 49 67 1f 58 79 b5 fc 03 49 c5 b2 b2 92 b1 6f .Ig.Xy...I.....o 00:20:33.982 000000a0 25 6f 39 f6 15 a2 c6 bc c7 90 6b 54 46 d9 3d 1b %o9.......kTF.=. 
00:20:33.982 000000b0 ab 06 41 26 32 43 d6 88 ec 39 ef b7 dd 37 b5 ea ..A&2C...9...7.. 00:20:33.982 000000c0 34 1b 4f 30 17 67 30 f7 ff 8a d3 8c 04 7a 07 9e 4.O0.g0......z.. 00:20:33.982 000000d0 fe 9a 0b b0 a0 db e0 36 43 e5 be 2f 6a 49 fa 25 .......6C../jI.% 00:20:33.982 000000e0 99 65 29 a4 98 ea e7 6d fd 2b 2f 4d 5f 87 05 a1 .e)....m.+/M_... 00:20:33.982 000000f0 89 59 55 db 80 29 f2 8d 4d 33 bd ac 5c 4f 5f b2 .YU..)..M3..\O_. 00:20:33.982 00000100 53 d9 8c c4 ea c7 5a 8c 79 f1 18 0a cd b5 05 d7 S.....Z.y....... 00:20:33.982 00000110 a1 47 a8 95 54 e5 68 cb bc 2c 7a 19 d6 56 3d 44 .G..T.h..,z..V=D 00:20:33.982 00000120 20 81 5b 82 a9 1b f6 db 5a 3c 37 85 ed ef 5f 14 .[.....Z<7..._. 00:20:33.982 00000130 a2 3c f5 5e 97 b4 ec 86 81 01 22 0a 0e aa d4 19 .<.^......"..... 00:20:33.982 00000140 2c 92 79 da 79 77 23 d7 23 84 83 8f 08 eb fb d9 ,.y.yw#.#....... 00:20:33.982 00000150 6f 48 e1 98 7e 01 1b 6e 7a 0f 8e 54 ba 86 4f fe oH..~..nz..T..O. 00:20:33.982 00000160 11 f6 7c 5f f0 e4 2b f0 b6 d1 f3 96 12 dc 82 54 ..|_..+........T 00:20:33.982 00000170 38 68 9f f5 52 84 93 41 1b f2 30 85 4f fa 7a 83 8h..R..A..0.O.z. 00:20:33.982 00000180 e1 22 ed 38 8e 9c eb f8 db 09 f4 2b 64 e9 79 6e .".8.......+d.yn 00:20:33.982 00000190 0b a8 53 d2 39 3a 3a cc 24 20 4c 49 6e a6 40 00 ..S.9::.$ LIn.@. 00:20:33.982 000001a0 f7 86 3c 69 02 66 8d cb e7 cd 61 f8 64 99 27 3f ..slI4.y2t4c.U. 00:20:33.982 000001d0 1e 98 3b 5f 49 2f af 66 ec 95 2c 0e 88 a6 88 7d ..;_I/.f..,....} 00:20:33.982 000001e0 9b 7c 0c 15 ec d1 c2 6e e9 8a e7 26 06 36 1a 47 .|.....n...&.6.G 00:20:33.982 000001f0 46 89 60 c8 68 98 64 f2 20 55 a6 04 c5 e6 9c 5f F.`.h.d. U....._ 00:20:33.982 00000200 35 ff d4 5e 81 50 7c 58 76 5f 0e 57 ed d5 f1 33 5..^.P|Xv_.W...3 00:20:33.982 00000210 bd 31 4a c8 7f 68 a1 1b 34 f7 75 89 76 05 07 5a .1J..h..4.u.v..Z 00:20:33.982 00000220 5f dd 86 f8 86 c9 ef 8a 57 d5 c0 96 09 a2 a8 8d _.......W....... 00:20:33.982 00000230 24 d8 b9 75 54 47 64 1a 37 5c e3 3c 23 c2 b9 71 $..uTGd.7\.<#..q 00:20:33.982 00000240 88 2c 04 9e 04 e9 b1 11 7d 84 76 18 df 0f 8b ce .,......}.v..... 00:20:33.982 00000250 ed 84 13 af 2f a9 5f b9 6f b7 7b 91 85 a6 c2 1b ..../._.o.{..... 00:20:33.982 00000260 7b 3d fe f9 0f 00 34 5b 23 75 21 cc 84 fd 70 07 {=....4[#u!...p. 00:20:33.982 00000270 86 ff 66 ed aa de fb 5d a6 b0 29 eb 58 67 87 ca ..f....]..).Xg.. 00:20:33.982 00000280 a5 1f d5 26 b4 63 92 6f f4 71 67 24 3b 7f fc 8c ...&.c.o.qg$;... 00:20:33.982 00000290 de 04 55 86 af 39 7b d6 3a f5 82 ad f8 6d 9b 5d ..U..9{.:....m.] 00:20:33.982 000002a0 41 5e 6f 33 80 e5 b1 95 7a b0 2e c8 65 ad b4 1f A^o3....z...e... 00:20:33.982 000002b0 53 9e 7f 60 26 4a e2 21 7f 75 bd fd 1b 02 7d db S..`&J.!.u....}. 00:20:33.982 000002c0 ad b9 dc 10 aa ff a7 a0 9e 1a ab 10 67 3a 5d 6b ............g:]k 00:20:33.982 000002d0 e9 b8 9b 55 06 a8 d9 ee 1c ee a7 a1 bc 2e e9 74 ...U...........t 00:20:33.983 000002e0 3b 01 bc 9d da b8 73 d2 27 62 e9 a4 c1 33 b2 8a ;.....s.'b...3.. 00:20:33.983 000002f0 0e 82 95 f1 1e a9 8c 0d 1c 44 b8 c6 a8 c4 bd 09 .........D...... 00:20:33.983 host pubkey: 00:20:33.983 00000000 97 d4 c4 8b 97 65 39 2f 67 5e 09 9b 0e 5b 22 62 .....e9/g^...["b 00:20:33.983 00000010 3c 08 7f 10 17 db 78 55 f0 99 5b 08 c1 ae 97 52 <.....xU..[....R 00:20:33.983 00000020 aa 11 57 bf 24 ab 9d 40 06 75 f3 02 a7 34 76 21 ..W.$..@.u...4v! 00:20:33.983 00000030 15 58 85 f1 00 f9 20 40 57 45 24 99 e6 c9 9b c6 .X.... @WE$..... 
00:20:33.983 00000040 e7 a1 fd d5 50 64 02 10 df 99 35 18 e6 49 dd 62 ....Pd....5..I.b 00:20:33.983 00000050 4a 64 dc 69 ae cc 42 08 27 52 f4 a7 31 0b c1 7e Jd.i..B.'R..1..~ 00:20:33.983 00000060 65 7f 2a cd bf ed d5 16 3b 56 0c aa d9 7b a2 aa e.*.....;V...{.. 00:20:33.983 00000070 27 80 17 4d da eb d6 6b 57 82 cd 26 76 be c4 dc '..M...kW..&v... 00:20:33.983 00000080 a9 28 d3 6b 5a 48 15 4b 4e 14 e7 71 4f 62 aa 1a .(.kZH.KN..qOb.. 00:20:33.983 00000090 a3 e3 1a d3 15 7d 18 e7 32 82 c5 3e a2 f2 c9 9a .....}..2..>.... 00:20:33.983 000000a0 95 ea 33 be 3b 4f 13 38 cd 33 58 7a a0 6e 30 57 ..3.;O.8.3Xz.n0W 00:20:33.983 000000b0 de 63 cc d5 70 df 52 c8 42 2e e9 05 f4 91 ba 18 .c..p.R.B....... 00:20:33.983 000000c0 53 e6 16 35 db ab 06 36 94 d2 b1 1f bb 71 75 af S..5...6.....qu. 00:20:33.983 000000d0 f1 5b be b9 bd 4b c5 09 5d 7b 75 3e 82 4d d6 35 .[...K..]{u>.M.5 00:20:33.983 000000e0 99 ff 93 b1 89 89 85 7a 3f 8e 50 d6 e2 d8 18 f0 .......z?.P..... 00:20:33.983 000000f0 68 65 53 8b b8 7b 5a 46 82 1f f0 68 58 57 13 a3 heS..{ZF...hXW.. 00:20:33.983 00000100 1e 72 9e 55 9f 03 68 1b 8b 61 66 03 66 48 ec c6 .r.U..h..af.fH.. 00:20:33.983 00000110 73 86 61 e4 fc 9a 2d 3c 12 8d 66 92 95 0c 79 dc s.a...-<..f...y. 00:20:33.983 00000120 10 10 ef e8 69 52 09 6f f8 e9 e1 67 f7 c5 34 75 ....iR.o...g..4u 00:20:33.983 00000130 77 cb 0d 03 1d da ae 86 3a 13 8d 01 c6 64 4f 20 w.......:....dO 00:20:33.983 00000140 16 f8 f3 84 8b 77 3b 27 f2 db b0 15 03 c9 a8 62 .....w;'.......b 00:20:33.983 00000150 33 22 c9 97 d0 cf 43 3b af 9b 31 9e 38 5c 8d 51 3"....C;..1.8\.Q 00:20:33.983 00000160 d5 c3 31 87 7b 99 17 62 44 cb 12 6f 0e a1 c9 89 ..1.{..bD..o.... 00:20:33.983 00000170 1b 73 e7 e8 9b 44 1e 3b 14 ec d4 51 ba 80 a7 e8 .s...D.;...Q.... 00:20:33.983 00000180 56 db 5c 31 c9 11 a3 b0 8b 37 9d 62 4f 06 61 7b V.\1.....7.bO.a{ 00:20:33.983 00000190 32 ff 8e e3 59 da 3c 24 0a 21 1a 8c c2 7b 44 4a 2...Y.<$.!...{DJ 00:20:33.983 000001a0 f8 59 5a a2 c1 38 0a bd 2e 98 b3 dc 71 62 9a c6 .YZ..8......qb.. 00:20:33.983 000001b0 d6 c5 a1 31 3d 04 3c 8f eb 63 e6 03 3c 39 90 71 ...1=.<..c..<9.q 00:20:33.983 000001c0 bc fe 12 5d 3c c6 50 ff 25 a2 9d 8c ed 96 22 cb ...]<.P.%.....". 00:20:33.983 000001d0 8c 2e d2 0d 50 2a 46 56 86 c5 24 79 08 bc b4 a4 ....P*FV..$y.... 00:20:33.983 000001e0 fa ac ec 2c 41 72 19 85 da 55 39 d5 d3 cd 46 5d ...,Ar...U9...F] 00:20:33.983 000001f0 8d aa 4a ed e8 94 ad fb 28 ed 9b a6 ae 75 78 ea ..J.....(....ux. 00:20:33.983 00000200 8a 09 90 77 3c 60 63 d9 05 76 ce 55 d7 06 39 8b ...w<`c..v.U..9. 00:20:33.983 00000210 ff 4e 38 f2 00 30 35 09 c6 c9 01 53 1e 20 4a 89 .N8..05....S. J. 00:20:33.983 00000220 b7 a3 ff 67 ce 0c 77 b8 a7 03 31 46 f6 5e 88 a2 ...g..w...1F.^.. 00:20:33.983 00000230 2f 5d 55 a8 d5 53 3f 32 b7 d7 a2 1b 58 a4 e2 36 /]U..S?2....X..6 00:20:33.983 00000240 e8 15 17 29 c6 15 55 b4 f4 66 2e 78 68 f9 9f f3 ...)..U..f.xh... 00:20:33.983 00000250 88 95 8b be b2 28 7c 67 d1 30 41 34 14 cb 3f 1e .....(|g.0A4..?. 00:20:33.983 00000260 c8 36 16 2c f4 92 80 81 e1 3f c4 43 92 71 f2 30 .6.,.....?.C.q.0 00:20:33.983 00000270 39 b7 12 64 3e 7e 98 12 0b 3e a3 d3 12 10 62 54 9..d>~...>....bT 00:20:33.983 00000280 db 94 f0 26 0c 7d 15 21 23 a1 2f 1e 29 88 45 92 ...&.}.!#./.).E. 00:20:33.983 00000290 2f d5 21 d5 9e fa 8e 3d 4b 62 3c 4a 50 2e e9 d3 /.!....=Kb..N 00:20:33.983 00000220 87 81 87 1a 79 4b ec aa b2 a3 96 df a3 af 84 9d ....yK.......... 00:20:33.983 00000230 60 ff d6 b1 08 f1 2d 1f a2 1c 3f f2 ce e3 e6 09 `.....-...?..... 
00:20:33.983 00000240 a0 31 31 a1 41 5e b5 36 bf ca aa 63 7c b1 25 5a .11.A^.6...c|.%Z 00:20:33.983 00000250 39 7f e7 8b 3f 67 9d 3a 85 9a 16 16 6b 10 b6 70 9...?g.:....k..p 00:20:33.983 00000260 a6 4b 25 b4 ec 82 bd f2 47 9e 0b 29 05 cd 0a 77 .K%.....G..)...w 00:20:33.983 00000270 bd 79 a5 b0 0a d3 36 ed bf 46 4b 1e be e2 8b 01 .y....6..FK..... 00:20:33.983 00000280 82 60 1e 9a 89 d7 c2 f6 6b 7b ed 4b 7a 68 46 88 .`......k{.KzhF. 00:20:33.983 00000290 34 e7 53 dc ae 4b b2 1c 56 74 fe 4b 30 44 0f 8b 4.S..K..Vt.K0D.. 00:20:33.983 000002a0 8e 57 9c 01 54 f0 f6 3e 81 de 15 e1 d1 46 e6 27 .W..T..>.....F.' 00:20:33.983 000002b0 c8 5f 10 ba ff 17 a9 56 3d 5a f3 f7 1f c6 40 86 ._.....V=Z....@. 00:20:33.983 000002c0 58 f3 85 ac 53 14 fb ce 9c 75 a4 36 67 e4 22 d5 X...S....u.6g.". 00:20:33.983 000002d0 d7 31 08 ce 90 5a 4d 68 68 05 79 a6 9c 1b 05 b0 .1...ZMhh.y..... 00:20:33.983 000002e0 bb 84 0d ac 09 82 d9 0b 41 38 5e ee 65 c8 59 d9 ........A8^.e.Y. 00:20:33.983 000002f0 b1 35 87 82 2d d3 1d 9e 6c ed d0 6b 76 15 c2 9d .5..-...l..kv... 00:20:33.983 [2024-09-27 15:25:31.192273] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=3, dhgroup=4, seq=3428451832, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.983 [2024-09-27 15:25:31.192377] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.983 [2024-09-27 15:25:31.250127] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.983 [2024-09-27 15:25:31.250166] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.983 [2024-09-27 15:25:31.250176] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.983 [2024-09-27 15:25:31.250202] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.983 [2024-09-27 15:25:31.425057] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.983 [2024-09-27 15:25:31.425077] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.983 [2024-09-27 15:25:31.425084] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 4 (ffdhe6144) 00:20:33.983 [2024-09-27 15:25:31.425128] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.983 [2024-09-27 15:25:31.425152] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.983 ctrlr pubkey: 00:20:33.983 00000000 ba 0b b5 8c 8c 22 39 67 45 72 c1 72 5b ac 37 ab ....."9gEr.r[.7. 00:20:33.983 00000010 3a 06 a2 8e 7f 84 12 0b 45 de 08 cf 0e 80 c9 72 :.......E......r 00:20:33.983 00000020 6c 4a 89 7d 2b be c3 7b 7c ac ed 90 42 7f 9d a8 lJ.}+..{|...B... 
00:20:33.983 00000030 b5 50 67 d5 90 6c f6 80 9f 83 02 e7 0f e4 cc 39 .Pg..l.........9 00:20:33.983 00000040 5b 6e 6d ad dc 14 1e 1b a1 3c a0 55 01 44 80 6d [nm......<.U.D.m 00:20:33.983 00000050 f0 03 8e 7f e5 9b 47 73 46 80 fa ff c4 57 f8 42 ......GsF....W.B 00:20:33.983 00000060 69 21 df bf bb 8f 2f f1 27 79 ba b2 8a 4e 3a cd i!..../.'y...N:. 00:20:33.983 00000070 d7 09 19 d5 fe 2c 3d 9b 2a c8 9d 3d 56 67 1f c2 .....,=.*..=Vg.. 00:20:33.984 00000080 fa 28 c7 2f bf f2 70 80 ce 4f bd 85 59 4a 78 90 .(./..p..O..YJx. 00:20:33.984 00000090 0d 5f a2 1f 51 24 2a 99 ac 9b 1f 75 9c e8 d9 c8 ._..Q$*....u.... 00:20:33.984 000000a0 9b a2 eb 9e 44 15 a1 88 26 ed 92 62 3a 98 f5 f4 ....D...&..b:... 00:20:33.984 000000b0 3c 92 5b cf a3 1f 27 0a b7 52 c2 21 df c1 a8 17 <.[...'..R.!.... 00:20:33.984 000000c0 ed 9f c0 9c eb 2d 13 e2 26 d6 56 02 ef ec 6c 84 .....-..&.V...l. 00:20:33.984 000000d0 b9 da 84 4b 96 72 f0 5b 76 4d cb 81 3b 0c 08 2d ...K.r.[vM..;..- 00:20:33.984 000000e0 bb 01 fc bf 8b fa 82 34 9b 59 a6 27 55 b1 08 8c .......4.Y.'U... 00:20:33.984 000000f0 61 6c 9a 17 5a 58 94 9b 2b 17 2f c8 d4 84 39 0e al..ZX..+./...9. 00:20:33.984 00000100 e5 ef ac b4 31 e3 f1 20 4a d9 01 7a 41 1b f7 3c ....1.. J..zA..< 00:20:33.984 00000110 6b b6 24 97 42 b3 c4 69 45 57 43 7b 18 98 c0 27 k.$.B..iEWC{...' 00:20:33.984 00000120 96 e0 5d e9 eb 8e e6 85 d7 9a 0f 40 c0 1b ab 2b ..]........@...+ 00:20:33.984 00000130 3f dd 1c 8b dd 5f 9f 75 f7 c3 31 db b4 fd 57 11 ?...._.u..1...W. 00:20:33.984 00000140 b2 f2 78 1c 20 f5 93 1f 99 14 20 67 aa 34 a0 fc ..x. ..... g.4.. 00:20:33.984 00000150 29 93 49 f4 ed 8e d4 fa 3c 28 8a d0 24 7f e9 8f ).I.....<(..$... 00:20:33.984 00000160 a5 b2 71 be 6a b4 0a c0 39 ab 8e 10 44 25 6f 18 ..q.j...9...D%o. 00:20:33.984 00000170 1c 66 6a 06 d7 e9 28 ab 0a bb 4c 8a 9c fe e9 78 .fj...(...L....x 00:20:33.984 00000180 53 56 84 f8 2f 79 41 71 8d 8f 54 c5 5e 35 27 bf SV../yAq..T.^5'. 00:20:33.984 00000190 32 9f 4d 72 9b e0 d4 2a ec c6 d5 e2 17 60 93 f0 2.Mr...*.....`.. 00:20:33.984 000001a0 c2 8a 5c df 5c 9c 67 e9 a3 98 09 00 a2 8e e9 b7 ..\.\.g......... 00:20:33.984 000001b0 81 24 15 7c 0b bb a7 f8 4c ca de a7 2c 1d 9e e7 .$.|....L...,... 00:20:33.984 000001c0 96 32 3e 81 cb 9e cd e4 f1 41 18 08 30 35 09 94 .2>......A..05.. 00:20:33.984 000001d0 57 1d 18 38 4b a2 e2 55 2c e3 3c 3f 7c 43 ad 93 W..8K..U,.j....8 00:20:33.984 00000260 2a e7 62 e3 c8 fc dc af af 45 ef a9 04 b8 3f d7 *.b......E....?. 00:20:33.984 00000270 6e 30 85 dd 40 f3 83 52 b0 5d 74 ce 9b c4 4e 3e n0..@..R.]t...N> 00:20:33.984 00000280 5d 09 1a 3c ed ad 76 dd 6a 68 56 8d ff 28 3b 31 ]..<..v.jhV..(;1 00:20:33.984 00000290 03 3e 50 0f 54 b7 76 44 5b 32 e6 10 b4 e3 fc ea .>P.T.vD[2...... 00:20:33.984 000002a0 c3 fb 30 f1 eb d2 5c 26 01 db 38 c5 0c 85 48 31 ..0...\&..8...H1 00:20:33.984 000002b0 bc 26 a9 e7 cd 17 53 cf 9c 08 f6 9d cb 43 e4 c8 .&....S......C.. 00:20:33.984 000002c0 d5 af 98 c3 3b 51 c4 6d d1 b2 23 b5 e6 c7 bf a1 ....;Q.m..#..... 00:20:33.984 000002d0 5e e5 ad f6 00 a3 cb a0 f2 03 2a 21 a8 96 92 98 ^.........*!.... 00:20:33.984 000002e0 8e ab 19 ea 5b e8 88 37 60 a4 52 80 f1 6d ea 98 ....[..7`.R..m.. 00:20:33.984 000002f0 c7 ee a2 9b 9a a0 c4 8e 51 47 c8 40 00 17 d0 f1 ........QG.@.... 00:20:33.984 host pubkey: 00:20:33.984 00000000 14 f4 c2 fb 0a 63 d0 72 e8 65 7a 25 b4 92 12 27 .....c.r.ez%...' 
00:20:33.984 00000010 2a d2 9a 06 a0 cb 50 a0 26 bc 28 2e 4f 42 24 7a *.....P.&.(.OB$z 00:20:33.984 00000020 8d 87 e8 45 9a 75 01 e5 a5 16 38 db da 9a 24 2f ...E.u....8...$/ 00:20:33.984 00000030 c4 16 7e d3 59 64 0d cd 17 ee 1b eb 5e a4 bb e9 ..~.Yd......^... 00:20:33.984 00000040 3d 24 f4 d7 25 1c be 47 56 de 50 e7 27 7a 36 f3 =$..%..GV.P.'z6. 00:20:33.984 00000050 9f 2d f8 55 6c 22 63 84 2e a1 0e c4 83 21 fb b1 .-.Ul"c......!.. 00:20:33.984 00000060 34 f8 2b f8 74 b6 da cf a3 6a 17 14 c6 fe 6e a0 4.+.t....j....n. 00:20:33.984 00000070 ef ed 17 f3 85 7a 9e 1b f6 97 79 9c 1d 68 8b 31 .....z....y..h.1 00:20:33.984 00000080 33 7e d0 b3 54 c0 0f 6d 35 5f cd fa 1e 53 a1 ad 3~..T..m5_...S.. 00:20:33.984 00000090 bf 49 bf 05 61 4b 1b 1c a2 5b 2e e5 d3 c0 7f c2 .I..aK...[...... 00:20:33.984 000000a0 02 2f 03 f8 b9 40 1c 50 6c 09 96 1f 46 f9 cf f7 ./...@.Pl...F... 00:20:33.984 000000b0 53 ce ba b9 76 9b 9d 2d 2b 7e 35 af d2 32 91 ea S...v..-+~5..2.. 00:20:33.984 000000c0 a5 b3 3f ec 58 89 92 21 92 4d 6e 4e 71 c4 7b 2f ..?.X..!.MnNq.{/ 00:20:33.984 000000d0 a1 5c db a2 81 03 ee 2d 5e bb 51 02 68 ec 4f e7 .\.....-^.Q.h.O. 00:20:33.984 000000e0 e3 b9 ec 6c 12 45 7a ef df 13 af 0e 36 98 86 b4 ...l.Ez.....6... 00:20:33.984 000000f0 fe 9b b6 49 bc 81 a3 aa 64 d1 75 97 91 0d c3 61 ...I....d.u....a 00:20:33.984 00000100 45 50 6b 78 1c 52 fa 93 71 a2 ed 4a cb 89 0f 37 EPkx.R..q..J...7 00:20:33.984 00000110 3d b2 5c 1f bc 77 31 d0 be 12 73 cb 44 b7 6c 5d =.\..w1...s.D.l] 00:20:33.984 00000120 c8 04 c2 da 3f f3 8b f5 c8 fc d2 5a 55 4a 60 86 ....?......ZUJ`. 00:20:33.984 00000130 55 04 bc 79 07 47 bd 50 70 22 83 ce aa 6a 34 b8 U..y.G.Pp"...j4. 00:20:33.984 00000140 d5 cd f2 95 6a f7 eb 77 1e 94 53 3c 9c 87 b3 70 ....j..w..S<...p 00:20:33.984 00000150 4c 43 82 f5 0a 0e 4b cd 73 09 b3 b5 53 a1 a0 7a LC....K.s...S..z 00:20:33.984 00000160 a8 92 8d 4a a9 9c 5d 34 74 31 c9 de 50 92 d1 04 ...J..]4t1..P... 00:20:33.984 00000170 f7 03 14 7a 4e a4 34 e2 ff d4 d2 08 99 56 86 0a ...zN.4......V.. 00:20:33.984 00000180 c2 92 bc 33 6a c1 65 c4 98 ef c3 59 00 59 59 c5 ...3j.e....Y.YY. 00:20:33.984 00000190 3d cf cb 55 74 14 49 23 80 47 65 6e e1 0f a7 47 =..Ut.I#.Gen...G 00:20:33.984 000001a0 34 48 f6 76 51 29 75 01 d8 89 c7 63 ab 4c 7f ae 4H.vQ)u....c.L.. 00:20:33.984 000001b0 dd 3b d8 e6 25 71 b2 44 ef 0f f1 dc b1 e1 c9 e9 .;..%q.D........ 00:20:33.984 000001c0 21 fd d4 1e d7 06 54 00 c2 82 d1 31 76 2d 44 80 !.....T....1v-D. 00:20:33.984 000001d0 2c 6d 24 3b ec cb de 6d 97 7a 7f d8 82 c5 8f 1e ,m$;...m.z...... 00:20:33.984 000001e0 ec 9b b1 fb e0 a4 6f a8 42 9b c7 e1 3f fd 60 69 ......o.B...?.`i 00:20:33.984 000001f0 7b 44 f1 6f 99 0e 4c 9b 70 99 e1 f0 f1 ee bb 76 {D.o..L.p......v 00:20:33.984 00000200 a2 b8 f0 25 b6 a2 9e 5b da 32 48 91 78 4c ac 6a ...%...[.2H.xL.j 00:20:33.984 00000210 c6 94 53 ab 54 47 52 3a 72 ec 58 85 0a 97 f3 ad ..S.TGR:r.X..... 00:20:33.984 00000220 88 73 6b 8d 53 8b 6c ce 25 a6 e4 2d 3c 02 ad 96 .sk.S.l.%..-<... 00:20:33.984 00000230 13 c4 94 df 19 76 fb d3 f8 75 ce 9b 08 b1 02 3d .....v...u.....= 00:20:33.984 00000240 14 a1 7d 7e d7 2a 5d 19 58 57 08 cc a3 48 40 f1 ..}~.*].XW...H@. 00:20:33.984 00000250 7d 81 50 66 ec db 88 dd be e9 bb c7 76 22 84 b6 }.Pf........v".. 00:20:33.984 00000260 34 a2 79 32 a8 4f 6d 59 01 08 fd fb 53 cf 9c 9f 4.y2.OmY....S... 00:20:33.984 00000270 34 d5 21 80 dc 39 e7 ef 14 f7 72 34 8e 6c 3e 3b 4.!..9....r4.l>; 00:20:33.984 00000280 e8 52 a6 ab c6 b1 84 75 58 6e fe 73 f0 2b 22 ad .R.....uXn.s.+". 
00:20:33.984 00000290 c4 75 7a 4d d6 71 7a 6c 20 5a 8b 6b eb c7 80 54 .uzM.qzl Z.k...T 00:20:33.984 000002a0 13 f1 d0 82 23 bb 18 3d 24 f9 d8 67 23 e5 ef bc ....#..=$..g#... 00:20:33.984 000002b0 fc 8d 24 03 48 51 50 a3 ea 35 b5 a9 0b a1 e9 8c ..$.HQP..5...... 00:20:33.984 000002c0 02 1a 31 28 0d b9 57 6e 2a 26 c8 63 4d 21 2d 30 ..1(..Wn*&.cM!-0 00:20:33.984 000002d0 ed 3a 7a 4c f7 6e 11 91 6a 40 96 8a 24 80 87 69 .:zL.n..j@..$..i 00:20:33.984 000002e0 e5 f2 ec 19 f9 22 b2 ef 0a 83 aa 96 a0 72 07 14 .....".......r.. 00:20:33.984 000002f0 96 64 08 5e 57 b5 0d b1 0e 2b ba 15 f6 f1 15 20 .d.^W....+..... 00:20:33.984 dh secret: 00:20:33.984 00000000 ee 8b 2c 7c 51 de 40 39 ce 93 b8 4c a1 a0 7c ba ..,|Q.@9...L..|. 00:20:33.984 00000010 20 22 a2 20 33 2f 71 93 2f af fc 07 b9 68 df 12 ". 3/q./....h.. 00:20:33.984 00000020 73 81 f1 f3 d9 a9 2d 80 00 62 91 27 75 62 89 0b s.....-..b.'ub.. 00:20:33.984 00000030 e4 2c 54 5c 08 72 74 c9 7d 8f 5b 34 24 41 a5 63 .,T\.rt.}.[4$A.c 00:20:33.984 00000040 1c 1a 25 ed 86 79 e2 8a c3 6d c4 22 49 d6 9b db ..%..y...m."I... 00:20:33.984 00000050 7c 7d b8 7a 0e 7c 1c 1f 26 23 20 00 7f 81 8f 3c |}.z.|..&# ....< 00:20:33.984 00000060 b7 79 53 40 d5 1a 69 42 d6 64 22 14 d1 b5 3f 33 .yS@..iB.d"...?3 00:20:33.984 00000070 f7 b2 0f 1b 50 dd 06 53 de 45 f8 b7 48 a0 14 49 ....P..S.E..H..I 00:20:33.984 00000080 d4 68 3d 19 b4 8d a5 3d d2 76 13 ac f5 9f 96 e5 .h=....=.v...... 00:20:33.984 00000090 a7 33 7f 23 bd 9f 3f 52 41 45 07 38 3c 81 52 24 .3.#..?RAE.8<.R$ 00:20:33.984 000000a0 8b 00 01 f9 24 af 1a 70 19 72 33 95 0a 89 01 f2 ....$..p.r3..... 00:20:33.984 000000b0 f5 c1 57 c0 c0 b1 bd 86 4d 95 15 9a c4 5a a9 b2 ..W.....M....Z.. 00:20:33.984 000000c0 7a 45 5d bd c3 dc ca 9f 3a 75 4f d8 a3 47 59 f1 zE].....:uO..GY. 00:20:33.984 000000d0 ef 9f e3 63 3c 0f 49 03 da 56 7e 9c 07 81 29 03 ...c<.I..V~...). 00:20:33.984 000000e0 17 b4 e0 1e 70 88 73 ee c0 a6 92 5e ec 45 5d b6 ....p.s....^.E]. 00:20:33.984 000000f0 80 37 e4 96 f7 9b a1 a0 ff 91 1a c7 a3 02 fa 87 .7.............. 00:20:33.984 00000100 51 40 c0 02 45 f5 32 1a b7 ae 90 fe 15 4d df 47 Q@..E.2......M.G 00:20:33.984 00000110 a1 c9 8e 4e 79 c1 05 c5 16 cd 0f dc bb 36 b2 60 ...Ny........6.` 00:20:33.984 00000120 f6 38 a8 03 0d 3f 88 73 ef 35 a5 13 d0 bb 5f f8 .8...?.s.5...._. 00:20:33.984 00000130 bf 53 92 e6 cf 4f a7 1b e6 21 77 21 36 8e 7a 62 .S...O...!w!6.zb 00:20:33.984 00000140 5a 42 17 ea 34 3f 39 1a 87 45 0f 63 17 5b f1 54 ZB..4?9..E.c.[.T 00:20:33.984 00000150 21 70 40 fc 21 dc 8e 24 73 14 49 07 7e fb d6 2b !p@.!..$s.I.~..+ 00:20:33.984 00000160 41 f1 95 00 84 41 35 61 af de 8d 31 0c b0 8f 1b A....A5a...1.... 00:20:33.984 00000170 0c e8 8a 48 f8 34 3d f3 7e 46 50 01 bd 76 9b c6 ...H.4=.~FP..v.. 00:20:33.984 00000180 a0 e0 cb 71 71 91 71 b2 c9 ff 7d ab 47 f9 de 34 ...qq.q...}.G..4 00:20:33.984 00000190 53 58 88 31 b2 fc bf 29 bf dd a4 95 1e 67 39 38 SX.1...).....g98 00:20:33.984 000001a0 a1 18 f3 f5 82 49 c3 1b 25 fd 4f 40 64 82 65 e9 .....I..%.O@d.e. 00:20:33.984 000001b0 89 97 6e c8 5b 30 88 b0 93 0b d8 2a fd 05 0a 82 ..n.[0.....*.... 00:20:33.984 000001c0 f9 79 09 1d 00 73 81 ca 1a 56 86 78 90 04 dc 8c .y...s...V.x.... 00:20:33.984 000001d0 fd 02 e4 da e2 bd a7 c5 b0 72 00 2d 6d 68 92 d4 .........r.-mh.. 00:20:33.984 000001e0 08 2a 2c d0 91 4d bf b7 31 09 d6 e2 2f 67 1c 0a .*,..M..1.../g.. 
00:20:33.984 000001f0 4d 97 a8 c8 f9 31 f4 c1 96 67 9f 0d bc 35 54 79 M....1...g...5Ty 00:20:33.984 00000200 64 ac b4 4f bf 86 9d 45 2b 52 50 39 a2 2c f1 58 d..O...E+RP9.,.X 00:20:33.984 00000210 ea ec c1 65 81 27 36 4b 8e ae 97 a1 35 07 b0 a9 ...e.'6K....5... 00:20:33.984 00000220 74 d6 f7 04 e0 84 1f 07 c2 d9 7d 04 98 ca 11 b5 t.........}..... 00:20:33.984 00000230 5d 50 d2 48 8d 84 c2 e7 e3 fb 3f 57 00 0f eb 14 ]P.H......?W.... 00:20:33.984 00000240 ee 2b 54 bc 11 fe 8d 26 d9 3d 9d 56 8d ab 25 35 .+T....&.=.V..%5 00:20:33.984 00000250 a7 05 61 04 af 3a e8 55 f6 9f a7 fa 3b 42 b2 b5 ..a..:.U....;B.. 00:20:33.984 00000260 f1 fe a3 75 d2 30 9e fe 6a 98 62 5a 25 a3 c9 b3 ...u.0..j.bZ%... 00:20:33.984 00000270 6e e8 26 e2 f2 fc 6f 19 18 56 e6 d6 e8 b0 a6 cc n.&...o..V...... 00:20:33.984 00000280 e2 22 14 d9 75 d9 ef 29 72 dd 17 63 19 36 31 13 ."..u..)r..c.61. 00:20:33.984 00000290 57 a0 a2 a8 4f b4 a4 ba 70 c9 85 84 80 a9 be b5 W...O...p....... 00:20:33.984 000002a0 f5 b4 e9 88 99 61 0f c7 85 0b ef 24 c9 66 67 fd .....a.....$.fg. 00:20:33.984 000002b0 88 0b b2 0b ff 66 a7 b5 a2 fd e2 de 45 11 d4 10 .....f......E... 00:20:33.984 000002c0 a4 9d 1a 48 c7 6d 64 bb 84 a5 18 ab bb aa 41 55 ...H.md.......AU 00:20:33.984 000002d0 fc 96 ec 38 a8 59 ee 8a 68 c5 cd d9 c1 1a 3c e9 ...8.Y..h.....<. 00:20:33.984 000002e0 de a3 f7 4a df ef 90 3d b1 cc 29 8b 74 fe e9 01 ...J...=..).t... 00:20:33.984 000002f0 63 1b 34 7c 1c f1 83 72 c1 f0 24 a6 27 d6 f4 ca c.4|...r..$.'... 00:20:33.984 [2024-09-27 15:25:31.473984] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=3, dhgroup=4, seq=3428451833, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.985 [2024-09-27 15:25:31.510450] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.985 [2024-09-27 15:25:31.510480] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.985 [2024-09-27 15:25:31.510496] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.985 [2024-09-27 15:25:31.510502] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.985 [2024-09-27 15:25:31.616399] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.985 [2024-09-27 15:25:31.616417] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.985 [2024-09-27 15:25:31.616424] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 4 (ffdhe6144) 00:20:33.985 [2024-09-27 15:25:31.616434] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.985 [2024-09-27 15:25:31.616492] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.985 ctrlr pubkey: 00:20:33.985 00000000 ba 0b b5 8c 8c 22 39 67 45 72 c1 72 5b ac 37 ab ....."9gEr.r[.7. 
00:20:33.985 00000010 3a 06 a2 8e 7f 84 12 0b 45 de 08 cf 0e 80 c9 72 :.......E......r 00:20:33.985 00000020 6c 4a 89 7d 2b be c3 7b 7c ac ed 90 42 7f 9d a8 lJ.}+..{|...B... 00:20:33.985 00000030 b5 50 67 d5 90 6c f6 80 9f 83 02 e7 0f e4 cc 39 .Pg..l.........9 00:20:33.985 00000040 5b 6e 6d ad dc 14 1e 1b a1 3c a0 55 01 44 80 6d [nm......<.U.D.m 00:20:33.985 00000050 f0 03 8e 7f e5 9b 47 73 46 80 fa ff c4 57 f8 42 ......GsF....W.B 00:20:33.985 00000060 69 21 df bf bb 8f 2f f1 27 79 ba b2 8a 4e 3a cd i!..../.'y...N:. 00:20:33.985 00000070 d7 09 19 d5 fe 2c 3d 9b 2a c8 9d 3d 56 67 1f c2 .....,=.*..=Vg.. 00:20:33.985 00000080 fa 28 c7 2f bf f2 70 80 ce 4f bd 85 59 4a 78 90 .(./..p..O..YJx. 00:20:33.985 00000090 0d 5f a2 1f 51 24 2a 99 ac 9b 1f 75 9c e8 d9 c8 ._..Q$*....u.... 00:20:33.985 000000a0 9b a2 eb 9e 44 15 a1 88 26 ed 92 62 3a 98 f5 f4 ....D...&..b:... 00:20:33.985 000000b0 3c 92 5b cf a3 1f 27 0a b7 52 c2 21 df c1 a8 17 <.[...'..R.!.... 00:20:33.985 000000c0 ed 9f c0 9c eb 2d 13 e2 26 d6 56 02 ef ec 6c 84 .....-..&.V...l. 00:20:33.985 000000d0 b9 da 84 4b 96 72 f0 5b 76 4d cb 81 3b 0c 08 2d ...K.r.[vM..;..- 00:20:33.985 000000e0 bb 01 fc bf 8b fa 82 34 9b 59 a6 27 55 b1 08 8c .......4.Y.'U... 00:20:33.985 000000f0 61 6c 9a 17 5a 58 94 9b 2b 17 2f c8 d4 84 39 0e al..ZX..+./...9. 00:20:33.985 00000100 e5 ef ac b4 31 e3 f1 20 4a d9 01 7a 41 1b f7 3c ....1.. J..zA..< 00:20:33.985 00000110 6b b6 24 97 42 b3 c4 69 45 57 43 7b 18 98 c0 27 k.$.B..iEWC{...' 00:20:33.985 00000120 96 e0 5d e9 eb 8e e6 85 d7 9a 0f 40 c0 1b ab 2b ..]........@...+ 00:20:33.985 00000130 3f dd 1c 8b dd 5f 9f 75 f7 c3 31 db b4 fd 57 11 ?...._.u..1...W. 00:20:33.985 00000140 b2 f2 78 1c 20 f5 93 1f 99 14 20 67 aa 34 a0 fc ..x. ..... g.4.. 00:20:33.985 00000150 29 93 49 f4 ed 8e d4 fa 3c 28 8a d0 24 7f e9 8f ).I.....<(..$... 00:20:33.985 00000160 a5 b2 71 be 6a b4 0a c0 39 ab 8e 10 44 25 6f 18 ..q.j...9...D%o. 00:20:33.985 00000170 1c 66 6a 06 d7 e9 28 ab 0a bb 4c 8a 9c fe e9 78 .fj...(...L....x 00:20:33.985 00000180 53 56 84 f8 2f 79 41 71 8d 8f 54 c5 5e 35 27 bf SV../yAq..T.^5'. 00:20:33.985 00000190 32 9f 4d 72 9b e0 d4 2a ec c6 d5 e2 17 60 93 f0 2.Mr...*.....`.. 00:20:33.985 000001a0 c2 8a 5c df 5c 9c 67 e9 a3 98 09 00 a2 8e e9 b7 ..\.\.g......... 00:20:33.985 000001b0 81 24 15 7c 0b bb a7 f8 4c ca de a7 2c 1d 9e e7 .$.|....L...,... 00:20:33.985 000001c0 96 32 3e 81 cb 9e cd e4 f1 41 18 08 30 35 09 94 .2>......A..05.. 00:20:33.985 000001d0 57 1d 18 38 4b a2 e2 55 2c e3 3c 3f 7c 43 ad 93 W..8K..U,.j....8 00:20:33.985 00000260 2a e7 62 e3 c8 fc dc af af 45 ef a9 04 b8 3f d7 *.b......E....?. 00:20:33.985 00000270 6e 30 85 dd 40 f3 83 52 b0 5d 74 ce 9b c4 4e 3e n0..@..R.]t...N> 00:20:33.985 00000280 5d 09 1a 3c ed ad 76 dd 6a 68 56 8d ff 28 3b 31 ]..<..v.jhV..(;1 00:20:33.985 00000290 03 3e 50 0f 54 b7 76 44 5b 32 e6 10 b4 e3 fc ea .>P.T.vD[2...... 00:20:33.985 000002a0 c3 fb 30 f1 eb d2 5c 26 01 db 38 c5 0c 85 48 31 ..0...\&..8...H1 00:20:33.985 000002b0 bc 26 a9 e7 cd 17 53 cf 9c 08 f6 9d cb 43 e4 c8 .&....S......C.. 00:20:33.985 000002c0 d5 af 98 c3 3b 51 c4 6d d1 b2 23 b5 e6 c7 bf a1 ....;Q.m..#..... 00:20:33.985 000002d0 5e e5 ad f6 00 a3 cb a0 f2 03 2a 21 a8 96 92 98 ^.........*!.... 00:20:33.985 000002e0 8e ab 19 ea 5b e8 88 37 60 a4 52 80 f1 6d ea 98 ....[..7`.R..m.. 00:20:33.985 000002f0 c7 ee a2 9b 9a a0 c4 8e 51 47 c8 40 00 17 d0 f1 ........QG.@.... 
00:20:33.985 host pubkey: 00:20:33.985 00000000 9c 75 21 79 ce 18 d0 bb 42 f2 2f 69 eb 9e 95 6a .u!y....B./i...j 00:20:33.985 00000010 fb fa 02 6a 2d 45 24 b1 08 15 11 6e 0a 74 bc 1e ...j-E$....n.t.. 00:20:33.985 00000020 ae 8b 45 7d 1e eb 98 dd be 47 fb 9d f9 29 80 cf ..E}.....G...).. 00:20:33.985 00000030 25 d9 e3 72 91 bb d0 33 79 a2 5a bb 0a 10 b9 72 %..r...3y.Z....r 00:20:33.985 00000040 81 57 5a be 1e 70 0e 29 cc 9b 65 b5 05 4a 72 d4 .WZ..p.)..e..Jr. 00:20:33.985 00000050 61 24 63 2d 97 55 d5 05 87 e0 66 f0 7f 21 dd cf a$c-.U....f..!.. 00:20:33.985 00000060 cb 88 2a 69 85 e4 b1 6d e7 ea 8b 91 3a 4c b3 9a ..*i...m....:L.. 00:20:33.985 00000070 33 04 20 db 87 32 02 ab 74 97 35 5e a0 d5 41 b2 3. ..2..t.5^..A. 00:20:33.985 00000080 ae 6a ba 18 25 fa df 58 5e c9 06 fa c4 5c 07 ac .j..%..X^....\.. 00:20:33.985 00000090 00 6e 29 c9 c7 b6 09 8d 02 28 c2 39 79 1a 19 c4 .n)......(.9y... 00:20:33.985 000000a0 55 88 c7 ed de c4 e0 5e d9 88 aa cd fe b7 4b cc U......^......K. 00:20:33.985 000000b0 df 55 9f 5f 82 27 f6 c0 e5 f8 a3 75 c8 28 76 d9 .U._.'.....u.(v. 00:20:33.985 000000c0 dd fd 88 d5 21 bb 00 b6 c8 2f d4 99 7c 1b 47 f5 ....!..../..|.G. 00:20:33.985 000000d0 0f 6d 17 1d 38 63 45 3d ea 45 ed fe 54 25 d1 dc .m..8cE=.E..T%.. 00:20:33.985 000000e0 da 24 57 b9 2b 5d 3f b3 2e 0b 16 fa 6d cd 7e 30 .$W.+]?.....m.~0 00:20:33.985 000000f0 12 40 4b 4d ce 4d 00 2b c8 24 02 7b 14 71 9d 4b .@KM.M.+.$.{.q.K 00:20:33.985 00000100 df 5e e3 93 06 55 50 8e 5b 75 72 8d ad ae ae 04 .^...UP.[ur..... 00:20:33.985 00000110 3b 2d 66 d2 3e f5 61 83 b2 b8 67 f2 10 e9 ac 2f ;-f.>.a...g..../ 00:20:33.985 00000120 ab 82 ab 04 a2 83 ff a2 df 35 29 ec 82 18 9d d7 .........5)..... 00:20:33.985 00000130 42 2c 1f 96 53 40 2b b6 65 08 86 c9 aa 5d eb d8 B,..S@+.e....].. 00:20:33.985 00000140 d2 55 03 3e 8b 46 60 b2 47 16 d7 86 4e d5 05 f3 .U.>.F`.G...N... 00:20:33.985 00000150 3e 91 bb 88 cf 7c 17 09 95 75 3c c2 04 ad 81 89 >....|...u<..... 00:20:33.985 00000160 85 7d f1 04 89 54 96 2e 5f 29 28 cc 03 e6 06 7e .}...T.._)(....~ 00:20:33.985 00000170 2f c5 19 e5 b4 56 8e 19 a8 9f 70 ca 6d 27 5e 80 /....V....p.m'^. 00:20:33.985 00000180 46 21 42 50 51 22 f0 08 d5 39 ec 67 5a d7 60 a9 F!BPQ"...9.gZ.`. 00:20:33.985 00000190 ba 47 fc 1a 32 c7 57 c9 74 e1 c2 50 c2 b3 e8 ba .G..2.W.t..P.... 00:20:33.985 000001a0 6e a0 50 52 6b 09 b6 8a 39 1b 5d a7 1c 51 67 70 n.PRk...9.]..Qgp 00:20:33.985 000001b0 10 4d 09 22 dc 02 75 9e ed 38 5f 8d aa 07 86 cb .M."..u..8_..... 00:20:33.985 000001c0 19 74 d2 5b 5c f5 e6 62 d2 0f 26 90 00 4a dc d5 .t.[\..b..&..J.. 00:20:33.985 000001d0 0f 0d ee 2c 20 81 b9 5c 7d 71 b6 e8 94 d8 53 d2 ..., ..\}q....S. 00:20:33.985 000001e0 e0 c6 76 5e aa a7 3f 3e 06 d5 72 7e ee 4f 6a d2 ..v^..?>..r~.Oj. 00:20:33.985 000001f0 b3 67 4a 53 2d 46 9e 55 a1 51 ac f3 e3 2e 56 33 .gJS-F.U.Q....V3 00:20:33.985 00000200 a0 6c 89 c9 5d 19 89 38 fc f6 2c e1 fd a8 db d3 .l..]..8..,..... 00:20:33.985 00000210 81 8a b4 7b f6 18 4b b9 6d 93 f1 f1 34 b2 ae d7 ...{..K.m...4... 00:20:33.985 00000220 43 29 d1 28 4b dd 59 c8 f4 71 c5 96 06 00 28 40 C).(K.Y..q....(@ 00:20:33.985 00000230 45 f0 47 cf a0 dc e5 53 8a c0 7b 3a 98 ab 56 a9 E.G....S..{:..V. 00:20:33.985 00000240 f6 fd 72 6e 83 f9 d5 f7 ae d9 b3 7d 23 5d 80 7c ..rn.......}#].| 00:20:33.985 00000250 a8 dc e4 e6 48 43 5e df 0e 28 6f e5 12 73 db b8 ....HC^..(o..s.. 00:20:33.985 00000260 85 25 1d 8b 1d a9 48 d7 88 26 a0 7a ad ae 28 0b .%....H..&.z..(. 00:20:33.985 00000270 28 88 dd 9b 5d 89 a1 34 ea 08 ee ab 66 bd 29 cd (...]..4....f.). 
00:20:33.985 00000280 8d 5e 64 d2 3b 87 8a 7b 34 ae 8b ee df c9 3e 44 .^d.;..{4.....>D 00:20:33.985 00000290 6b 39 ee c1 20 4d 2d 14 f2 42 f3 e9 14 ec cb 67 k9.. M-..B.....g 00:20:33.985 000002a0 45 96 38 5c 4f c1 c2 8a 6a d8 7b 7b 0a ad c8 89 E.8\O...j.{{.... 00:20:33.985 000002b0 41 e1 18 0c eb 50 1c 83 8b 03 ad da e3 8b a0 b8 A....P.......... 00:20:33.985 000002c0 7b 07 c4 5c ed 83 1d 5e e4 68 ab eb 8f f9 8d 0c {..\...^.h...... 00:20:33.985 000002d0 b9 47 78 63 5e f4 9e 2a 3a 49 ae 05 6d 79 9b a1 .Gxc^..*:I..my.. 00:20:33.985 000002e0 1b 1f e6 0b 19 50 03 98 a0 b6 c1 42 76 47 f6 cf .....P.....BvG.. 00:20:33.985 000002f0 b7 6a 09 21 9f 45 b0 97 92 55 4a 08 c3 93 01 af .j.!.E...UJ..... 00:20:33.985 dh secret: 00:20:33.985 00000000 61 70 37 2f be 1c ea 36 bf 22 c3 bf 4d 86 fc 40 ap7/...6."..M..@ 00:20:33.985 00000010 68 40 23 b0 fa fb d0 5a b6 34 50 1d 15 48 be d7 h@#....Z.4P..H.. 00:20:33.985 00000020 e7 c0 65 ef 7c d0 cd 77 8a 9e e4 f5 f0 f1 b9 cd ..e.|..w........ 00:20:33.985 00000030 82 dc ee 64 f4 1a 5e 0e c0 1f 60 a5 1e 5b bd 44 ...d..^...`..[.D 00:20:33.985 00000040 e8 b3 c3 a7 f8 44 4d f9 d5 6d 8e 98 a1 aa 45 82 .....DM..m....E. 00:20:33.985 00000050 36 dd a1 36 f7 f7 04 7b 78 03 1d 25 10 66 bd 8e 6..6...{x..%.f.. 00:20:33.985 00000060 d1 9d 28 3c b3 e5 94 01 cf f7 be 95 e6 a7 c1 e0 ..(<............ 00:20:33.985 00000070 64 ef 83 5b 94 b3 b5 78 5e 4c cb 46 23 03 4b 87 d..[...x^L.F#.K. 00:20:33.985 00000080 a1 a8 27 35 e7 51 e2 49 28 d3 51 ab c6 8a da b6 ..'5.Q.I(.Q..... 00:20:33.985 00000090 bd 4b 96 b1 a9 32 ca 64 fe 33 90 01 3d ea 1f 6f .K...2.d.3..=..o 00:20:33.985 000000a0 52 9f a5 37 0d 60 d1 4f 94 25 26 a2 be 22 5c ae R..7.`.O.%&.."\. 00:20:33.985 000000b0 9b 36 5f 6d a7 ff 16 00 6a 34 08 ac 8a e1 20 9a .6_m....j4.... . 00:20:33.985 000000c0 12 c2 ba c1 6c d7 31 28 43 01 d1 32 6f 4a 11 cf ....l.1(C..2oJ.. 00:20:33.985 000000d0 36 a5 89 fb 60 20 cc 75 a7 c3 ae 7f d5 31 a2 49 6...` .u.....1.I 00:20:33.985 000000e0 39 7f a4 64 83 82 18 3a 55 1b 6c a3 e3 17 96 07 9..d...:U.l..... 00:20:33.985 000000f0 12 34 92 f0 e7 25 8f 1c 6e 3b b7 00 90 5b 71 c4 .4...%..n;...[q. 00:20:33.985 00000100 e4 ad bf 9c 43 04 a9 3b 03 d4 45 48 ba 82 02 5e ....C..;..EH...^ 00:20:33.985 00000110 d3 e7 75 ba 34 21 61 72 cb 36 53 25 48 ae 56 93 ..u.4!ar.6S%H.V. 00:20:33.985 00000120 b8 ca af 82 ae 51 f4 9e ac 6e 07 dc fd de 42 dc .....Q...n....B. 00:20:33.985 00000130 80 25 35 83 56 2f 7b 1b 5d 3d 20 d6 5e 1c 4c a3 .%5.V/{.]= .^.L. 00:20:33.985 00000140 e1 a8 f3 4f d0 8e 9b 55 96 d2 e1 7e 28 8f 1f 10 ...O...U...~(... 00:20:33.985 00000150 df 7a 65 80 8c 2a 19 66 6a 82 35 3a 33 68 30 f8 .ze..*.fj.5:3h0. 00:20:33.985 00000160 9c 2b 74 44 e9 38 b3 c7 fe 1a 95 ba f4 dd e3 de .+tD.8.......... 00:20:33.985 00000170 d6 87 7a 13 cd 36 83 17 94 72 fb 1c 18 79 83 1e ..z..6...r...y.. 00:20:33.985 00000180 24 cc 76 3f 74 7e 09 bb 8d 79 6e c5 cd ad 51 fe $.v?t~...yn...Q. 00:20:33.985 00000190 58 af 3a 1f 33 ba 88 08 ed c3 a6 08 d5 5a f7 ac X.:.3........Z.. 00:20:33.985 000001a0 6e e2 14 4c bf 1b 75 d2 9a 12 4e 74 8f 73 bb 59 n..L..u...Nt.s.Y 00:20:33.985 000001b0 bf c3 4a 4e 37 00 b0 bc da 19 4a 94 69 84 ad 6e ..JN7.....J.i..n 00:20:33.985 000001c0 9b 3f a8 70 64 b1 2d 34 8a 56 50 51 a4 37 31 26 .?.pd.-4.VPQ.71& 00:20:33.985 000001d0 af 10 ef 60 9a f0 0c 4d f8 bb 4e 95 bc 85 9d d4 ...`...M..N..... 00:20:33.985 000001e0 45 35 a4 ad 71 e7 d6 8b 8d a8 7c f2 44 87 d9 f9 E5..q.....|.D... 00:20:33.986 000001f0 13 79 26 31 69 a9 38 a2 42 a2 8f 4e 35 f7 72 b2 .y&1i.8.B..N5.r. 
00:20:33.986 00000200 77 78 c5 b5 80 25 6e 74 5b 2f b5 b7 dc 2d 89 6a wx...%nt[/...-.j 00:20:33.986 00000210 68 c7 74 11 99 92 60 56 17 15 d3 ed 49 73 04 75 h.t...`V....Is.u 00:20:33.986 00000220 2e fc d2 ca c5 64 2d 7e f2 1f b8 63 8c e1 be 15 .....d-~...c.... 00:20:33.986 00000230 37 38 f2 55 90 91 1a ff 56 59 1d 2a d7 56 87 fe 78.U....VY.*.V.. 00:20:33.986 00000240 c6 1d 0b 4d c5 21 0b 19 29 6f 0b e3 ae bb 63 44 ...M.!..)o....cD 00:20:33.986 00000250 b2 6e 96 f0 9e 05 9a d7 15 4a 51 3e a1 cf 18 44 .n.......JQ>...D 00:20:33.986 00000260 c1 53 de 91 8b 1b c7 5b 86 e5 42 14 32 04 08 9f .S.....[..B.2... 00:20:33.986 00000270 49 27 77 7b e7 de 84 4f f2 96 bb be 00 4a 69 23 I'w{...O.....Ji# 00:20:33.986 00000280 72 a8 a0 f4 3d f5 91 e0 5e 25 17 67 39 ef fd be r...=...^%.g9... 00:20:33.986 00000290 20 ed 36 18 32 14 c4 72 b7 f5 fc 7b e1 ff fc cf .6.2..r...{.... 00:20:33.986 000002a0 74 cb 90 b2 61 48 25 a7 a9 d7 44 9f 39 e3 32 d1 t...aH%...D.9.2. 00:20:33.986 000002b0 ca 3c 08 e2 57 91 85 73 f9 19 ee b7 b5 b9 2a c8 .<..W..s......*. 00:20:33.986 000002c0 8b 0b 00 77 2b 22 5e 86 36 fc 92 96 e1 8f fa 2e ...w+"^.6....... 00:20:33.986 000002d0 d6 ab f6 44 34 21 80 61 28 50 14 ac 87 80 00 c4 ...D4!.a(P...... 00:20:33.986 000002e0 92 68 61 00 ac ad 16 a6 60 56 8f 17 fb ca d7 43 .ha.....`V.....C 00:20:33.986 000002f0 22 b7 82 f0 06 a8 c8 0e b1 c2 5f 17 bf d3 a6 90 "........._..... 00:20:33.986 [2024-09-27 15:25:31.665604] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=3, dhgroup=4, seq=3428451834, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.986 [2024-09-27 15:25:31.665671] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.986 [2024-09-27 15:25:31.723135] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.986 [2024-09-27 15:25:31.723163] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.986 [2024-09-27 15:25:31.723169] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.986 [2024-09-27 15:25:31.916459] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.986 [2024-09-27 15:25:31.916479] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.986 [2024-09-27 15:25:31.916486] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.986 [2024-09-27 15:25:31.916532] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.986 [2024-09-27 15:25:31.916555] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.986 ctrlr pubkey: 00:20:33.986 00000000 ea 19 9d 61 5b 73 40 4b 62 71 7f 53 35 4e 24 db ...a[s@Kbq.S5N$. 
00:20:33.986 00000010 a1 6a 0f f5 f7 57 33 19 9a 1c b8 25 01 94 9b 61 .j...W3....%...a 00:20:33.986 00000020 1b 86 9c 00 29 b6 a8 17 0b 7e e2 bc a8 1f e5 73 ....)....~.....s 00:20:33.986 00000030 dd ca 32 86 cb 05 b9 a7 35 2e 59 38 47 7b 59 93 ..2.....5.Y8G{Y. 00:20:33.986 00000040 42 cb be 23 50 5c 70 05 58 13 37 f7 39 19 d7 7a B..#P\p.X.7.9..z 00:20:33.986 00000050 5b 5f 51 67 d6 d4 de 16 5f 32 89 4d 73 48 3f 86 [_Qg...._2.MsH?. 00:20:33.986 00000060 c6 ab d9 f4 b8 78 d8 b9 7d b5 28 c1 95 22 af ed .....x..}.(..".. 00:20:33.986 00000070 9a 57 a1 83 32 15 28 89 67 0b 86 9f 32 a2 2e fb .W..2.(.g...2... 00:20:33.986 00000080 43 82 b9 b6 55 51 9c 13 0f 73 6b 28 36 f5 2a 82 C...UQ...sk(6.*. 00:20:33.986 00000090 a4 87 86 ed 89 76 1a d5 1c 57 4e 3b 8f 29 63 25 .....v...WN;.)c% 00:20:33.986 000000a0 f3 48 48 97 8c e3 5f f9 b2 d8 ea 63 72 b8 2b d3 .HH..._....cr.+. 00:20:33.986 000000b0 2c 73 80 ef bd 60 f7 15 71 13 a2 71 3d 49 54 a5 ,s...`..q..q=IT. 00:20:33.986 000000c0 e1 5e 14 ed 08 f7 15 8e 86 11 15 ef aa 67 11 c6 .^...........g.. 00:20:33.986 000000d0 d7 1f fc d0 6b ef 5f f6 62 27 ea b8 ae a9 40 29 ....k._.b'....@) 00:20:33.986 000000e0 61 76 ff 6d 81 71 aa dc 67 08 c6 37 ca 39 f0 4e av.m.q..g..7.9.N 00:20:33.986 000000f0 3b c0 d6 b6 f0 1e 37 18 2f 38 aa 9e 28 ac e3 d2 ;.....7./8..(... 00:20:33.986 00000100 ae 3c 04 71 6d 1f 70 da 7a 78 d3 d0 d4 2f 77 a2 .<.qm.p.zx.../w. 00:20:33.986 00000110 c7 86 eb 52 f7 2b 78 e5 d6 ab 20 3e a4 24 9a b7 ...R.+x... >.$.. 00:20:33.986 00000120 c0 b7 63 0a 6b 5c a2 0e e3 0b 32 f9 5b 16 c7 df ..c.k\....2.[... 00:20:33.986 00000130 2a d4 fa c4 da 63 21 72 e6 40 15 15 6a 18 66 5d *....c!r.@..j.f] 00:20:33.986 00000140 54 f7 32 2e c1 00 69 c8 9c ff 33 b0 e7 22 02 c1 T.2...i...3..".. 00:20:33.986 00000150 e6 68 72 de 3b 3c 96 b1 97 a4 08 2e 47 83 3c da .hr.;<......G.<. 00:20:33.986 00000160 09 8b b6 33 c2 6b 8b aa d9 e9 2a 70 6c 78 2b 72 ...3.k....*plx+r 00:20:33.986 00000170 87 41 4d 7d 44 b9 2c 7c 74 a0 2c 2c 2d 7b 96 2f .AM}D.,|t.,,-{./ 00:20:33.986 00000180 df bc 54 6a ac 2a 78 39 97 fd 52 70 62 6c 20 5a ..Tj.*x9..Rpbl Z 00:20:33.986 00000190 27 21 c8 e2 f6 e2 cc c3 7e 10 31 70 0d 35 af 7c '!......~.1p.5.| 00:20:33.986 000001a0 7d 85 8f 80 fc 0a c3 7f bd 36 09 b4 c2 05 2d 82 }........6....-. 00:20:33.986 000001b0 90 d8 82 96 9b 6e 2a 33 84 42 35 bc 39 b7 bf a8 .....n*3.B5.9... 00:20:33.986 000001c0 05 87 a4 77 c6 e4 f6 c7 c6 ff 90 b9 4d 3a f3 97 ...w........M:.. 00:20:33.986 000001d0 04 96 ca 89 56 52 e0 95 8d dc 75 62 45 d7 4c ec ....VR....ubE.L. 00:20:33.986 000001e0 0d d3 26 ff 2f cb a4 62 cd 7d ec 96 56 28 8e c0 ..&./..b.}..V(.. 00:20:33.986 000001f0 22 67 27 2a 7e 95 cb 52 ac fe b9 91 f3 fb cf 17 "g'*~..R........ 00:20:33.986 00000200 48 62 17 6f 34 54 4e 06 dd f6 22 8c ec 65 dd af Hb.o4TN..."..e.. 00:20:33.986 00000210 47 f5 d5 08 12 75 7b 04 39 b3 11 7f 5c a1 68 fb G....u{.9...\.h. 00:20:33.986 00000220 ae a8 40 68 3f d5 d9 99 5b 56 fc d1 03 53 5e 6f ..@h?...[V...S^o 00:20:33.986 00000230 0c 84 96 ee 11 3d a3 54 04 ee eb e5 7f f3 0e 26 .....=.T.......& 00:20:33.986 00000240 a1 0a fe b3 07 9a 51 ee 34 4d 84 a6 25 32 58 4e ......Q.4M..%2XN 00:20:33.986 00000250 6b e0 12 5d 5e 8b 9c 9f 7d 81 bd 4e e8 67 a3 07 k..]^...}..N.g.. 00:20:33.986 00000260 38 24 e2 34 a0 8c 9c a5 bf c4 d4 b0 21 d3 1d 7e 8$.4........!..~ 00:20:33.986 00000270 8b 8a 03 67 a2 c1 4e db 06 01 21 9c 11 e3 3d 35 ...g..N...!...=5 00:20:33.986 00000280 22 63 46 1e bb f6 0c 1b f5 19 d2 49 73 27 de cf "cF........Is'.. 
00:20:33.986 00000290 17 0d 98 22 8a 39 73 c5 16 4f 55 12 92 63 3d ab ...".9s..OU..c=. 00:20:33.986 000002a0 1f 38 db c0 26 11 bc 82 18 41 ef 84 97 7f 50 92 .8..&....A....P. 00:20:33.986 000002b0 d0 1d 37 69 e5 b4 50 6e a0 f1 84 10 4e f3 4c 91 ..7i..Pn....N.L. 00:20:33.986 000002c0 60 ff f2 da 94 94 d7 b5 b2 5e c5 71 19 22 d6 c8 `........^.q.".. 00:20:33.986 000002d0 6a cc 2a 16 d1 a1 b9 ba 22 ab 5a fd 13 76 21 2b j.*.....".Z..v!+ 00:20:33.986 000002e0 be 3a 30 8b 96 0b a1 02 50 c3 9d 0e d7 39 97 b8 .:0.....P....9.. 00:20:33.986 000002f0 5a b1 bd c4 cf 31 2a 7b da f7 e4 90 50 4e da 24 Z....1*{....PN.$ 00:20:33.986 00000300 9d 3a db c0 91 50 d3 15 36 c9 ae 85 c5 a3 a8 76 .:...P..6......v 00:20:33.986 00000310 67 77 13 38 bc 6a f9 cc dd 0a f3 30 3b 54 81 ac gw.8.j.....0;T.. 00:20:33.986 00000320 53 6a bc 78 ca 9a 41 41 7d 00 08 32 6e 02 27 23 Sj.x..AA}..2n.'# 00:20:33.986 00000330 5e 39 79 f4 8d 4e 1b 01 4e ed 77 08 66 50 a2 68 ^9y..N..N.w.fP.h 00:20:33.986 00000340 3d 0e ab 4b f1 63 72 7a e3 72 89 19 9a 54 c5 c7 =..K.crz.r...T.. 00:20:33.986 00000350 06 ab 86 4d 79 77 53 60 e4 a4 7b 98 92 32 dd f6 ...MywS`..{..2.. 00:20:33.986 00000360 67 23 39 af 17 dd 06 69 26 78 4c 67 ea fa 3d b5 g#9....i&xLg..=. 00:20:33.986 00000370 0a 65 b6 db e3 ac a9 ed 24 89 bb 82 67 03 07 63 .e......$...g..c 00:20:33.986 00000380 0d fa b4 07 e9 21 48 14 ee 42 fe 70 8f db f8 51 .....!H..B.p...Q 00:20:33.986 00000390 5a b1 bd 75 1b 32 aa 59 86 7d 0b eb 1b 2e 7f 2c Z..u.2.Y.}....., 00:20:33.986 000003a0 74 f1 bf a2 62 c2 80 56 7d ac b9 37 8a 89 99 f5 t...b..V}..7.... 00:20:33.986 000003b0 b2 48 6f a2 b8 aa ad 3f 7a fb f9 25 c2 12 81 67 .Ho....?z..%...g 00:20:33.986 000003c0 78 32 3b 57 8d 2b a1 f9 72 64 fb de 2c d5 d6 48 x2;W.+..rd..,..H 00:20:33.986 000003d0 a1 04 48 22 c8 5d eb 9e 27 3a 6b d0 03 f2 ee ac ..H".]..':k..... 00:20:33.986 000003e0 2e 1a 1d f5 89 a7 65 da a5 1c a2 d4 f3 a0 9e dc ......e......... 00:20:33.986 000003f0 e7 9a 6b 2c 25 55 a1 83 c6 e1 65 1b 26 4c 93 e3 ..k,%U....e.&L.. 00:20:33.986 host pubkey: 00:20:33.986 00000000 c7 7c 13 bd b3 61 41 76 10 a6 75 78 70 20 40 93 .|...aAv..uxp @. 00:20:33.986 00000010 97 a5 b3 ac 46 cd 8c 4c 45 ba cf d3 7e 11 7c c9 ....F..LE...~.|. 00:20:33.986 00000020 0d f5 27 f0 53 79 fd d2 b6 ae 1b c3 5d 15 4f 76 ..'.Sy......].Ov 00:20:33.986 00000030 5b 31 ab e3 98 c8 56 34 a0 c4 cd 07 05 33 53 ac [1....V4.....3S. 00:20:33.986 00000040 eb 0d e5 1a 80 be 16 2d 62 b7 88 e9 ff 93 fe 75 .......-b......u 00:20:33.986 00000050 dc e7 03 05 8f 36 a4 c7 9e 25 e1 15 86 58 ae 95 .....6...%...X.. 00:20:33.986 00000060 0a 4a 9f 59 07 b1 cc 62 fd 08 88 f7 1f 32 fe 79 .J.Y...b.....2.y 00:20:33.986 00000070 a0 6e 58 1b 1c 0c c3 e9 f6 ef 24 44 fa 22 b5 c5 .nX.......$D.".. 00:20:33.986 00000080 8e ed b6 10 30 4f 5d 9e ef ad 99 60 bb fe 9b 40 ....0O]....`...@ 00:20:33.986 00000090 d4 3f 60 3f 5a 03 7d 9f f1 29 3c aa 1e 75 dc 72 .?`?Z.}..)<..u.r 00:20:33.986 000000a0 09 8f a6 4b 7a 30 fd c2 cf e7 31 d8 1a 6e 55 28 ...Kz0....1..nU( 00:20:33.986 000000b0 4f bd 5c 97 a1 46 e1 a5 c5 a0 04 51 58 01 65 f4 O.\..F.....QX.e. 00:20:33.986 000000c0 1d 55 63 d1 24 f6 e4 82 f2 49 33 c2 75 4b 4e b7 .Uc.$....I3.uKN. 00:20:33.986 000000d0 21 3c 4c cc 75 be d9 dd aa 2b e6 de d3 b4 b7 e7 !...^.. 00:20:33.986 00000110 ea 1d 73 ba 14 6d c7 65 a2 98 61 92 93 84 73 a0 ..s..m.e..a...s. 00:20:33.986 00000120 8e 13 42 ad 43 8e 6f 8f 0c 99 0f 51 1c e5 2c b7 ..B.C.o....Q..,. 00:20:33.986 00000130 36 85 11 d6 30 f4 7a 1f 94 f5 02 e3 86 1d aa 04 6...0.z......... 
00:20:33.986 00000140 23 79 42 9d 0b 6b 6f 9f 40 79 55 88 09 5c e9 ac #yB..ko.@yU..\.. 00:20:33.986 00000150 a0 c3 c5 79 fd b3 6a f7 f4 87 a5 44 cd cb 09 59 ...y..j....D...Y 00:20:33.986 00000160 5d 09 9b 25 c1 80 ae dc 13 d3 7d 80 ed cd 13 b9 ]..%......}..... 00:20:33.986 00000170 02 bd 13 42 85 74 3d df 3e 59 1f 88 bd 40 89 9e ...B.t=.>Y...@.. 00:20:33.986 00000180 ea cb 2b 26 87 4a 8f 26 3f d7 5b 73 3e c4 87 60 ..+&.J.&?.[s>..` 00:20:33.986 00000190 cb 89 bb b2 c4 29 5f 73 42 e2 5e 66 c2 a4 63 e0 .....)_sB.^f..c. 00:20:33.986 000001a0 c5 5a 61 d4 87 95 00 35 f1 c4 14 d2 a4 b5 80 a1 .Za....5........ 00:20:33.986 000001b0 37 ed b1 69 8b 3a 0e b4 c6 2e a9 05 7d f2 b7 76 7..i.:......}..v 00:20:33.986 000001c0 ec 45 06 02 ed 48 28 22 9a b1 f6 e4 ea 2e 49 54 .E...H("......IT 00:20:33.986 000001d0 a6 6d 53 86 74 6c 3a f7 07 eb a7 81 f4 ab 02 46 .mS.tl:........F 00:20:33.986 000001e0 8c 15 cc de 35 24 e9 a2 46 d7 8e cc c8 82 71 1b ....5$..F.....q. 00:20:33.986 000001f0 02 e0 e2 e9 e3 88 cd 22 e3 5d 6b 3e 8e 91 75 5b .......".]k>..u[ 00:20:33.986 00000200 3a 95 ed 52 cb 01 67 74 79 d6 15 1e fc f6 58 04 :..R..gty.....X. 00:20:33.986 00000210 22 8e 62 9c 32 f4 10 c3 cf 6a b4 83 91 75 cf 29 ".b.2....j...u.) 00:20:33.986 00000220 20 4f d1 69 9f ca 2f 56 db 67 20 5d 8d df 88 3b O.i../V.g ]...; 00:20:33.986 00000230 77 6d 08 32 cf f1 aa 21 a7 75 eb f1 49 a8 b0 de wm.2...!.u..I... 00:20:33.986 00000240 62 73 ff 36 eb a4 f7 10 4d 0c b3 50 4c 95 b1 f4 bs.6....M..PL... 00:20:33.986 00000250 f6 59 33 04 f5 4d df d6 5b cf da 5b 49 5a d2 6f .Y3..M..[..[IZ.o 00:20:33.986 00000260 76 7f 96 0e d4 54 87 b1 3b 4e c7 2c db 6a a9 1f v....T..;N.,.j.. 00:20:33.987 00000270 fe 7b 72 bc fa 49 06 85 53 58 4d 44 99 40 75 eb .{r..I..SXMD.@u. 00:20:33.987 00000280 33 eb 28 ce b6 e0 11 97 d4 8d ee be 8e 5b ba d1 3.(..........[.. 00:20:33.987 00000290 66 31 94 1e 57 2f bb 17 b0 39 b2 06 54 9e 77 74 f1..W/...9..T.wt 00:20:33.987 000002a0 c9 22 df a8 5a 48 2d 31 d4 cc 28 a5 9a 3f 72 c7 ."..ZH-1..(..?r. 00:20:33.987 000002b0 25 8a 0d 77 4b 3c 26 10 c1 11 cd 5f ac 2b 02 24 %..wK<&...._.+.$ 00:20:33.987 000002c0 df 45 af f2 5f 97 3c 73 13 d5 46 4e 09 0e a9 d5 .E.._..... 00:20:33.987 000002e0 a5 18 5a f9 27 6e 02 d2 e0 fb 66 55 4c 71 c5 6c ..Z.'n....fULq.l 00:20:33.987 000002f0 63 e9 38 5f 9e 61 aa 6d 11 7a c2 56 15 bf 6e 11 c.8_.a.m.z.V..n. 00:20:33.987 00000300 a1 90 57 d2 89 a0 8a 85 6c 19 a3 2e 90 6f 4f 3a ..W.....l....oO: 00:20:33.987 00000310 ab 2f ab 68 4d 40 25 1f 19 a0 09 81 80 df cc 02 ./.hM@%......... 00:20:33.987 00000320 b2 84 d3 31 08 5e 77 d3 01 aa 57 d8 c7 ca 17 79 ...1.^w...W....y 00:20:33.987 00000330 e9 b6 22 b8 fa d3 7b d5 44 4b 5c f9 da 60 1d 61 .."...{.DK\..`.a 00:20:33.987 00000340 77 e6 98 da 98 86 7e e6 57 ca c9 21 95 14 c7 94 w.....~.W..!.... 00:20:33.987 00000350 cf 35 52 7e c3 29 8e df 43 97 07 5a 2d 1f b2 98 .5R~.)..C..Z-... 00:20:33.987 00000360 48 1d 75 e2 a6 6e 6d cf ae eb d3 42 6c ab a0 0f H.u..nm....Bl... 00:20:33.987 00000370 86 10 ac dd f0 cd b3 12 cd 44 1c ec 91 a9 83 f6 .........D...... 00:20:33.987 00000380 a1 d0 a0 7f e4 03 bd 00 df fc 8e 3c 4c 8e 0f 4f ........... 00:20:33.987 00000120 18 b3 9e 6b c0 19 7b d5 17 fb f9 e9 cc d0 2c 6b ...k..{.......,k 00:20:33.987 00000130 35 35 81 e6 c0 54 c6 56 2b 8d 5f cf 0e 5a ac 3f 55...T.V+._..Z.? 00:20:33.987 00000140 f6 cc 98 e1 8f 25 ce 64 da 14 75 2b f4 22 12 1b .....%.d..u+.".. 00:20:33.987 00000150 80 06 e0 bd 9d 58 7a ac 55 bb f4 f0 10 e9 cd e0 .....Xz.U....... 
00:20:33.987 00000160 5b 57 cf 65 6d 18 dc 7b 79 fb d0 1b 0e f2 43 70 [W.em..{y.....Cp 00:20:33.987 00000170 8e 31 02 82 c8 b8 9e 5e ad 49 16 e6 0a 6c d3 b7 .1.....^.I...l.. 00:20:33.987 00000180 25 07 9c 80 fc d5 c2 45 d1 7c 49 67 30 cc 22 e4 %......E.|Ig0.". 00:20:33.987 00000190 40 36 41 f3 38 84 30 f3 48 78 27 16 d1 3e b0 d2 @6A.8.0.Hx'..>.. 00:20:33.987 000001a0 5c ce 9d 40 3c 58 ad 9f 5b 1b 40 0c 68 88 e4 3b \..@w.. 00:20:33.987 00000220 4e c5 c9 08 40 20 2e 11 b3 8f 9c 7a 13 c2 65 eb N...@ .....z..e. 00:20:33.987 00000230 bb 69 4c 7e ed 4e bf 48 2a 73 04 a2 53 a9 88 a1 .iL~.N.H*s..S... 00:20:33.987 00000240 97 54 8a 19 66 ce 1d 27 39 93 7d 92 15 6f db ce .T..f..'9.}..o.. 00:20:33.987 00000250 a7 7e ef 4d da cb ea 6e 47 e1 f3 b9 50 fe a1 1f .~.M...nG...P... 00:20:33.987 00000260 c5 91 0f eb f0 c0 e0 13 cf d0 c2 f9 b2 3f 8e f1 .............?.. 00:20:33.987 00000270 a7 2b bf 56 4f 45 1f 43 c0 09 dd 25 5b c3 86 f9 .+.VOE.C...%[... 00:20:33.987 00000280 a0 c9 05 91 15 7b 35 ee 6f ce bd b0 f4 69 74 b1 .....{5.o....it. 00:20:33.987 00000290 e1 3e fa cb fb ac c6 d6 95 dc 30 ac 85 96 84 8b .>........0..... 00:20:33.987 000002a0 7c 1c c7 0e 4b 17 d4 db 44 2e b5 bb 69 de c8 78 |...K...D...i..x 00:20:33.987 000002b0 48 e1 fd 12 ef 3b 8c 17 c8 15 2b 89 40 cc 17 ac H....;....+.@... 00:20:33.987 000002c0 97 a9 a3 4b 07 a8 d4 27 58 49 85 85 8f 3a b9 2e ...K...'XI...:.. 00:20:33.987 000002d0 03 28 85 dc 31 9b 9f 04 88 7b 65 01 c2 75 48 c0 .(..1....{e..uH. 00:20:33.987 000002e0 c7 14 4f ee d8 97 4b d6 1d f2 87 31 6e 1f dc 70 ..O...K....1n..p 00:20:33.987 000002f0 f9 fb d3 8d 46 31 83 e1 cc 42 e3 68 74 36 d0 c0 ....F1...B.ht6.. 00:20:33.987 00000300 3e 31 a3 c2 6c 5e 41 b3 05 ee d4 1e 07 d6 79 8a >1..l^A.......y. 00:20:33.987 00000310 83 32 da 68 c3 b3 85 64 26 15 3b 9e ef 7a df 0f .2.h...d&.;..z.. 00:20:33.987 00000320 5b 37 bd 10 29 ed 8e 52 f9 44 81 85 a9 68 79 fd [7..)..R.D...hy. 00:20:33.987 00000330 e2 3c b6 b4 9e 48 0e 01 19 1d 2c db 46 c5 35 91 .<...H....,.F.5. 00:20:33.987 00000340 76 87 8c fa c2 11 4a 24 c4 14 28 31 73 a8 ff 18 v.....J$..(1s... 00:20:33.987 00000350 e4 f8 f5 87 c8 72 6d 2f 24 59 9b b7 0e 93 1d d0 .....rm/$Y...... 00:20:33.987 00000360 bf 9b d6 3d c0 39 9f 02 5b 53 cb f8 fe c2 0b 9e ...=.9..[S...... 00:20:33.987 00000370 0b c7 57 73 dd 3b 23 0e 5c fb d9 18 16 7b ee 5e ..Ws.;#.\....{.^ 00:20:33.987 00000380 1a 9e ba f4 76 aa 97 7a 28 42 ce df e9 09 21 78 ....v..z(B....!x 00:20:33.987 00000390 79 a3 88 3f 52 10 d3 cc 9d 90 a5 96 ad 9c bc e7 y..?R........... 00:20:33.987 000003a0 46 62 ff d5 6c f7 30 a5 4b 63 85 c6 61 c7 3a 45 Fb..l.0.Kc..a.:E 00:20:33.987 000003b0 34 c4 7b 87 d9 b3 84 8b ad 61 46 94 4f e9 ea b6 4.{......aF.O... 00:20:33.987 000003c0 ec 62 fe ee 99 95 9b 7f de 9a dc 26 11 d5 6e e0 .b.........&..n. 00:20:33.987 000003d0 6d 77 e2 6e 79 2b 82 6d 67 dd 0f cc 37 86 a9 f9 mw.ny+.mg...7... 00:20:33.987 000003e0 80 60 08 56 94 f8 69 f1 dc 06 f9 9e 5a 03 b3 74 .`.V..i.....Z..t 00:20:33.987 000003f0 63 45 f1 9a 2e b5 aa 8a fe 18 23 89 b9 a3 1e b3 cE........#..... 
00:20:33.987 [2024-09-27 15:25:32.029949] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key0, hash=3, dhgroup=5, seq=3428451835, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.987 [2024-09-27 15:25:32.088476] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.987 [2024-09-27 15:25:32.088521] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.987 [2024-09-27 15:25:32.088538] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.987 [2024-09-27 15:25:32.088556] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.987 [2024-09-27 15:25:32.088576] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.987 [2024-09-27 15:25:32.194318] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.987 [2024-09-27 15:25:32.194336] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.987 [2024-09-27 15:25:32.194348] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.987 [2024-09-27 15:25:32.194358] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.987 [2024-09-27 15:25:32.194412] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.987 ctrlr pubkey: 00:20:33.987 00000000 ea 19 9d 61 5b 73 40 4b 62 71 7f 53 35 4e 24 db ...a[s@Kbq.S5N$. 00:20:33.987 00000010 a1 6a 0f f5 f7 57 33 19 9a 1c b8 25 01 94 9b 61 .j...W3....%...a 00:20:33.987 00000020 1b 86 9c 00 29 b6 a8 17 0b 7e e2 bc a8 1f e5 73 ....)....~.....s 00:20:33.987 00000030 dd ca 32 86 cb 05 b9 a7 35 2e 59 38 47 7b 59 93 ..2.....5.Y8G{Y. 00:20:33.987 00000040 42 cb be 23 50 5c 70 05 58 13 37 f7 39 19 d7 7a B..#P\p.X.7.9..z 00:20:33.987 00000050 5b 5f 51 67 d6 d4 de 16 5f 32 89 4d 73 48 3f 86 [_Qg...._2.MsH?. 00:20:33.987 00000060 c6 ab d9 f4 b8 78 d8 b9 7d b5 28 c1 95 22 af ed .....x..}.(..".. 00:20:33.987 00000070 9a 57 a1 83 32 15 28 89 67 0b 86 9f 32 a2 2e fb .W..2.(.g...2... 00:20:33.987 00000080 43 82 b9 b6 55 51 9c 13 0f 73 6b 28 36 f5 2a 82 C...UQ...sk(6.*. 00:20:33.987 00000090 a4 87 86 ed 89 76 1a d5 1c 57 4e 3b 8f 29 63 25 .....v...WN;.)c% 00:20:33.987 000000a0 f3 48 48 97 8c e3 5f f9 b2 d8 ea 63 72 b8 2b d3 .HH..._....cr.+. 00:20:33.987 000000b0 2c 73 80 ef bd 60 f7 15 71 13 a2 71 3d 49 54 a5 ,s...`..q..q=IT. 00:20:33.987 000000c0 e1 5e 14 ed 08 f7 15 8e 86 11 15 ef aa 67 11 c6 .^...........g.. 00:20:33.987 000000d0 d7 1f fc d0 6b ef 5f f6 62 27 ea b8 ae a9 40 29 ....k._.b'....@) 00:20:33.987 000000e0 61 76 ff 6d 81 71 aa dc 67 08 c6 37 ca 39 f0 4e av.m.q..g..7.9.N 00:20:33.987 000000f0 3b c0 d6 b6 f0 1e 37 18 2f 38 aa 9e 28 ac e3 d2 ;.....7./8..(... 00:20:33.987 00000100 ae 3c 04 71 6d 1f 70 da 7a 78 d3 d0 d4 2f 77 a2 .<.qm.p.zx.../w. 
00:20:33.987 00000110 c7 86 eb 52 f7 2b 78 e5 d6 ab 20 3e a4 24 9a b7 ...R.+x... >.$.. 00:20:33.987 00000120 c0 b7 63 0a 6b 5c a2 0e e3 0b 32 f9 5b 16 c7 df ..c.k\....2.[... 00:20:33.987 00000130 2a d4 fa c4 da 63 21 72 e6 40 15 15 6a 18 66 5d *....c!r.@..j.f] 00:20:33.987 00000140 54 f7 32 2e c1 00 69 c8 9c ff 33 b0 e7 22 02 c1 T.2...i...3..".. 00:20:33.987 00000150 e6 68 72 de 3b 3c 96 b1 97 a4 08 2e 47 83 3c da .hr.;<......G.<. 00:20:33.987 00000160 09 8b b6 33 c2 6b 8b aa d9 e9 2a 70 6c 78 2b 72 ...3.k....*plx+r 00:20:33.987 00000170 87 41 4d 7d 44 b9 2c 7c 74 a0 2c 2c 2d 7b 96 2f .AM}D.,|t.,,-{./ 00:20:33.987 00000180 df bc 54 6a ac 2a 78 39 97 fd 52 70 62 6c 20 5a ..Tj.*x9..Rpbl Z 00:20:33.987 00000190 27 21 c8 e2 f6 e2 cc c3 7e 10 31 70 0d 35 af 7c '!......~.1p.5.| 00:20:33.987 000001a0 7d 85 8f 80 fc 0a c3 7f bd 36 09 b4 c2 05 2d 82 }........6....-. 00:20:33.987 000001b0 90 d8 82 96 9b 6e 2a 33 84 42 35 bc 39 b7 bf a8 .....n*3.B5.9... 00:20:33.987 000001c0 05 87 a4 77 c6 e4 f6 c7 c6 ff 90 b9 4d 3a f3 97 ...w........M:.. 00:20:33.987 000001d0 04 96 ca 89 56 52 e0 95 8d dc 75 62 45 d7 4c ec ....VR....ubE.L. 00:20:33.987 000001e0 0d d3 26 ff 2f cb a4 62 cd 7d ec 96 56 28 8e c0 ..&./..b.}..V(.. 00:20:33.987 000001f0 22 67 27 2a 7e 95 cb 52 ac fe b9 91 f3 fb cf 17 "g'*~..R........ 00:20:33.987 00000200 48 62 17 6f 34 54 4e 06 dd f6 22 8c ec 65 dd af Hb.o4TN..."..e.. 00:20:33.987 00000210 47 f5 d5 08 12 75 7b 04 39 b3 11 7f 5c a1 68 fb G....u{.9...\.h. 00:20:33.987 00000220 ae a8 40 68 3f d5 d9 99 5b 56 fc d1 03 53 5e 6f ..@h?...[V...S^o 00:20:33.987 00000230 0c 84 96 ee 11 3d a3 54 04 ee eb e5 7f f3 0e 26 .....=.T.......& 00:20:33.987 00000240 a1 0a fe b3 07 9a 51 ee 34 4d 84 a6 25 32 58 4e ......Q.4M..%2XN 00:20:33.987 00000250 6b e0 12 5d 5e 8b 9c 9f 7d 81 bd 4e e8 67 a3 07 k..]^...}..N.g.. 00:20:33.987 00000260 38 24 e2 34 a0 8c 9c a5 bf c4 d4 b0 21 d3 1d 7e 8$.4........!..~ 00:20:33.987 00000270 8b 8a 03 67 a2 c1 4e db 06 01 21 9c 11 e3 3d 35 ...g..N...!...=5 00:20:33.987 00000280 22 63 46 1e bb f6 0c 1b f5 19 d2 49 73 27 de cf "cF........Is'.. 00:20:33.988 00000290 17 0d 98 22 8a 39 73 c5 16 4f 55 12 92 63 3d ab ...".9s..OU..c=. 00:20:33.988 000002a0 1f 38 db c0 26 11 bc 82 18 41 ef 84 97 7f 50 92 .8..&....A....P. 00:20:33.988 000002b0 d0 1d 37 69 e5 b4 50 6e a0 f1 84 10 4e f3 4c 91 ..7i..Pn....N.L. 00:20:33.988 000002c0 60 ff f2 da 94 94 d7 b5 b2 5e c5 71 19 22 d6 c8 `........^.q.".. 00:20:33.988 000002d0 6a cc 2a 16 d1 a1 b9 ba 22 ab 5a fd 13 76 21 2b j.*.....".Z..v!+ 00:20:33.988 000002e0 be 3a 30 8b 96 0b a1 02 50 c3 9d 0e d7 39 97 b8 .:0.....P....9.. 00:20:33.988 000002f0 5a b1 bd c4 cf 31 2a 7b da f7 e4 90 50 4e da 24 Z....1*{....PN.$ 00:20:33.988 00000300 9d 3a db c0 91 50 d3 15 36 c9 ae 85 c5 a3 a8 76 .:...P..6......v 00:20:33.988 00000310 67 77 13 38 bc 6a f9 cc dd 0a f3 30 3b 54 81 ac gw.8.j.....0;T.. 00:20:33.988 00000320 53 6a bc 78 ca 9a 41 41 7d 00 08 32 6e 02 27 23 Sj.x..AA}..2n.'# 00:20:33.988 00000330 5e 39 79 f4 8d 4e 1b 01 4e ed 77 08 66 50 a2 68 ^9y..N..N.w.fP.h 00:20:33.988 00000340 3d 0e ab 4b f1 63 72 7a e3 72 89 19 9a 54 c5 c7 =..K.crz.r...T.. 00:20:33.988 00000350 06 ab 86 4d 79 77 53 60 e4 a4 7b 98 92 32 dd f6 ...MywS`..{..2.. 00:20:33.988 00000360 67 23 39 af 17 dd 06 69 26 78 4c 67 ea fa 3d b5 g#9....i&xLg..=. 
00:20:33.988 00000370 0a 65 b6 db e3 ac a9 ed 24 89 bb 82 67 03 07 63 .e......$...g..c 00:20:33.988 00000380 0d fa b4 07 e9 21 48 14 ee 42 fe 70 8f db f8 51 .....!H..B.p...Q 00:20:33.988 00000390 5a b1 bd 75 1b 32 aa 59 86 7d 0b eb 1b 2e 7f 2c Z..u.2.Y.}....., 00:20:33.988 000003a0 74 f1 bf a2 62 c2 80 56 7d ac b9 37 8a 89 99 f5 t...b..V}..7.... 00:20:33.988 000003b0 b2 48 6f a2 b8 aa ad 3f 7a fb f9 25 c2 12 81 67 .Ho....?z..%...g 00:20:33.988 000003c0 78 32 3b 57 8d 2b a1 f9 72 64 fb de 2c d5 d6 48 x2;W.+..rd..,..H 00:20:33.988 000003d0 a1 04 48 22 c8 5d eb 9e 27 3a 6b d0 03 f2 ee ac ..H".]..':k..... 00:20:33.988 000003e0 2e 1a 1d f5 89 a7 65 da a5 1c a2 d4 f3 a0 9e dc ......e......... 00:20:33.988 000003f0 e7 9a 6b 2c 25 55 a1 83 c6 e1 65 1b 26 4c 93 e3 ..k,%U....e.&L.. 00:20:33.988 host pubkey: 00:20:33.988 00000000 cd 0a cc 3b 2b 84 7e 15 8f 92 7d 08 bd 6b fb df ...;+.~...}..k.. 00:20:33.988 00000010 d7 9c c3 b4 7d 3d 73 e6 8d 8b a6 48 b4 20 fa cc ....}=s....H. .. 00:20:33.988 00000020 82 a0 00 67 68 d8 ad 14 60 95 c2 6e 00 b5 4c 29 ...gh...`..n..L) 00:20:33.988 00000030 eb 91 fa 27 34 12 cf 1a ba d8 93 c1 44 88 21 da ...'4.......D.!. 00:20:33.988 00000040 74 9c 1d 67 dd c7 28 72 c9 71 6d 21 ec a6 ee 53 t..g..(r.qm!...S 00:20:33.988 00000050 97 e8 47 5f 39 7f e4 bc d5 75 1c 66 28 a7 96 31 ..G_9....u.f(..1 00:20:33.988 00000060 b1 7d 24 a2 b7 c3 a6 92 d7 a4 a5 60 7a 98 69 75 .}$........`z.iu 00:20:33.988 00000070 88 01 62 19 ef 39 e3 b0 c0 2c 5c fe 8e b7 2c 5a ..b..9...,\...,Z 00:20:33.988 00000080 39 33 5d 87 0e e8 8d 49 66 82 28 52 96 72 0c ea 93]....If.(R.r.. 00:20:33.988 00000090 be 0e 4d a3 28 9c ff 77 54 7a e2 44 c0 6b 3e 99 ..M.(..wTz.D.k>. 00:20:33.988 000000a0 44 28 1f 6e 33 3e 99 82 58 6e fb ba 52 c7 d2 e5 D(.n3>..Xn..R... 00:20:33.988 000000b0 fd 4a c8 8f 2b 76 59 f4 9c 33 25 61 62 17 63 3e .J..+vY..3%ab.c> 00:20:33.988 000000c0 ac c2 e2 ca 8d 04 e7 ee 1c 91 34 a1 9e a3 ce 17 ..........4..... 00:20:33.988 000000d0 41 fb 07 a3 3f e7 6b fd 61 be 3c 6a a8 34 c1 28 A...?.k.a..O.^..|... 00:20:33.988 00000120 ca e6 47 5a 2d d9 52 9e df be 1e d9 bc c6 8d 9c ..GZ-.R......... 00:20:33.988 00000130 e4 05 3d c6 f8 ee 5a 37 96 83 f0 5f f7 6d d9 ef ..=...Z7..._.m.. 00:20:33.988 00000140 15 06 a1 da a8 30 1b d5 4e 6f 49 6b 08 a8 e8 c1 .....0..NoIk.... 00:20:33.988 00000150 a5 46 2e 11 6b 50 e9 28 e2 1e d2 52 75 9c 39 52 .F..kP.(...Ru.9R 00:20:33.988 00000160 36 27 8a d3 03 05 3d f3 0f d0 11 af 34 89 25 48 6'....=.....4.%H 00:20:33.988 00000170 25 62 1c e3 bf 98 9f b0 ce fe 1c 33 61 b4 21 92 %b.........3a.!. 00:20:33.988 00000180 83 03 c6 53 ad 16 f8 7c 39 8d 17 39 2c e2 82 f3 ...S...|9..9,... 00:20:33.988 00000190 42 ed 7f cb a7 f0 8e d7 67 25 b3 f7 aa 51 0e 48 B.......g%...Q.H 00:20:33.988 000001a0 5f 04 19 ce e5 e2 6c 3b 33 b6 aa 30 43 5d 58 37 _.....l;3..0C]X7 00:20:33.988 000001b0 b6 07 dd a8 32 58 4a 57 8a 87 d6 69 9c 9e 2f 0d ....2XJW...i../. 00:20:33.988 000001c0 1a 49 58 72 d2 13 e2 7e 76 f9 24 18 84 76 58 23 .IXr...~v.$..vX# 00:20:33.988 000001d0 e9 94 3c b2 99 c2 d7 27 74 e3 79 67 ab 14 8a e3 ..<....'t.yg.... 00:20:33.988 000001e0 46 70 8f e8 02 ff a6 51 38 b5 08 27 12 6e b0 b7 Fp.....Q8..'.n.. 00:20:33.988 000001f0 9e a6 88 63 4f d2 59 e1 3e 1b 13 2b ae 5e d9 d7 ...cO.Y.>..+.^.. 00:20:33.988 00000200 87 a3 1e 71 8b 5b fd d1 3d 83 3c 88 ac 56 c9 11 ...q.[..=.<..V.. 00:20:33.988 00000210 c5 0a 80 ea 15 2e 91 b1 c3 0c b6 1c 64 d6 53 23 ............d.S# 00:20:33.988 00000220 6a 56 ac b9 34 78 8d d3 9b e5 96 f8 ca 1d 12 8c jV..4x.......... 
00:20:33.988 00000230 b5 21 4e 02 02 7e be d7 b0 f7 8c 0d a9 08 6f 02 .!N..~........o. 00:20:33.988 00000240 2a 05 d3 0f 02 66 0a ac c4 2b e7 47 93 ca fe ca *....f...+.G.... 00:20:33.988 00000250 cf cd da 51 44 32 ba 6a 96 90 38 08 7e 93 7d 2d ...QD2.j..8.~.}- 00:20:33.988 00000260 1e db 01 11 05 d6 16 68 2d 28 40 38 9a 24 6c 67 .......h-(@8.$lg 00:20:33.988 00000270 b4 77 d8 5d a0 04 6a 9c 26 a3 44 bb 6e a1 1a b7 .w.]..j.&.D.n... 00:20:33.988 00000280 0e 1a bd a3 48 e9 ac e8 3b fd 53 e0 ee b5 61 b7 ....H...;.S...a. 00:20:33.988 00000290 a4 22 fe f1 c0 8c 00 4c a8 8e af 80 c5 e4 9c eb .".....L........ 00:20:33.988 000002a0 64 13 31 c7 fb 3a a3 ee 08 80 23 2a b6 93 11 a4 d.1..:....#*.... 00:20:33.988 000002b0 ca 04 f5 73 a6 56 87 4e 92 f5 0b a5 c0 13 7c f0 ...s.V.N......|. 00:20:33.988 000002c0 5f b0 a7 e7 5c c0 d2 f2 38 2a c8 05 53 c9 0d 60 _...\...8*..S..` 00:20:33.988 000002d0 d8 88 ea cc f0 28 72 45 2f fd 78 66 47 d7 ad be .....(rE/.xfG... 00:20:33.988 000002e0 78 0d ac 65 9f 24 a0 98 2d d1 00 22 68 80 22 6e x..e.$..-.."h."n 00:20:33.988 000002f0 ae 63 4d 68 33 61 d6 8d b1 88 da f7 f3 e9 c9 58 .cMh3a.........X 00:20:33.988 00000300 ed fe d9 e9 52 10 fb b6 09 54 62 57 e1 25 cc c2 ....R....TbW.%.. 00:20:33.988 00000310 d4 7c 69 ab 54 19 e1 06 d8 e5 b5 a1 21 f2 c6 d7 .|i.T.......!... 00:20:33.988 00000320 44 f4 28 40 9e 90 5d 98 6c 32 cb 9e aa fb be 88 D.(@..].l2...... 00:20:33.988 00000330 11 14 77 30 b8 a2 79 81 9d 66 83 f6 ec b3 73 d4 ..w0..y..f....s. 00:20:33.988 00000340 69 37 2c 47 ed 83 fa 29 3a 24 0f d9 d1 0f a5 40 i7,G...):$.....@ 00:20:33.988 00000350 2e 16 d5 78 ba 64 4e 3d 34 8e ab 8e ad 8d e4 cc ...x.dN=4....... 00:20:33.988 00000360 fa 45 1e 04 0a 66 ad be a0 9e 55 fc ab 3f ee d9 .E...f....U..?.. 00:20:33.988 00000370 b4 e0 e6 a0 4d b3 f9 c9 78 d1 e0 1a d4 6d 4c 45 ....M...x....mLE 00:20:33.988 00000380 5b b2 3d eb 46 fa 60 55 29 b2 97 68 c0 12 b8 6d [.=.F.`U)..h...m 00:20:33.988 00000390 8f 8e c0 5f e6 3d 96 9e d0 7d f2 76 32 26 c1 be ..._.=...}.v2&.. 00:20:33.988 000003a0 f4 c6 eb 80 f6 52 e3 e6 e8 30 13 af 11 4e 38 b2 .....R...0...N8. 00:20:33.988 000003b0 47 54 8f 09 55 27 29 82 c4 0c 5d 90 4f f9 87 43 GT..U')...].O..C 00:20:33.988 000003c0 98 65 1a ba a5 19 21 54 64 e0 6a 33 cc d8 d5 36 .e....!Td.j3...6 00:20:33.988 000003d0 b5 24 e0 7d 70 0d 6a 24 64 84 47 de cc 52 25 ba .$.}p.j$d.G..R%. 00:20:33.988 000003e0 92 5c 9f 05 28 d0 88 7b e9 19 1a c2 7d c9 96 41 .\..(..{....}..A 00:20:33.988 000003f0 3f 95 cd d1 ab 0e 0e f9 8a 76 0b 0f c5 d2 f4 78 ?........v.....x 00:20:33.988 dh secret: 00:20:33.988 00000000 87 35 50 92 8d a1 2c ad ba 42 63 8c df 17 5d f9 .5P...,..Bc...]. 00:20:33.988 00000010 3f 0e a4 cd c9 44 a2 61 44 5e a5 f8 d5 05 c6 c3 ?....D.aD^...... 00:20:33.988 00000020 aa 87 87 31 3f 20 c5 ba 19 c6 28 17 6f 20 bb 9b ...1? ....(.o .. 00:20:33.988 00000030 21 7e 4d 73 ac 59 bc cc 61 f5 2a 27 bf f7 88 f8 !~Ms.Y..a.*'.... 00:20:33.988 00000040 f1 18 11 54 30 50 d5 30 4f 12 8e f4 00 9f a2 7f ...T0P.0O....... 00:20:33.988 00000050 fe ac 28 a3 fe 49 f2 7c b8 99 42 58 08 ca 37 70 ..(..I.|..BX..7p 00:20:33.988 00000060 22 14 91 16 3b d8 2e c4 77 a8 b7 aa e7 38 11 66 "...;...w....8.f 00:20:33.988 00000070 2b f6 e7 0c 55 83 de 48 47 3a cf 8d 89 86 d9 54 +...U..HG:.....T 00:20:33.988 00000080 30 f6 09 df 69 bb 4e 61 98 ab af 1a 6d 27 5c 0c 0...i.Na....m'\. 00:20:33.988 00000090 b6 14 5e 4d a5 31 a0 2e db 8d d3 fe cb 92 3e 9d ..^M.1........>. 00:20:33.988 000000a0 03 12 f5 67 86 8d 42 64 39 09 42 0a 1c a8 23 f8 ...g..Bd9.B...#. 
00:20:33.988 000000b0 ca 48 c8 05 b1 e4 1e d2 09 aa e9 b3 8e 57 e8 c0 .H...........W.. 00:20:33.988 000000c0 b7 a3 d9 81 95 90 8f 75 00 0c af 45 d7 ee c9 3c .......u...E...< 00:20:33.988 000000d0 aa af 30 c0 86 62 4a 63 74 a6 41 b2 b4 1c a6 6c ..0..bJct.A....l 00:20:33.988 000000e0 73 eb 2e 5f 3b 9d cf 3e 2a 87 b9 9f 57 63 cc 8a s.._;..>*...Wc.. 00:20:33.988 000000f0 ed 01 71 23 a8 ae 2a 9e e7 57 95 cf 6b d4 91 79 ..q#..*..W..k..y 00:20:33.988 00000100 a9 3c 15 81 e8 6b d7 66 b5 8e 4e 84 85 23 11 5a .<...k.f..N..#.Z 00:20:33.988 00000110 0b bc 11 a7 5b b6 56 f4 47 ce 44 61 f2 e1 03 69 ....[.V.G.Da...i 00:20:33.988 00000120 0b 11 02 ac 23 49 77 66 3d dc cd 13 05 84 78 f8 ....#Iwf=.....x. 00:20:33.988 00000130 5f 90 86 d6 bf 3a 9b a3 bc 94 3d dc 1e 9a da 29 _....:....=....) 00:20:33.988 00000140 81 23 82 31 0b 19 8b fa cd ec be 08 c0 81 3f ef .#.1..........?. 00:20:33.988 00000150 71 f3 51 78 f5 6a b6 47 76 66 7c ad 25 fd c2 32 q.Qx.j.Gvf|.%..2 00:20:33.988 00000160 df 9a e6 48 30 67 87 80 ab 5e 24 cf 8f 16 6d 46 ...H0g...^$...mF 00:20:33.988 00000170 5a 76 55 79 65 d7 cb 33 92 f9 2b 8f 04 1e 94 31 ZvUye..3..+....1 00:20:33.988 00000180 25 02 1e e1 4e d7 46 1d d7 64 c0 ba a3 5a 85 1f %...N.F..d...Z.. 00:20:33.988 00000190 49 9b 1d b5 98 f3 80 60 0b 46 83 b0 72 68 b1 a3 I......`.F..rh.. 00:20:33.988 000001a0 74 49 34 cd 70 98 26 5b f2 08 ee 1d f4 d2 26 a7 tI4.p.&[......&. 00:20:33.988 000001b0 fd 41 c7 a4 85 1a ee b2 a7 51 38 71 e6 85 47 fc .A.......Q8q..G. 00:20:33.988 000001c0 19 f6 ff e6 52 78 b8 17 e7 44 4f 83 83 8a 71 fc ....Rx...DO...q. 00:20:33.988 000001d0 da fd c9 79 b7 f9 b6 51 37 0c 88 18 a2 30 7c 21 ...y...Q7....0|! 00:20:33.988 000001e0 3a 09 6e 5d b1 06 db 62 c9 7f ff 29 b6 ca d8 6b :.n]...b...)...k 00:20:33.988 000001f0 e3 22 43 16 b8 60 c1 ec e3 4e 31 22 64 d6 15 ce ."C..`...N1"d... 00:20:33.988 00000200 87 6f 30 ed 19 76 09 1d 00 17 3e 01 c0 b3 fd 02 .o0..v....>..... 00:20:33.988 00000210 63 94 06 e5 2b 5f 74 94 ff 18 bf 14 0d 3d eb 1e c...+_t......=.. 00:20:33.988 00000220 a4 a1 88 bb 53 6b 9a 11 f5 80 cc 61 88 26 7a c3 ....Sk.....a.&z. 00:20:33.988 00000230 87 48 b6 3a e9 dc a6 bb d3 ad 09 30 81 d8 27 1f .H.:.......0..'. 00:20:33.988 00000240 3b bf 21 c4 9e a8 99 43 82 59 2c 31 4b fa 72 0c ;.!....C.Y,1K.r. 00:20:33.988 00000250 ea 09 31 f6 fc e5 fb cc a4 2c 37 fb 84 3e 05 b6 ..1......,7..>.. 00:20:33.988 00000260 e9 c2 cd de 89 4a f1 c4 2e df 24 cd 33 91 de b7 .....J....$.3... 00:20:33.988 00000270 97 36 8a 6a 96 00 49 b2 7d 91 8b a9 9b f7 6d b7 .6.j..I.}.....m. 00:20:33.988 00000280 65 84 9d f4 fc 83 86 2f 8d 9f d7 c5 6b 3c e6 59 e....../....k<.Y 00:20:33.988 00000290 6a ac f1 e4 0e b4 bf 00 e1 89 ae a9 4a a6 ef a3 j...........J... 00:20:33.988 000002a0 75 52 6b 3e ef 76 98 5d 07 c5 86 c6 c6 96 71 a7 uRk>.v.]......q. 00:20:33.988 000002b0 d3 ca 71 4b 25 f5 69 c6 d9 b4 b4 fb ea 16 21 60 ..qK%.i.......!` 00:20:33.988 000002c0 96 d6 5f 80 ec 4c 15 73 2a ec 39 50 3d 54 41 b1 .._..L.s*.9P=TA. 00:20:33.988 000002d0 7b 77 da 07 a3 52 56 2a 72 52 b8 fa e3 9e 47 e4 {w...RV*rR....G. 00:20:33.988 000002e0 19 87 e6 01 dc e6 5a d5 7c cb 10 f8 f0 d8 31 ea ......Z.|.....1. 00:20:33.988 000002f0 dd 8b 9b ca f4 06 b6 da 77 87 27 9c 75 7f f1 0b ........w.'.u... 00:20:33.988 00000300 c5 43 4d da 5d 92 87 0a 5c c8 6d 51 47 79 71 fd .CM.]...\.mQGyq. 00:20:33.988 00000310 11 ba 81 75 e5 e5 fe ce e1 c4 1d f8 54 77 7d e4 ...u........Tw}. 00:20:33.988 00000320 42 59 0d 72 fa 7b b4 d9 03 51 66 80 6b bd 98 fd BY.r.{...Qf.k... 
00:20:33.988 00000330 47 94 d0 b6 d2 39 d3 d9 16 75 40 89 81 d3 08 d4 G....9...u@..... 00:20:33.988 00000340 28 7c 68 9c 34 01 d1 e9 48 25 ad 73 54 d8 d6 d6 (|h.4...H%.sT... 00:20:33.988 00000350 e5 6f 09 d0 5b b8 40 d1 ce 7d d0 b1 aa 85 48 e8 .o..[.@..}....H. 00:20:33.989 00000360 fa e5 cd 73 2a b7 de 4f a7 be 11 33 49 4d d5 88 ...s*..O...3IM.. 00:20:33.989 00000370 11 58 2f 30 51 8a 71 e2 64 f0 6c ff 3c 78 d7 b9 .X/0Q.q.d.l. 00:20:33.989 000000d0 c5 d4 ec ee 32 97 89 21 5e a4 2c d6 87 06 68 f7 ....2..!^.,...h. 00:20:33.989 000000e0 81 a9 95 57 bc 20 80 e7 da 71 16 a3 b2 d6 26 93 ...W. ...q....&. 00:20:33.989 000000f0 29 c2 50 2f ea 38 0b 7e 8a b0 03 27 5c 42 9c d8 ).P/.8.~...'\B.. 00:20:33.989 00000100 14 22 0e 85 2c e4 70 82 5e d8 b0 e4 5c ed e0 f4 ."..,.p.^...\... 00:20:33.989 00000110 87 a4 3a 75 64 b8 40 32 03 b9 87 f2 4b 1f 85 26 ..:ud.@2....K..& 00:20:33.989 00000120 6a 89 82 bc 02 c3 77 c2 a7 98 ad 2c 41 66 62 b5 j.....w....,Afb. 00:20:33.989 00000130 66 50 cd 01 70 41 da 39 9e c9 ca 95 c4 89 8a 52 fP..pA.9.......R 00:20:33.989 00000140 5f 39 37 a8 0f 59 d6 a0 d9 5a f0 e3 6d e9 94 c2 _97..Y...Z..m... 00:20:33.989 00000150 b4 ff 90 62 29 d7 47 f7 23 3a 8f a0 93 a2 00 8d ...b).G.#:...... 00:20:33.989 00000160 a3 b8 90 ae 4f f2 de ed 6c 78 0b 7b 88 89 da 4f ....O...lx.{...O 00:20:33.989 00000170 fe 91 ff 55 d3 b7 0e d4 a9 fc 02 de c0 a9 2c 04 ...U..........,. 00:20:33.989 00000180 0e 53 fc 44 37 4b 44 46 0a 24 5d 0e 3a 96 54 24 .S.D7KDF.$].:.T$ 00:20:33.989 00000190 ef 48 bb 1b 9f a2 f7 1c 50 49 54 f6 40 04 b5 97 .H......PIT.@... 00:20:33.989 000001a0 75 23 79 1f ba 67 12 da d2 d9 d7 ef 7c fb 77 2b u#y..g......|.w+ 00:20:33.989 000001b0 05 3d d1 c3 b5 bd 20 cf 91 53 fe 1f a0 b8 d8 8c .=.... ..S...... 00:20:33.989 000001c0 dd 6d 1a 73 34 8c 4c a8 9d 26 78 1a 37 42 1c b0 .m.s4.L..&x.7B.. 00:20:33.989 000001d0 73 bd a3 5d 4d f6 52 c0 e0 05 e1 51 c3 ea 3c e2 s..]M.R....Q..<. 00:20:33.989 000001e0 67 f9 98 08 18 10 82 0c c8 27 0c 40 84 31 5a 62 g........'.@.1Zb 00:20:33.989 000001f0 84 8e 58 21 9c e5 a4 53 0b d9 9c a6 a1 fa c7 9c ..X!...S........ 00:20:33.989 00000200 25 05 77 8b 0a 8b 69 ab d3 6c c5 fa d7 7c ce 93 %.w...i..l...|.. 00:20:33.989 00000210 4e f2 96 41 79 8f 0b da 32 26 a0 95 93 84 5c ac N..Ay...2&....\. 00:20:33.989 00000220 ff 7e de 68 d7 77 92 39 49 2a ae a2 af 53 fe ac .~.h.w.9I*...S.. 00:20:33.989 00000230 d6 65 c3 e5 d2 37 ed d3 da f0 72 61 67 bc cb 2f .e...7....rag../ 00:20:33.989 00000240 76 25 65 85 c2 55 85 3c f4 7d 0d 53 0e f9 be b9 v%e..U.<.}.S.... 00:20:33.989 00000250 73 9c f7 6d 22 35 4a 31 1e 33 af 29 39 16 f5 52 s..m"5J1.3.)9..R 00:20:33.989 00000260 4e fb 68 86 76 1d 35 91 5f 32 0e e5 5d a4 f0 5b N.h.v.5._2..]..[ 00:20:33.989 00000270 f5 fb ed 04 1b 71 86 30 ac 82 37 bf d5 71 b5 2e .....q.0..7..q.. 00:20:33.989 00000280 46 3b 7f fe 75 4e 40 19 46 18 00 47 3a 3a e1 f3 F;..uN@.F..G::.. 00:20:33.989 00000290 87 6b f3 5c 2c 96 c2 f3 5c 05 f9 6d 2d e3 34 48 .k.\,...\..m-.4H 00:20:33.989 000002a0 34 ad b7 8e ac 7d b9 f5 b1 0a 82 76 c2 18 71 27 4....}.....v..q' 00:20:33.989 000002b0 c0 fc fc e8 3e ec bf 93 48 b9 43 8f 7a b6 d8 9c ....>...H.C.z... 00:20:33.989 000002c0 95 8c ad 6b 66 eb 39 a6 cb 5a f4 36 ad 36 34 ce ...kf.9..Z.6.64. 00:20:33.989 000002d0 1e a8 15 2f 63 8a 07 d2 43 64 92 a7 ad b2 fd ed .../c...Cd...... 00:20:33.989 000002e0 32 ee 1a c9 1c c4 c2 ca df 51 13 5f 21 3b 5e f6 2........Q._!;^. 00:20:33.989 000002f0 5f 6d 2c da c6 d8 8e 61 b1 55 b4 1c c8 9e e3 cf _m,....a.U...... 
00:20:33.989 00000300 ce bd cb 83 cb be 99 37 3e 5a 5b ea fa 65 33 70 .......7>Z[..e3p 00:20:33.989 00000310 1c ab 9c af 4d 06 cb 86 b6 6a 54 93 a7 bf 72 ca ....M....jT...r. 00:20:33.989 00000320 13 19 17 09 83 78 ba 8b 96 79 62 f3 96 2f 15 c9 .....x...yb../.. 00:20:33.989 00000330 67 ab a0 84 b8 cf 13 05 4b 78 59 c6 9b 45 a8 be g.......KxY..E.. 00:20:33.989 00000340 79 8b 0a 28 b7 15 53 19 e6 74 5e 8c 6f 2a 8e ec y..(..S..t^.o*.. 00:20:33.989 00000350 90 30 35 7d cc 17 51 d6 96 28 2a 65 65 c9 ee 60 .05}..Q..(*ee..` 00:20:33.989 00000360 c0 a5 40 11 d1 85 a8 03 0f fd f3 a3 ce 5c 79 79 ..@..........\yy 00:20:33.989 00000370 ff 95 8e 32 3f e2 92 11 64 d1 21 1c ab 92 74 d8 ...2?...d.!...t. 00:20:33.989 00000380 e6 6c 09 46 06 31 87 5f b5 28 86 68 bb 7f ce 07 .l.F.1._.(.h.... 00:20:33.989 00000390 4d b7 2c 12 7e 68 61 a0 28 f8 06 f1 5e 1a 31 9a M.,.~ha.(...^.1. 00:20:33.989 000003a0 2d ad 42 51 76 d7 df 58 83 11 ab 7c fb 90 be b2 -.BQv..X...|.... 00:20:33.989 000003b0 15 5f 7d d7 61 93 a8 9b 34 9d 9a 0d ba 2c bf 23 ._}.a...4....,.# 00:20:33.989 000003c0 3d e1 ff 33 19 84 05 58 fb 14 4f fc fe e7 0e 23 =..3...X..O....# 00:20:33.989 000003d0 db bb 8a 83 89 68 81 ba 11 4d 9e d8 6b 14 de 34 .....h...M..k..4 00:20:33.989 000003e0 54 eb 43 b4 fe 80 55 4f b5 dd 4b dd 1b bf ee b1 T.C...UO..K..... 00:20:33.989 000003f0 d5 bf b6 36 8c 10 a7 cc c4 ce 30 d3 8c e6 61 9d ...6......0...a. 00:20:33.989 host pubkey: 00:20:33.989 00000000 7e 7e f2 5e 1d d1 13 ba b7 ff 33 83 be b2 c5 97 ~~.^......3..... 00:20:33.989 00000010 aa 7d 67 8f b1 2c 56 9e 3d c9 42 90 1a 25 84 d4 .}g..,V.=.B..%.. 00:20:33.989 00000020 fa 90 2b 78 c2 59 d1 ab 25 76 4f c7 ba 0f bd b2 ..+x.Y..%vO..... 00:20:33.989 00000030 f4 3c 4c b8 3d da 9d af 0f 63 f1 e0 84 27 cc 90 ...f.bH. 00:20:33.989 000000e0 f8 1c 91 6b 06 7d 35 1c 49 98 fa 9b 9b ec 98 5b ...k.}5.I......[ 00:20:33.989 000000f0 f2 74 1f c0 29 a6 d6 ed 39 d1 f2 d2 8c 87 14 72 .t..)...9......r 00:20:33.989 00000100 02 09 6e f8 a8 0e f2 c5 3b 74 bf 09 4f fd 7c 38 ..n.....;t..O.|8 00:20:33.989 00000110 38 da 3c 30 c9 70 fb 89 e9 8c 26 e5 d2 db 9f bc 8.<0.p....&..... 00:20:33.989 00000120 c3 b4 bc 9c bb 9e ac fd c9 19 d8 59 04 3f aa 39 ...........Y.?.9 00:20:33.989 00000130 95 a3 eb 63 8e 9b ca cc 3e b8 f1 ad 54 a8 6e 7a ...c....>...T.nz 00:20:33.989 00000140 21 fa 70 a5 21 6c fb f2 3d 0a 69 59 56 36 17 d6 !.p.!l..=.iYV6.. 00:20:33.989 00000150 d8 56 b3 c9 00 8b a1 42 ab 97 29 7b 15 f1 1b 27 .V.....B..){...' 00:20:33.989 00000160 6a f7 5a 5b 8a ce e4 ec ed ef fc 6b 87 72 e4 47 j.Z[.......k.r.G 00:20:33.989 00000170 3e 32 42 03 9e 4a 5d 49 8e c6 f5 2f 53 84 76 81 >2B..J]I.../S.v. 00:20:33.989 00000180 3a 42 c1 db b2 0e 53 4e b2 3a 6c 06 5d d5 39 26 :B....SN.:l.].9& 00:20:33.989 00000190 bb 76 4c ed b3 8d f0 2a 53 bc 3a 62 e2 d3 c7 3b .vL....*S.:b...; 00:20:33.989 000001a0 ae 10 37 b9 a0 2f 0e bc d1 95 35 38 1f c1 8f 6a ..7../....58...j 00:20:33.989 000001b0 f2 71 76 66 d3 e4 21 2f 9e ab 1d 63 5a ac c6 4b .qvf..!/...cZ..K 00:20:33.989 000001c0 0c e9 83 34 7d 72 5f ec 6b 55 6b 44 a1 0f f6 23 ...4}r_.kUkD...# 00:20:33.989 000001d0 f9 75 dd 56 b5 58 3d 9e 6d de 87 cd d7 6f 6f 29 .u.V.X=.m....oo) 00:20:33.989 000001e0 bd db d4 9b 95 25 31 0e ea 6b 20 12 07 40 39 8c .....%1..k ..@9. 00:20:33.989 000001f0 fd 71 8a b4 67 71 79 fa 02 90 35 d1 0e 89 96 59 .q..gqy...5....Y 00:20:33.989 00000200 89 77 c1 4e ab af 34 3e 8b 70 61 96 92 e8 18 9a .w.N..4>.pa..... 00:20:33.989 00000210 b8 33 2d 8a 23 e3 b2 64 40 9b e4 d2 e8 ed e4 fa .3-.#..d@....... 
00:20:33.989 00000220 55 fd 64 03 ab 4d 23 4f d2 92 99 99 fe a5 b4 8c U.d..M#O........ 00:20:33.989 00000230 bf 8c be 43 2c e1 67 52 27 cc a1 a9 ae 36 a9 80 ...C,.gR'....6.. 00:20:33.989 00000240 ef 20 d1 28 d4 5c 96 29 ee ca da 3f a4 b8 72 60 . .(.\.)...?..r` 00:20:33.989 00000250 80 41 e9 ee de 5e 61 3c f6 3e 33 82 61 0f 68 02 .A...^a<.>3.a.h. 00:20:33.989 00000260 f2 50 76 5d 42 37 97 e8 12 34 c7 51 35 c5 cf 43 .Pv]B7...4.Q5..C 00:20:33.989 00000270 a9 84 36 45 f1 7a 0f 4d 43 0b f0 e0 0a ff 7e c0 ..6E.z.MC.....~. 00:20:33.989 00000280 6b 2c f6 66 e8 aa ad 52 f2 dd b6 16 26 8b c4 21 k,.f...R....&..! 00:20:33.989 00000290 4b a4 14 b5 53 ef fe 0a 08 6e 8e 59 62 27 e9 95 K...S....n.Yb'.. 00:20:33.989 000002a0 c3 53 12 31 cf 6b 17 5a 68 18 2e a6 d5 54 7a 91 .S.1.k.Zh....Tz. 00:20:33.989 000002b0 7e 5c 43 20 b4 8d 2b 39 c4 a9 75 20 ec 06 00 b6 ~\C ..+9..u .... 00:20:33.989 000002c0 c5 1b 66 4c 91 58 45 7b 02 30 05 20 a6 e9 62 3a ..fL.XE{.0. ..b: 00:20:33.989 000002d0 cf 47 c1 69 f6 ad ce ac 58 34 de 52 5a b4 a6 9f .G.i....X4.RZ... 00:20:33.989 000002e0 d7 95 3e ca 23 96 67 cd a4 c4 d7 65 73 02 c4 cb ..>.#.g....es... 00:20:33.989 000002f0 11 52 26 80 bd b6 8b 58 b8 fe d2 10 34 63 52 fc .R&....X....4cR. 00:20:33.989 00000300 9e 75 cc d2 23 e5 d8 0a 56 ac f1 47 b1 b5 f4 1e .u..#...V..G.... 00:20:33.989 00000310 77 8a 82 37 63 1b c2 ae f1 34 08 7d 37 41 8a d7 w..7c....4.}7A.. 00:20:33.989 00000320 d1 08 2a d4 45 ef 1a a6 9a b3 1e f3 fc 66 ff 83 ..*.E........f.. 00:20:33.989 00000330 6a 86 25 1b e5 80 a7 2a e3 c3 39 eb 6e 4b a9 ee j.%....*..9.nK.. 00:20:33.989 00000340 ae 1c 42 8b 90 0b b9 42 41 9b e2 e3 97 2c da 3c ..B....BA....,.< 00:20:33.989 00000350 62 d8 71 e9 d3 7c a4 20 fd 5a ff b7 5c 8c 86 b5 b.q..|. .Z..\... 00:20:33.989 00000360 d4 40 d2 f4 54 05 66 6f ec 82 37 29 af a3 9e 24 .@..T.fo..7)...$ 00:20:33.989 00000370 d0 75 d7 92 5e 9f c4 04 0e e5 d6 fa 94 51 8e 60 .u..^........Q.` 00:20:33.989 00000380 52 2d 6a a9 f9 90 4d 0c d3 6c 06 3d b9 50 89 a9 R-j...M..l.=.P.. 00:20:33.989 00000390 79 c7 d7 f1 2d 60 bd ec 2a ce 79 80 31 fe 3e cb y...-`..*.y.1.>. 00:20:33.990 000003a0 0e e2 20 d6 d7 55 51 0d 63 97 86 34 ae eb d4 08 .. ..UQ.c..4.... 00:20:33.990 000003b0 43 86 4e 28 5b 57 a6 da b9 f0 a1 2f 80 71 5b 72 C.N([W...../.q[r 00:20:33.990 000003c0 e3 19 c8 b1 33 0c 18 d8 34 97 d4 24 1a 30 be e6 ....3...4..$.0.. 00:20:33.990 000003d0 f9 b7 63 90 99 78 66 b0 03 eb 41 66 c3 54 1f cd ..c..xf...Af.T.. 00:20:33.990 000003e0 dc f6 50 72 93 7b f2 0a 98 4a 1f 48 98 0b 35 2b ..Pr.{...J.H..5+ 00:20:33.990 000003f0 0f 6d 9c 9d 5a 43 88 11 c9 b4 c2 0e 93 60 4c a0 .m..ZC.......`L. 00:20:33.990 dh secret: 00:20:33.990 00000000 4a 63 fe 5e cd a6 e4 f2 d8 db 98 fd ab 31 95 53 Jc.^.........1.S 00:20:33.990 00000010 e7 57 3d 47 2b 9b 17 8f fc c2 bd 9e 66 39 b1 0d .W=G+.......f9.. 00:20:33.990 00000020 9b e4 d7 9b 80 5a 5f 58 c2 6c 51 99 3c 96 e0 07 .....Z_X.lQ.<... 00:20:33.990 00000030 f1 34 1e 71 e5 de ee 6b 9d b1 4a f9 3e 0a 8a 51 .4.q...k..J.>..Q 00:20:33.990 00000040 ef 43 91 a6 40 e7 4e 97 3f 05 3a d2 0b 01 dd 3b .C..@.N.?.:....; 00:20:33.990 00000050 4d 92 a7 e9 93 fb 68 b2 bc 58 92 c3 89 11 19 79 M.....h..X.....y 00:20:33.990 00000060 63 4c 02 0d 08 b7 b5 10 4d 34 8e 78 2b ee 8b 63 cL......M4.x+..c 00:20:33.990 00000070 7c 3d c8 84 51 fd 4b 0b 6b 7a 31 5a d6 80 3a e5 |=..Q.K.kz1Z..:. 00:20:33.990 00000080 be 6b c1 b7 32 f8 2c bd 1f 42 96 4c 34 cc d4 72 .k..2.,..B.L4..r 00:20:33.990 00000090 76 04 7b 09 8a c6 6d 5d c4 f6 f5 01 71 5e 00 dc v.{...m]....q^.. 
00:20:33.990 000000a0 2c 30 ba 78 e9 a0 8f b2 65 2b ea 71 56 f5 eb aa ,0.x....e+.qV... 00:20:33.990 000000b0 d2 a9 07 94 af 15 cd 5c 6a 35 63 96 9e 38 b0 f5 .......\j5c..8.. 00:20:33.990 000000c0 28 67 20 6b 9c 23 61 e8 83 63 6f 19 21 5b 8a 24 (g k.#a..co.![.$ 00:20:33.990 000000d0 b4 78 f7 c6 5f 31 3c ac ea b9 f5 1e e3 ae 6b 44 .x.._1<.......kD 00:20:33.990 000000e0 d5 51 89 8e 3b d4 aa 69 8f 7f 1b 1a 81 e5 cd e5 .Q..;..i........ 00:20:33.990 000000f0 fe 83 83 3d 87 67 98 bd 41 f8 67 8a 6c e3 ed a9 ...=.g..A.g.l... 00:20:33.990 00000100 28 46 49 15 74 e7 8c 1e 41 ae 2d e3 11 5e 23 c8 (FI.t...A.-..^#. 00:20:33.990 00000110 aa d1 fb 17 20 92 9d c7 47 ff 47 2c 42 1c a8 97 .... ...G.G,B... 00:20:33.990 00000120 49 19 65 a0 21 eb 45 ec b7 f1 8f 0b 37 d4 71 23 I.e.!.E.....7.q# 00:20:33.990 00000130 78 dc a3 2f a5 d9 48 3a 86 4f 18 c8 97 66 49 37 x../..H:.O...fI7 00:20:33.990 00000140 ac 35 80 f6 29 72 d9 19 59 3f 70 58 c0 bc 3b 4f .5..)r..Y?pX..;O 00:20:33.990 00000150 fb 22 72 75 09 7a a5 3a 1c 43 94 13 ec 60 b2 77 ."ru.z.:.C...`.w 00:20:33.990 00000160 43 99 61 54 94 57 87 fa d6 26 11 95 71 7c 2d b5 C.aT.W...&..q|-. 00:20:33.990 00000170 36 e3 c5 33 fb 55 62 d5 86 65 d5 58 ac 7e ac 39 6..3.Ub..e.X.~.9 00:20:33.990 00000180 70 bd 1e 51 db 45 ab 6d e5 7f da b1 5c 21 1c ac p..Q.E.m....\!.. 00:20:33.990 00000190 28 9a 34 8a ad 62 f6 96 98 e1 39 a9 cd 62 96 7b (.4..b....9..b.{ 00:20:33.990 000001a0 a5 fd 23 fb 4a de 9b c7 de 57 2e f7 72 a4 ad 5d ..#.J....W..r..] 00:20:33.990 000001b0 87 94 89 6f e5 d0 ef 32 08 0c 3d 6d 1f 22 6a 43 ...o...2..=m."jC 00:20:33.990 000001c0 8f 97 80 f6 c5 2d 10 c7 8c b7 61 13 0f eb f4 ea .....-....a..... 00:20:33.990 000001d0 51 37 6d f7 9b 89 ef 9b db 5c cf 68 d9 b4 21 71 Q7m......\.h..!q 00:20:33.990 000001e0 51 91 94 5d 13 3f ec 16 51 b7 ab 76 83 19 8b 40 Q..].?..Q..v...@ 00:20:33.990 000001f0 3a f6 aa c9 f3 d6 77 75 b6 74 1d 16 68 80 d8 ea :.....wu.t..h... 00:20:33.990 00000200 3e 43 ba a9 a2 8f 09 38 84 db 54 0a fd a4 d5 bf >C.....8..T..... 00:20:33.990 00000210 80 9d 9e f4 1f 79 7d db 30 d9 c1 23 09 fc ea b1 .....y}.0..#.... 00:20:33.990 00000220 be 75 6b bf a6 6e 1a 5c 46 20 82 61 5e d1 44 80 .uk..n.\F .a^.D. 00:20:33.990 00000230 47 ac 07 49 b8 11 2d 44 61 5d bd c5 8a 39 c5 b7 G..I..-Da]...9.. 00:20:33.990 00000240 28 8f 63 03 77 15 df 30 47 2a 90 d6 a5 38 f2 26 (.c.w..0G*...8.& 00:20:33.990 00000250 8b 2c 16 dd 56 6d c4 7d 32 af f5 8d 15 cc 31 55 .,..Vm.}2.....1U 00:20:33.990 00000260 e3 bf 3c 4b cb 0b fa ac 4a 28 8e 04 18 75 79 93 .. 00:20:33.990 000000d0 c5 d4 ec ee 32 97 89 21 5e a4 2c d6 87 06 68 f7 ....2..!^.,...h. 00:20:33.990 000000e0 81 a9 95 57 bc 20 80 e7 da 71 16 a3 b2 d6 26 93 ...W. ...q....&. 00:20:33.990 000000f0 29 c2 50 2f ea 38 0b 7e 8a b0 03 27 5c 42 9c d8 ).P/.8.~...'\B.. 00:20:33.990 00000100 14 22 0e 85 2c e4 70 82 5e d8 b0 e4 5c ed e0 f4 ."..,.p.^...\... 00:20:33.990 00000110 87 a4 3a 75 64 b8 40 32 03 b9 87 f2 4b 1f 85 26 ..:ud.@2....K..& 00:20:33.990 00000120 6a 89 82 bc 02 c3 77 c2 a7 98 ad 2c 41 66 62 b5 j.....w....,Afb. 00:20:33.990 00000130 66 50 cd 01 70 41 da 39 9e c9 ca 95 c4 89 8a 52 fP..pA.9.......R 00:20:33.990 00000140 5f 39 37 a8 0f 59 d6 a0 d9 5a f0 e3 6d e9 94 c2 _97..Y...Z..m... 00:20:33.990 00000150 b4 ff 90 62 29 d7 47 f7 23 3a 8f a0 93 a2 00 8d ...b).G.#:...... 00:20:33.990 00000160 a3 b8 90 ae 4f f2 de ed 6c 78 0b 7b 88 89 da 4f ....O...lx.{...O 00:20:33.990 00000170 fe 91 ff 55 d3 b7 0e d4 a9 fc 02 de c0 a9 2c 04 ...U..........,. 
00:20:33.990 00000180 0e 53 fc 44 37 4b 44 46 0a 24 5d 0e 3a 96 54 24 .S.D7KDF.$].:.T$ 00:20:33.990 00000190 ef 48 bb 1b 9f a2 f7 1c 50 49 54 f6 40 04 b5 97 .H......PIT.@... 00:20:33.990 000001a0 75 23 79 1f ba 67 12 da d2 d9 d7 ef 7c fb 77 2b u#y..g......|.w+ 00:20:33.990 000001b0 05 3d d1 c3 b5 bd 20 cf 91 53 fe 1f a0 b8 d8 8c .=.... ..S...... 00:20:33.990 000001c0 dd 6d 1a 73 34 8c 4c a8 9d 26 78 1a 37 42 1c b0 .m.s4.L..&x.7B.. 00:20:33.990 000001d0 73 bd a3 5d 4d f6 52 c0 e0 05 e1 51 c3 ea 3c e2 s..]M.R....Q..<. 00:20:33.990 000001e0 67 f9 98 08 18 10 82 0c c8 27 0c 40 84 31 5a 62 g........'.@.1Zb 00:20:33.990 000001f0 84 8e 58 21 9c e5 a4 53 0b d9 9c a6 a1 fa c7 9c ..X!...S........ 00:20:33.990 00000200 25 05 77 8b 0a 8b 69 ab d3 6c c5 fa d7 7c ce 93 %.w...i..l...|.. 00:20:33.990 00000210 4e f2 96 41 79 8f 0b da 32 26 a0 95 93 84 5c ac N..Ay...2&....\. 00:20:33.990 00000220 ff 7e de 68 d7 77 92 39 49 2a ae a2 af 53 fe ac .~.h.w.9I*...S.. 00:20:33.990 00000230 d6 65 c3 e5 d2 37 ed d3 da f0 72 61 67 bc cb 2f .e...7....rag../ 00:20:33.990 00000240 76 25 65 85 c2 55 85 3c f4 7d 0d 53 0e f9 be b9 v%e..U.<.}.S.... 00:20:33.990 00000250 73 9c f7 6d 22 35 4a 31 1e 33 af 29 39 16 f5 52 s..m"5J1.3.)9..R 00:20:33.990 00000260 4e fb 68 86 76 1d 35 91 5f 32 0e e5 5d a4 f0 5b N.h.v.5._2..]..[ 00:20:33.990 00000270 f5 fb ed 04 1b 71 86 30 ac 82 37 bf d5 71 b5 2e .....q.0..7..q.. 00:20:33.990 00000280 46 3b 7f fe 75 4e 40 19 46 18 00 47 3a 3a e1 f3 F;..uN@.F..G::.. 00:20:33.990 00000290 87 6b f3 5c 2c 96 c2 f3 5c 05 f9 6d 2d e3 34 48 .k.\,...\..m-.4H 00:20:33.990 000002a0 34 ad b7 8e ac 7d b9 f5 b1 0a 82 76 c2 18 71 27 4....}.....v..q' 00:20:33.990 000002b0 c0 fc fc e8 3e ec bf 93 48 b9 43 8f 7a b6 d8 9c ....>...H.C.z... 00:20:33.990 000002c0 95 8c ad 6b 66 eb 39 a6 cb 5a f4 36 ad 36 34 ce ...kf.9..Z.6.64. 00:20:33.990 000002d0 1e a8 15 2f 63 8a 07 d2 43 64 92 a7 ad b2 fd ed .../c...Cd...... 00:20:33.990 000002e0 32 ee 1a c9 1c c4 c2 ca df 51 13 5f 21 3b 5e f6 2........Q._!;^. 00:20:33.990 000002f0 5f 6d 2c da c6 d8 8e 61 b1 55 b4 1c c8 9e e3 cf _m,....a.U...... 00:20:33.990 00000300 ce bd cb 83 cb be 99 37 3e 5a 5b ea fa 65 33 70 .......7>Z[..e3p 00:20:33.990 00000310 1c ab 9c af 4d 06 cb 86 b6 6a 54 93 a7 bf 72 ca ....M....jT...r. 00:20:33.990 00000320 13 19 17 09 83 78 ba 8b 96 79 62 f3 96 2f 15 c9 .....x...yb../.. 00:20:33.990 00000330 67 ab a0 84 b8 cf 13 05 4b 78 59 c6 9b 45 a8 be g.......KxY..E.. 00:20:33.990 00000340 79 8b 0a 28 b7 15 53 19 e6 74 5e 8c 6f 2a 8e ec y..(..S..t^.o*.. 00:20:33.990 00000350 90 30 35 7d cc 17 51 d6 96 28 2a 65 65 c9 ee 60 .05}..Q..(*ee..` 00:20:33.990 00000360 c0 a5 40 11 d1 85 a8 03 0f fd f3 a3 ce 5c 79 79 ..@..........\yy 00:20:33.990 00000370 ff 95 8e 32 3f e2 92 11 64 d1 21 1c ab 92 74 d8 ...2?...d.!...t. 00:20:33.990 00000380 e6 6c 09 46 06 31 87 5f b5 28 86 68 bb 7f ce 07 .l.F.1._.(.h.... 00:20:33.990 00000390 4d b7 2c 12 7e 68 61 a0 28 f8 06 f1 5e 1a 31 9a M.,.~ha.(...^.1. 00:20:33.990 000003a0 2d ad 42 51 76 d7 df 58 83 11 ab 7c fb 90 be b2 -.BQv..X...|.... 00:20:33.990 000003b0 15 5f 7d d7 61 93 a8 9b 34 9d 9a 0d ba 2c bf 23 ._}.a...4....,.# 00:20:33.990 000003c0 3d e1 ff 33 19 84 05 58 fb 14 4f fc fe e7 0e 23 =..3...X..O....# 00:20:33.990 000003d0 db bb 8a 83 89 68 81 ba 11 4d 9e d8 6b 14 de 34 .....h...M..k..4 00:20:33.990 000003e0 54 eb 43 b4 fe 80 55 4f b5 dd 4b dd 1b bf ee b1 T.C...UO..K..... 00:20:33.991 000003f0 d5 bf b6 36 8c 10 a7 cc c4 ce 30 d3 8c e6 61 9d ...6......0...a. 
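A quick size check, included only as an illustration and not part of the captured output: with dhgroup 5 (ffdhe8192) each public value and the derived secret is 8192 bits wide, which is why every "pubkey" and "dh secret" dump in this log runs from offset 00000000 to 000003f0 in 16-byte rows.

    # Arithmetic behind the dump sizes seen for dhgroup 5 (ffdhe8192).
    modulus_bits = 8192
    dump_bytes = modulus_bits // 8        # 1024 bytes per dumped value
    rows = dump_bytes // 16               # 64 hexdump rows of 16 bytes
    last_offset = (rows - 1) * 16         # final row offset
    print(dump_bytes, rows, hex(last_offset))   # -> 1024 64 0x3f0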
00:20:33.991 host pubkey: 00:20:33.991 00000000 95 7d f7 f5 c0 ee 4c 6a 19 4c d1 3d 56 fc 16 4b .}....Lj.L.=V..K 00:20:33.991 00000010 d4 4b 1f 8f 25 b6 69 e5 b8 02 06 d7 06 47 8e d3 .K..%.i......G.. 00:20:33.991 00000020 2e 49 88 e4 60 ad 92 49 d8 2b 18 8c 85 df 42 54 .I..`..I.+....BT 00:20:33.991 00000030 9a 7b 86 39 fa 4d 95 f5 49 7c 74 69 3a 22 aa 57 .{.9.M..I|ti:".W 00:20:33.991 00000040 e0 5b 07 42 72 28 74 1a 28 b8 76 23 bf 47 30 c0 .[.Br(t.(.v#.G0. 00:20:33.991 00000050 c1 a7 4e f8 2d a2 4b b0 74 07 6d 7b 7a 4e bf 38 ..N.-.K.t.m{zN.8 00:20:33.991 00000060 82 fc cc 6e 53 91 0a db a2 eb 38 82 f2 cb 40 aa ...nS.....8...@. 00:20:33.991 00000070 12 35 10 73 8b 35 ee b7 99 f3 9d 5b 9e 8b 41 74 .5.s.5.....[..At 00:20:33.991 00000080 18 20 29 0c 5e f7 f0 d9 99 58 ce d4 cb ed 74 2d . ).^....X....t- 00:20:33.991 00000090 29 07 f9 b1 05 26 7a 38 e3 c9 1e a8 ab 2c a1 3a )....&z8.....,.: 00:20:33.991 000000a0 9a eb 73 cd 9b 9d 32 45 5e fd 3d 6e 33 09 db 7c ..s...2E^.=n3..| 00:20:33.991 000000b0 d7 36 a4 d3 d7 13 9d 66 ab 0b df 6e 00 18 b0 4f .6.....f...n...O 00:20:33.991 000000c0 30 8c c5 2f ae 07 66 2a 75 39 84 65 ec 54 7f af 0../..f*u9.e.T.. 00:20:33.991 000000d0 d4 40 f1 81 c4 d3 82 ec be 54 6f c5 24 95 e3 91 .@.......To.$... 00:20:33.991 000000e0 20 73 05 f7 c9 63 ef c6 60 20 e9 66 1e 39 32 7e s...c..` .f.92~ 00:20:33.991 000000f0 fd 7f cd f9 82 41 57 f0 e4 1b 5b 46 84 c5 b6 e4 .....AW...[F.... 00:20:33.991 00000100 0b c4 e7 63 6b d6 e7 55 fa 07 7f 20 dd f1 e3 4e ...ck..U... ...N 00:20:33.991 00000110 63 f7 02 d7 57 d8 1f d4 21 db 56 f6 5d 83 5e 22 c...W...!.V.].^" 00:20:33.991 00000120 85 0c 62 8f ac 2f ca 5b 0c 89 a2 28 0d 46 a1 61 ..b../.[...(.F.a 00:20:33.991 00000130 e4 20 98 3f fc c3 b0 26 ff ce 95 6d 83 8d 37 06 . .?...&...m..7. 00:20:33.991 00000140 ba f9 3c cb 66 54 77 1b 1a 40 20 0e c8 d4 20 3b ..<.fTw..@ ... ; 00:20:33.991 00000150 aa f8 06 19 32 6f 0a 82 7c 8c 01 18 18 b9 0d 11 ....2o..|....... 00:20:33.991 00000160 6b 9f 5f 05 90 20 17 4e 65 5d b8 32 37 93 ba 05 k._.. .Ne].27... 00:20:33.991 00000170 49 b7 03 8f 1e cf 6a e1 12 a2 c1 36 8f e4 1a 7b I.....j....6...{ 00:20:33.991 00000180 2c b8 8e 7f 3e 70 61 fa 3e fa c3 dc 7b 1a fb 29 ,...>pa.>...{..) 00:20:33.991 00000190 89 ae 3e d7 51 9c 51 94 35 ed e5 66 e2 f8 7f 7c ..>.Q.Q.5..f...| 00:20:33.991 000001a0 84 c1 77 ca cd 53 03 6a 7e 07 ec 97 d2 bb 1a 03 ..w..S.j~....... 00:20:33.991 000001b0 47 a3 70 92 35 82 b0 e9 32 3b 47 13 6e a4 fe e9 G.p.5...2;G.n... 00:20:33.991 000001c0 4e 0f 1c f7 0b 80 b8 f7 30 f4 5d 02 0e d9 b1 ee N.......0.]..... 00:20:33.991 000001d0 10 0c c1 7b aa e4 7f db 3c 8b 08 39 9f 60 97 46 ...{....<..9.`.F 00:20:33.991 000001e0 a6 6d 52 ab e3 2f aa 89 9e ee 03 e6 cb df fe 66 .mR../.........f 00:20:33.991 000001f0 6e 14 8e e4 9c f1 18 3a 23 da 5f 64 fe fd aa 34 n......:#._d...4 00:20:33.991 00000200 7b 4c b0 0a f2 9d a0 c1 89 01 73 41 da 33 68 41 {L........sA.3hA 00:20:33.991 00000210 61 b2 3c 30 93 38 a6 78 b2 84 d9 9b b9 1d 35 cc a.<0.8.x......5. 00:20:33.991 00000220 91 a6 47 e6 be fa bb f0 e3 e8 48 9c 7f 14 3e df ..G.......H...>. 00:20:33.991 00000230 4d 5d 56 7d a9 3a 72 61 d9 20 23 6b b0 3b eb 41 M]V}.:ra. #k.;.A 00:20:33.991 00000240 4c 3f 80 ca c2 2c 55 b1 73 e6 c3 6a 48 f6 b5 9c L?...,U.s..jH... 00:20:33.991 00000250 89 27 88 a4 ce de 9c a2 37 d3 f0 d7 d3 40 ca f9 .'......7....@.. 00:20:33.991 00000260 6d f9 56 cc 37 f6 2f 68 a1 ac 83 5a cd c5 9f 12 m.V.7./h...Z.... 
00:20:33.991 00000270 e7 88 b3 bf ab 63 da 3a 77 39 05 b0 9b 6f 47 47 .....c.:w9...oGG 00:20:33.991 00000280 0f f0 d1 d2 fd 8a 69 81 ea 68 64 ac 1f ed 0a 1f ......i..hd..... 00:20:33.991 00000290 18 0a e8 c5 e3 de b8 55 6d 61 a1 03 c6 a5 1b 1f .......Uma...... 00:20:33.991 000002a0 a2 f9 26 ab a1 1d db 0c e2 33 fd 37 a3 5e 99 56 ..&......3.7.^.V 00:20:33.991 000002b0 fa 3c 64 ce 92 85 80 75 c2 6f 47 61 49 3c c9 9b . 00:20:33.991 00000030 9d 18 e3 8b f6 d9 f7 0d 18 c6 fd 42 6a 57 91 73 ...........BjW.s 00:20:33.991 00000040 ff df 9f 5b 46 5f 04 27 d8 03 ff 6b b5 91 dc a8 ...[F_.'...k.... 00:20:33.991 00000050 2d 62 24 a6 53 64 ad 3b f0 f5 27 9e b6 90 01 08 -b$.Sd.;..'..... 00:20:33.991 00000060 9b 6a f9 49 da 8d b9 34 c1 5f 1e 9d 79 b7 f7 09 .j.I...4._..y... 00:20:33.991 00000070 bc 90 bf bf e4 30 30 14 e5 59 8d 9a d6 4c 91 e3 .....00..Y...L.. 00:20:33.991 00000080 98 1b 1e cc 4c e7 23 f4 a6 85 10 19 b4 3e bc 33 ....L.#......>.3 00:20:33.991 00000090 7c dc be 9b a8 9d 76 b8 4e 8e 7e bb 2b 35 70 99 |.....v.N.~.+5p. 00:20:33.991 000000a0 b9 4f 6a 8a 01 5a a4 6e 19 9e 5e bb 03 11 25 fd .Oj..Z.n..^...%. 00:20:33.991 000000b0 6b dd 3d d6 65 6c cf 76 c7 0a a8 0e 7c 37 23 bf k.=.el.v....|7#. 00:20:33.991 000000c0 d1 b1 5d 9a 9b e0 29 27 66 2d a0 ed d2 32 89 2f ..]...)'f-...2./ 00:20:33.991 000000d0 a0 b6 83 3d ee 0a 82 c3 26 9d d2 f3 de 14 e0 67 ...=....&......g 00:20:33.991 000000e0 7b 66 fb d5 7a de e1 b6 74 31 5a a0 d8 84 a4 6e {f..z...t1Z....n 00:20:33.991 000000f0 68 1f a8 6c 41 06 be 42 79 62 96 e5 80 e6 40 a5 h..lA..Byb....@. 00:20:33.991 00000100 62 e9 58 63 32 7d 13 7a 31 51 17 84 7c 96 9d 36 b.Xc2}.z1Q..|..6 00:20:33.991 00000110 b6 02 29 ed 05 fa 56 30 03 d0 9d bd 3b cb 1f 5b ..)...V0....;..[ 00:20:33.991 00000120 50 3a b1 81 5f 95 7f d0 fd 6d 36 95 90 0f 3f f8 P:.._....m6...?. 00:20:33.991 00000130 7b ba 93 5c b3 1f fc 6b 03 75 ff 87 f4 db 44 63 {..\...k.u....Dc 00:20:33.991 00000140 db 6f f0 3c e0 73 90 20 0c 22 d6 2d 7c 4a b5 10 .o.<.s. .".-|J.. 00:20:33.991 00000150 d5 ca b2 a7 b2 0f 96 64 ef f0 d4 7b 78 dd eb ad .......d...{x... 00:20:33.991 00000160 4f e0 ef bc 16 a8 ce a4 29 69 43 71 d9 07 17 e7 O.......)iCq.... 00:20:33.991 00000170 f3 de 1b 3a 81 26 f0 6a d5 19 69 73 8c cc b2 e7 ...:.&.j..is.... 00:20:33.991 00000180 47 5f be c0 67 04 43 5e 7b 4b 27 a5 f4 af 0b 5a G_..g.C^{K'....Z 00:20:33.991 00000190 5c 76 96 17 e5 ca 88 56 ce b8 89 ca 36 bb 4f 96 \v.....V....6.O. 00:20:33.991 000001a0 47 db ce ca 69 2d f2 c6 b0 8f 33 55 96 2c b5 38 G...i-....3U.,.8 00:20:33.991 000001b0 0e 24 f5 96 9e 2f 6c 3b 9f a7 32 8d d1 a5 f1 b6 .$.../l;..2..... 00:20:33.991 000001c0 f1 09 c6 03 a9 aa 26 f4 77 0a b2 f1 39 c6 c1 2b ......&.w...9..+ 00:20:33.991 000001d0 2f 5f 5d 85 6d cc c5 a9 0d f1 90 1c 44 49 59 01 /_].m.......DIY. 00:20:33.991 000001e0 66 02 e3 ab 0f 47 1e fc ba 83 d5 4b e3 e7 37 09 f....G.....K..7. 00:20:33.991 000001f0 6f 06 a2 e4 ad 12 7f a4 0a 33 4e 15 58 ad a6 92 o........3N.X... 00:20:33.991 00000200 78 86 1d 54 ea 47 0d 04 17 10 00 2d 36 a9 ca bb x..T.G.....-6... 00:20:33.991 00000210 ea 8f 81 d9 df fc b6 d1 5e 2a fc f6 86 ea 72 d3 ........^*....r. 00:20:33.991 00000220 06 8a 11 63 d6 d8 fe c1 0b ed 70 9a 3e d4 ef 6e ...c......p.>..n 00:20:33.991 00000230 a7 59 aa 67 63 14 c3 92 ec 44 75 d9 5d 4a c2 19 .Y.gc....Du.]J.. 00:20:33.991 00000240 1d c4 e5 48 56 a0 87 57 ad 8c d4 3f f2 20 3c c3 ...HV..W...?. <. 00:20:33.991 00000250 7a 36 e7 99 52 79 2c b5 40 05 9f 09 95 ec 6c 6b z6..Ry,.@.....lk 00:20:33.991 00000260 99 31 be 01 ef 30 3d c7 94 50 2b 04 b4 2c 35 f5 .1...0=..P+..,5. 
00:20:33.991 00000270 90 64 18 3b 11 58 eb 36 df d9 90 b1 da 63 43 17 .d.;.X.6.....cC. 00:20:33.991 00000280 a1 00 03 35 af fe 08 72 b0 e3 1c 9c ee 05 5d cd ...5...r......]. 00:20:33.991 00000290 b3 da 7a 14 e4 f8 3e 58 54 8b 9c 14 8d 3b 68 5b ..z...>XT....;h[ 00:20:33.991 000002a0 2d 68 fd 42 cd cf f1 21 61 b7 a7 08 51 bc 50 68 -h.B...!a...Q.Ph 00:20:33.991 000002b0 9d 23 7d 19 15 3e 72 05 7b 88 01 c9 91 8e c5 bc .#}..>r.{....... 00:20:33.991 000002c0 a3 6d 9f 90 c3 da 98 e3 5e 6e 72 10 6e b1 16 09 .m......^nr.n... 00:20:33.991 000002d0 14 be 65 a0 b2 a2 3a 26 9d 01 e4 02 e9 2e bd 98 ..e...:&........ 00:20:33.991 000002e0 7e ff ae 49 2e 29 c0 63 2a 95 ad 74 76 14 e6 dc ~..I.).c*..tv... 00:20:33.991 000002f0 7d fd 93 4f 45 ae f2 86 71 55 e4 df b0 99 10 08 }..OE...qU...... 00:20:33.991 00000300 55 f0 23 15 dd 6b b2 59 e4 5e e7 13 dc 53 c2 90 U.#..k.Y.^...S.. 00:20:33.991 00000310 d5 45 d7 81 2f a3 a4 ba 2a be 6f 51 bf 60 3a 11 .E../...*.oQ.`:. 00:20:33.991 00000320 df aa 5b aa 60 e5 48 ac 60 08 a0 0b a4 dc 89 10 ..[.`.H.`....... 00:20:33.991 00000330 1b 95 57 0e 0d d6 5f 79 4f 3c 40 6d 42 0e c0 4e ..W..._yO<@mB..N 00:20:33.991 00000340 ff 5f a8 29 95 6a a5 f7 fc 89 ae b3 ca 68 59 0d ._.).j.......hY. 00:20:33.991 00000350 a4 24 6d dd d3 60 d1 1c 9f f2 38 52 be 46 7a c7 .$m..`....8R.Fz. 00:20:33.991 00000360 04 4f 58 39 a8 6c 10 b9 95 de 42 23 df 7c 65 3b .OX9.l....B#.|e; 00:20:33.991 00000370 c6 be 6c c9 8d 19 b9 a3 40 32 09 a6 7a a7 5d 74 ..l.....@2..z.]t 00:20:33.991 00000380 8d 79 6b d7 6b cd d5 2f e5 f2 a7 51 89 68 11 77 .yk.k../...Q.h.w 00:20:33.991 00000390 35 f7 6e 75 cc 42 29 6e c6 b3 a2 93 b3 76 75 21 5.nu.B)n.....vu! 00:20:33.991 000003a0 0e a5 35 f0 3d 21 5a d5 42 ee dd f2 19 f7 a2 64 ..5.=!Z.B......d 00:20:33.991 000003b0 b1 1a 7a 5c b0 cb d8 54 64 af ce 02 d3 24 e1 bb ..z\...Td....$.. 00:20:33.991 000003c0 89 8a 32 0c 2d 04 f9 3e bc 41 ce 9e 6f 49 b1 db ..2.-..>.A..oI.. 00:20:33.991 000003d0 fa 0c f4 eb fd 38 17 75 bf b6 fd e4 dd a6 c1 16 .....8.u........ 00:20:33.991 000003e0 2d 6f 64 0a 21 94 57 e3 76 09 96 a8 3a 50 d6 13 -od.!.W.v...:P.. 00:20:33.991 000003f0 17 7c 67 ca 38 06 07 78 a1 9b c7 ea 4c ea 70 b6 .|g.8..x....L.p. 
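The entries below report the host's DH-HMAC-CHAP reply for this qpair (key=key1, hash=3 i.e. sha512, dhgroup=5). Purely as a conceptual sketch, and not the exact transform defined by the NVMe specification or implemented in nvme_auth.c, the reply is a keyed SHA-512 value computed over the controller's challenge with the DH secret mixed into the key material; every input in the snippet is a made-up placeholder.

    # Conceptual DH-HMAC-CHAP-style response: HMAC-SHA-512 over the challenge,
    # keyed with material that mixes in the DH shared secret. The real key
    # derivation and message layout follow the NVMe Base Specification and are
    # not reproduced here; all inputs below are placeholders.
    import hashlib
    import hmac

    chap_key = b"hypothetical-configured-dhchap-key"  # stand-in for the key1 secret
    dh_secret = bytes(1024)                           # stand-in for the 1024-byte "dh secret"
    challenge = bytes(64)                             # stand-in for the controller challenge

    session_key = hashlib.sha512(chap_key + dh_secret).digest()
    response = hmac.new(session_key, challenge, hashlib.sha512).digest()
    print(len(response))   # 64 bytes, consistent with hash=3 (sha512) and the len=64 reported below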
00:20:33.991 [2024-09-27 15:25:32.970701] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key1, hash=3, dhgroup=5, seq=3428451838, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.991 [2024-09-27 15:25:32.970804] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.991 [2024-09-27 15:25:33.053120] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.991 [2024-09-27 15:25:33.053165] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.991 [2024-09-27 15:25:33.053176] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.991 [2024-09-27 15:25:33.053202] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.991 [2024-09-27 15:25:33.249116] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.991 [2024-09-27 15:25:33.249137] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.991 [2024-09-27 15:25:33.249144] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.991 [2024-09-27 15:25:33.249187] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.991 [2024-09-27 15:25:33.249210] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.991 ctrlr pubkey: 00:20:33.991 00000000 46 d3 88 1b 1f c1 3e 38 f9 6c f7 20 99 4e a3 b8 F.....>8.l. .N.. 00:20:33.991 00000010 18 95 06 f1 9f aa 86 b5 d3 1d ce 1b 1a 77 30 90 .............w0. 00:20:33.991 00000020 d0 38 aa 84 61 ce 21 36 91 de 03 57 af b3 ca 1a .8..a.!6...W.... 00:20:33.992 00000030 e0 51 af 94 4c f5 09 57 cb 49 c1 b3 34 35 e7 11 .Q..L..W.I..45.. 00:20:33.992 00000040 04 63 c5 fb 63 61 23 db 54 a4 2d f7 b9 05 8f f2 .c..ca#.T.-..... 00:20:33.992 00000050 d0 73 76 e5 84 80 77 2c 19 ac dd 62 02 a9 ca 61 .sv...w,...b...a 00:20:33.992 00000060 aa 12 46 f9 58 5b 3e 59 f0 81 a3 5a 3a e6 f7 59 ..F.X[>Y...Z:..Y 00:20:33.992 00000070 c6 48 72 dd 6a 73 23 c4 26 08 ff dc dd a4 64 92 .Hr.js#.&.....d. 00:20:33.992 00000080 4e 31 50 de 37 d8 5b 3c b3 c6 3e 86 1c dc 6a dd N1P.7.[<..>...j. 00:20:33.992 00000090 1d 2a e2 72 c4 ed c8 be b9 23 fc 40 f0 c2 0c 0a .*.r.....#.@.... 00:20:33.992 000000a0 83 9c 05 09 4b 36 3d e5 19 ad 02 ee df da 6f b9 ....K6=.......o. 00:20:33.992 000000b0 52 a4 33 87 8b 42 1f 6d cc 74 04 d2 91 6d 72 ce R.3..B.m.t...mr. 00:20:33.992 000000c0 5f 23 f3 68 fe 61 a3 49 27 c9 ba 4c ff 4f 66 e5 _#.h.a.I'..L.Of. 00:20:33.992 000000d0 b5 e3 3d 88 3d 6f d1 a2 7c 02 d0 7d ca 6f 78 12 ..=.=o..|..}.ox. 00:20:33.992 000000e0 d4 22 40 44 54 ab 6d 03 0b 02 6e 10 38 6d 03 7c ."@DT.m...n.8m.| 00:20:33.992 000000f0 d4 97 2b 6d bc d4 13 0b 92 ee 15 49 2e b2 9c 8f ..+m.......I.... 00:20:33.992 00000100 19 98 68 75 38 6f fa 4d d5 1c 80 1b 29 4a d1 f7 ..hu8o.M....)J.. 
00:20:33.992 00000110 ad 75 81 d1 e1 a9 2d 73 1d 1b 6a bc 73 e6 f4 b0 .u....-s..j.s... 00:20:33.992 00000120 15 12 a4 7d a2 ac 22 7a ba f6 34 f0 be 7c 9e 57 ...}.."z..4..|.W 00:20:33.992 00000130 62 3a 74 39 cb 08 52 9c 01 d4 5b 14 7b 15 df a4 b:t9..R...[.{... 00:20:33.992 00000140 09 61 7f e2 dc 89 2a 46 9b 04 0d a5 ca 87 62 02 .a....*F......b. 00:20:33.992 00000150 1a 3a 4d 5a 6f 5f 1e 96 b9 29 6a 88 28 0a bc 5f .:MZo_...)j.(.._ 00:20:33.992 00000160 f7 fb 69 5c 0a 75 70 79 0d 1b 39 87 30 86 db ef ..i\.upy..9.0... 00:20:33.992 00000170 36 ce 2d 59 55 6b de a6 b3 c3 7d 88 29 f7 1c 0a 6.-YUk....}.)... 00:20:33.992 00000180 c9 13 a0 ca 71 7b 36 6f 01 1c 30 87 7b 86 36 ea ....q{6o..0.{.6. 00:20:33.992 00000190 a9 ed c2 dc ac ae b5 47 b4 f6 ca 85 3e 15 8d ca .......G....>... 00:20:33.992 000001a0 79 b7 ca 9d 91 8f 44 51 a4 17 df 20 ed c2 13 22 y.....DQ... ..." 00:20:33.992 000001b0 4d f0 f6 b8 1b ca 44 50 00 88 34 3e 8e d1 1c 6c M.....DP..4>...l 00:20:33.992 000001c0 dd 03 90 0a dd 2f 90 90 32 5d 36 93 d0 e5 c9 9d ...../..2]6..... 00:20:33.992 000001d0 4d 44 7a 1a 8d e2 cb 98 7c 43 8b 63 a8 a6 45 85 MDz.....|C.c..E. 00:20:33.992 000001e0 a7 77 11 40 c4 17 a6 fd ce 25 ee 46 6b fe f6 72 .w.@.....%.Fk..r 00:20:33.992 000001f0 a0 76 4a 11 2b 87 60 e0 a9 42 df 69 51 dc b8 69 .vJ.+.`..B.iQ..i 00:20:33.992 00000200 31 f8 4a f8 ee 66 d7 ae 89 51 16 48 d2 37 6d 08 1.J..f...Q.H.7m. 00:20:33.992 00000210 99 9c 27 30 70 78 24 17 96 ea ca 3e 4c 6d 2c ec ..'0px$....>Lm,. 00:20:33.992 00000220 b7 9e 39 57 5d 1f 8d bf 52 dd bd 75 34 f0 54 a6 ..9W]...R..u4.T. 00:20:33.992 00000230 5b a1 af d5 3b 8a 4d e5 b6 cb 7f f0 04 50 76 5b [...;.M......Pv[ 00:20:33.992 00000240 b6 06 94 45 40 b6 0e f6 11 a4 b4 ad 15 83 0c 8a ...E@........... 00:20:33.992 00000250 bb be cd bc 51 73 db 79 0d 8a e5 9b 5f 1b fb cc ....Qs.y...._... 00:20:33.992 00000260 41 7e 3a 6a e2 92 c3 58 e6 6d 2c 94 1d 5e da 25 A~:j...X.m,..^.% 00:20:33.992 00000270 8c 77 80 22 7d 09 e5 24 b4 64 29 52 9b bf 2c 0d .w."}..$.d)R..,. 00:20:33.992 00000280 0c 56 82 fb bc 98 32 8c 77 f9 3b ba 81 42 3a 57 .V....2.w.;..B:W 00:20:33.992 00000290 af 14 6d 43 ce 1f 98 48 ea 2e aa 4c c8 00 15 c4 ..mC...H...L.... 00:20:33.992 000002a0 7f b1 a8 89 f6 a2 51 48 07 e6 5d a7 73 68 59 81 ......QH..].shY. 00:20:33.992 000002b0 08 87 00 8c fc 2e 31 77 ea 21 50 ca 00 8d 8c 81 ......1w.!P..... 00:20:33.992 000002c0 0e 5d fb 30 37 b7 ed cb 72 30 ae 36 d3 77 f5 a4 .].07...r0.6.w.. 00:20:33.992 000002d0 36 4c 0d bb 2c d5 6d eb 11 73 88 a6 f5 54 70 03 6L..,.m..s...Tp. 00:20:33.992 000002e0 e9 30 be 22 d3 c1 b2 62 60 e2 cd ad 6c 54 4a b1 .0."...b`...lTJ. 00:20:33.992 000002f0 30 cf fe 44 d8 c2 c4 9a 13 3a 3b 6d fd 77 4b 76 0..D.....:;m.wKv 00:20:33.992 00000300 2c ca 83 59 89 2d d4 77 f8 7b 30 40 eb 2d b2 b6 ,..Y.-.w.{0@.-.. 00:20:33.992 00000310 0f 7b 72 c5 c2 0e ce f7 93 2e fe ac b7 61 4e c5 .{r..........aN. 00:20:33.992 00000320 99 96 66 15 e7 a7 01 10 3d 22 9f 28 45 48 62 7e ..f.....=".(EHb~ 00:20:33.992 00000330 4c 5f 2d b5 7e 68 32 42 3f c4 10 74 2c 2e e7 5a L_-.~h2B?..t,..Z 00:20:33.992 00000340 1b 77 19 05 82 ec 58 29 9c 95 59 45 bb c7 cd ef .w....X)..YE.... 00:20:33.992 00000350 92 bc e9 25 69 15 02 51 b9 fe 42 06 f8 fa 57 64 ...%i..Q..B...Wd 00:20:33.992 00000360 98 f8 c0 5f 8d 2d 65 a0 ce b2 42 a8 35 cc 86 8b ..._.-e...B.5... 00:20:33.992 00000370 75 ea 39 09 55 4a 39 7f 2a e4 f8 d9 f8 5a 51 59 u.9.UJ9.*....ZQY 00:20:33.992 00000380 45 85 17 71 05 46 d4 ec 10 96 12 fd c1 c5 a3 1b E..q.F.......... 
00:20:33.992 00000390 e7 81 c8 7b 54 27 7a ba f9 2b 3f ba 64 0e 39 60 ...{T'z..+?.d.9` 00:20:33.992 000003a0 dc 55 8a 79 7b e2 aa 2e 93 e2 dc 8e 00 af 49 24 .U.y{.........I$ 00:20:33.992 000003b0 d2 92 40 89 70 4c 63 91 7e f4 e9 45 75 81 a0 68 ..@.pLc.~..Eu..h 00:20:33.992 000003c0 f5 f3 be 02 0e 78 e1 de a5 76 fa aa 1d ad 49 5a .....x...v....IZ 00:20:33.992 000003d0 dd 09 f6 41 62 db c2 4a aa 85 4b 48 79 b7 00 f4 ...Ab..J..KHy... 00:20:33.992 000003e0 ab 17 b8 cf 17 77 f5 7c 75 ff 91 55 3c 3a 1e e4 .....w.|u..U<:.. 00:20:33.992 000003f0 09 ed 3c d5 88 3a 2f 2a b8 7a 91 1b 8e 9f 4a 74 ..<..:/*.z....Jt 00:20:33.992 host pubkey: 00:20:33.992 00000000 5c 31 81 72 83 ad 05 7a c1 9b 3a 0d 13 3d bf 49 \1.r...z..:..=.I 00:20:33.992 00000010 3d a1 f6 c0 17 44 ea e8 78 ea 22 32 b4 7d 83 46 =....D..x."2.}.F 00:20:33.992 00000020 73 1f 37 fc 93 d4 6a d6 17 00 36 0b 1b 97 47 50 s.7...j...6...GP 00:20:33.992 00000030 5b 17 fb d0 f8 64 14 32 a7 c6 07 a3 a9 a8 d3 04 [....d.2........ 00:20:33.992 00000040 95 7a 43 d5 0e 81 28 f5 84 0f 5f c9 bc 69 d0 8e .zC...(..._..i.. 00:20:33.992 00000050 82 99 1d 12 14 2c 44 b9 ff e0 c0 a1 f0 c6 24 e7 .....,D.......$. 00:20:33.992 00000060 44 f9 7a 50 37 cf f6 cb 5a 49 a2 2a c1 6e e6 2d D.zP7...ZI.*.n.- 00:20:33.992 00000070 1b 02 ed 0a ba 67 29 18 3b f2 2e 1f 2a 49 65 f0 .....g).;...*Ie. 00:20:33.992 00000080 db 04 ab 0e 29 eb 78 5a 29 d6 09 42 92 bc a4 7b ....).xZ)..B...{ 00:20:33.992 00000090 21 07 cd 01 f3 b8 10 ad 37 1d ff dd 31 77 56 c2 !.......7...1wV. 00:20:33.992 000000a0 0d d0 21 f5 21 79 dd 10 f3 08 49 2f f8 36 83 87 ..!.!y....I/.6.. 00:20:33.992 000000b0 b9 35 3f 5a 80 40 43 23 27 e5 28 e9 75 1a c0 19 .5?Z.@C#'.(.u... 00:20:33.992 000000c0 a7 b0 a6 50 59 c3 14 6f 0c 11 8a 5d 08 9a e4 ea ...PY..o...].... 00:20:33.992 000000d0 4c d4 26 65 6b e0 a8 b7 6a 57 79 dd 0c 8d a4 f2 L.&ek...jWy..... 00:20:33.992 000000e0 49 fc 94 a4 0e e9 81 b3 0e 88 d5 db 91 31 d4 d0 I............1.. 00:20:33.992 000000f0 dc 2c ff a8 55 8e 02 8b ba 2f 78 cd 41 50 46 39 .,..U..../x.APF9 00:20:33.992 00000100 8c bb f0 65 7a 39 66 c8 ec c5 87 ea da 4d e9 18 ...ez9f......M.. 00:20:33.992 00000110 4d 2d 76 c9 43 26 61 58 9f 83 27 c7 7d 14 c3 b6 M-v.C&aX..'.}... 00:20:33.992 00000120 1d e1 d0 00 6c 19 b5 ad 1d b3 0d 69 b2 0c ae cf ....l......i.... 00:20:33.992 00000130 35 1f f7 de dc 2b 87 0b dc 54 49 2f c4 87 59 5e 5....+...TI/..Y^ 00:20:33.992 00000140 cd 0e 6a f1 ac f3 59 60 20 4c 2c fe 46 8c b2 3d ..j...Y` L,.F..= 00:20:33.992 00000150 6b 10 f2 9e 40 46 bd e0 96 dd e8 a5 29 ed ba 5b k...@F......)..[ 00:20:33.992 00000160 25 ae 2f 7e e4 1a 76 89 dd 9e 0a 85 5f c2 7f fc %./~..v....._... 00:20:33.992 00000170 92 57 43 07 68 03 00 88 70 f3 84 56 1a a3 33 45 .WC.h...p..V..3E 00:20:33.992 00000180 83 6a 4c 98 f2 e8 dd 17 2f 48 46 fd 6f 19 ff 9c .jL...../HF.o... 00:20:33.992 00000190 ce 05 90 01 f5 b1 9e 7b 26 f1 2d 75 eb 5c ce 2d .......{&.-u.\.- 00:20:33.992 000001a0 cb d0 dd 34 31 ca 05 d0 32 3f ca 6f bb c2 da 75 ...41...2?.o...u 00:20:33.992 000001b0 55 aa c8 09 ee b3 04 10 2b d9 c5 7e 73 96 43 b6 U.......+..~s.C. 00:20:33.992 000001c0 00 03 e7 cd 5a c3 a4 a0 32 5e ba 97 0a db 0b 92 ....Z...2^...... 00:20:33.992 000001d0 3f 32 66 db 85 c2 5b be fe 6f 23 e3 2a 5b 33 54 ?2f...[..o#.*[3T 00:20:33.992 000001e0 6c 9e 0e 59 6b bd ef e6 c8 76 1a e1 36 b5 e6 e0 l..Yk....v..6... 00:20:33.992 000001f0 be 73 01 a6 bd 1a cc bd 10 96 20 fd 25 5c 73 4e .s........ 
.%\sN 00:20:33.992 00000200 99 2e a9 89 53 1d 4e 2c 30 7a 28 b6 99 4d c1 4f ....S.N,0z(..M.O 00:20:33.992 00000210 df 89 8a 49 07 90 ec eb 15 cb 5f 18 b8 fe 3f 84 ...I......_...?. 00:20:33.992 00000220 a3 9b 6b 0a 06 b9 23 55 69 b3 77 9d 03 09 02 56 ..k...#Ui.w....V 00:20:33.992 00000230 96 8a 67 da 4a 84 54 d4 72 c2 86 cd 34 b1 d4 cd ..g.J.T.r...4... 00:20:33.992 00000240 39 4a e1 53 be e1 75 f2 ea 1e b3 36 8b 73 59 16 9J.S..u....6.sY. 00:20:33.992 00000250 45 72 f7 21 b2 a7 59 4c 46 95 1b 3d 55 c8 5f 20 Er.!..YLF..=U._ 00:20:33.992 00000260 df 1d 1a b8 fc d9 22 9c 3e ee 86 fa 05 62 87 22 ......".>....b." 00:20:33.992 00000270 85 55 35 05 a0 f8 a8 6d 18 c7 32 86 c2 2c 7d 0f .U5....m..2..,}. 00:20:33.992 00000280 3b b3 86 1a 6d db 84 dc 9a 55 09 45 18 ec 46 bd ;...m....U.E..F. 00:20:33.992 00000290 fa e7 62 11 4b 1a 69 be 7b d5 38 3d ad 8a 2b 96 ..b.K.i.{.8=..+. 00:20:33.992 000002a0 e6 61 cb bc ce 17 18 59 dd ab 84 30 23 c3 d1 e5 .a.....Y...0#... 00:20:33.992 000002b0 a8 bc ba e1 8c 07 5f 1c ce bf de 5e 34 bf 60 96 ......_....^4.`. 00:20:33.992 000002c0 a8 2f a2 5a 6c 1b b0 23 9c 70 c9 47 f5 07 51 2b ./.Zl..#.p.G..Q+ 00:20:33.992 000002d0 4b 30 da e0 b8 a0 97 5a 44 80 a5 7f ea 55 d1 2e K0.....ZD....U.. 00:20:33.992 000002e0 77 8d 61 c4 5f 23 29 d7 b6 a7 87 4a 8f 94 cf 00 w.a._#)....J.... 00:20:33.992 000002f0 8e 1f a1 e4 24 94 01 eb 43 b4 ef 84 22 ec 29 cf ....$...C...".). 00:20:33.992 00000300 8c bf 1e de 09 3f 89 cc a2 8d f1 82 29 50 3c d8 .....?......)P<. 00:20:33.992 00000310 20 ad 6d 1d 88 67 2d 79 42 e5 7d 2d bf ac 30 36 .m..g-yB.}-..06 00:20:33.992 00000320 82 cf c7 b8 3a 44 54 8f dd ad d6 52 39 5a 00 65 ....:DT....R9Z.e 00:20:33.992 00000330 fb 0b 1d 08 14 9d 7a 4e fd 78 50 98 3f 6f e8 04 ......zN.xP.?o.. 00:20:33.992 00000340 9d 75 10 26 81 18 d3 ba ef ec 94 1b ee eb 5a e3 .u.&..........Z. 00:20:33.992 00000350 ba 05 dd e1 b2 d2 10 2e c1 dc a1 60 7d 03 84 cb ...........`}... 00:20:33.992 00000360 0b 7d 15 7c ad c0 19 0a 7d 8d f9 f3 f0 76 f9 ee .}.|....}....v.. 00:20:33.992 00000370 8f 3b c5 00 17 c6 da d5 11 03 b9 24 92 90 45 6f .;.........$..Eo 00:20:33.992 00000380 27 16 85 87 bd a2 46 05 fa 61 00 0b e7 23 be 30 '.....F..a...#.0 00:20:33.992 00000390 95 23 59 65 e6 8f 48 24 8d fc 40 6f 41 20 67 48 .#Ye..H$..@oA gH 00:20:33.992 000003a0 c1 3c f8 bc 4a 8e f7 7f 2c a5 e5 0b 62 ca a8 de .<..J...,...b... 00:20:33.992 000003b0 be 60 8a 7b 59 fb bb 74 eb 4f d7 ea 95 45 28 65 .`.{Y..t.O...E(e 00:20:33.992 000003c0 ad 01 e5 33 85 4e 18 fc 52 4f 5b dc d9 f3 ae 28 ...3.N..RO[....( 00:20:33.992 000003d0 c6 16 6d 34 76 5e 24 3f 08 1c 61 5c 28 a0 e0 5b ..m4v^$?..a\(..[ 00:20:33.992 000003e0 55 6c 57 b8 d6 3e fb a3 c6 df 12 7c b2 6e d9 c6 UlW..>.....|.n.. 00:20:33.992 000003f0 86 d7 1b 4a fc b5 ad 04 c6 b0 bc 9d 41 c0 6a eb ...J........A.j. 00:20:33.992 dh secret: 00:20:33.992 00000000 25 95 cc 0b d8 2c 97 03 75 76 6f 2c c5 c7 80 0e %....,..uvo,.... 00:20:33.992 00000010 5f bb 93 c5 63 82 b1 0c 5d 70 ea 94 71 c6 b3 e7 _...c...]p..q... 00:20:33.992 00000020 92 51 1c 4e 45 37 fa 1a ca 81 02 e0 e1 a6 be 70 .Q.NE7.........p 00:20:33.992 00000030 a2 2a d5 ee 5d aa 7a b1 6b 62 40 88 4a b6 04 24 .*..].z.kb@.J..$ 00:20:33.992 00000040 19 94 1f b4 85 d4 16 cc 5d cf 0d 04 43 fc 40 c8 ........]...C.@. 
00:20:33.992 00000050 43 6f f7 5c a8 64 f5 fb 76 b6 c7 ca 9f 0f ba 39 Co.\.d..v......9 00:20:33.992 00000060 21 6c 8b 08 f7 74 ff d4 c8 b8 cb d8 07 88 bb 48 !l...t.........H 00:20:33.992 00000070 16 4f 2d f7 1d 24 51 d6 ac f7 b6 77 b4 58 94 56 .O-..$Q....w.X.V 00:20:33.992 00000080 d5 8b 16 bd 0c 26 94 c4 96 7d 06 45 8e 69 e2 99 .....&...}.E.i.. 00:20:33.992 00000090 6f b7 c9 0e ed 48 41 11 c0 06 55 b0 fd d9 e5 09 o....HA...U..... 00:20:33.992 000000a0 b7 4a bf 3b b4 eb b3 e7 11 d1 46 02 74 6c 27 a2 .J.;......F.tl'. 00:20:33.992 000000b0 e7 21 27 6a ea bd df ae 30 6d 2c 4e e6 12 7d ff .!'j....0m,N..}. 00:20:33.992 000000c0 ba c1 a6 cc b2 9a 35 c5 2f 2b 0b 78 8d 2e 16 60 ......5./+.x...` 00:20:33.992 000000d0 eb e5 03 0c f0 cd 3a 48 d6 07 1c 14 3c 1d 91 d7 ......:H....<... 00:20:33.993 000000e0 f1 3c 3c 88 5f 3b 5d d4 96 ad 1c 0f 9d ea 22 f4 .<<._;].......". 00:20:33.993 000000f0 9f 33 dc d6 55 40 04 8e 96 5d 0b 5e d1 3a 89 91 .3..U@...].^.:.. 00:20:33.993 00000100 15 bb 7b b5 67 e4 77 6f 5d e1 b0 69 1f ad 4c 2c ..{.g.wo]..i..L, 00:20:33.993 00000110 80 8a 55 a5 b8 3f 29 5d 08 d1 28 33 0b 6f 4b aa ..U..?)]..(3.oK. 00:20:33.993 00000120 98 60 4d da c9 78 17 e9 49 9d 2c 3b c5 7f 6e 58 .`M..x..I.,;..nX 00:20:33.993 00000130 d8 45 10 96 3c bf e7 78 69 61 28 0d 7e e7 2f 06 .E..<..xia(.~./. 00:20:33.993 00000140 7f e9 83 c6 29 ad 88 10 36 38 f0 69 94 e7 b9 c8 ....)...68.i.... 00:20:33.993 00000150 09 7c 02 60 ff 0e 27 5d 80 c0 5f 38 3a ae 03 e8 .|.`..'].._8:... 00:20:33.993 00000160 b1 c0 3c b0 28 48 1a c5 da 48 92 75 c5 94 21 71 ..<.(H...H.u..!q 00:20:33.993 00000170 4d 52 06 14 e7 2f df b7 1a 8c a8 3d 48 32 48 93 MR.../.....=H2H. 00:20:33.993 00000180 b4 34 d2 da 9e eb ce ae 49 7e 48 a8 7a 35 fb 63 .4......I~H.z5.c 00:20:33.993 00000190 57 91 86 20 07 ca 2c 56 75 3a 0c 00 7c 0c 49 dd W.. ..,Vu:..|.I. 00:20:33.993 000001a0 39 b8 c4 54 aa a4 08 39 28 16 d4 ae 92 2e ef a4 9..T...9(....... 00:20:33.993 000001b0 f9 eb 00 61 59 2e e6 48 e3 66 38 7f f0 ce 0e a8 ...aY..H.f8..... 00:20:33.993 000001c0 98 28 25 27 87 c4 34 16 36 7f a1 b2 ab 3f 10 23 .(%'..4.6....?.# 00:20:33.993 000001d0 18 89 08 61 29 a4 f8 cf 1a 47 40 34 0e be 29 09 ...a)....G@4..). 00:20:33.993 000001e0 5f 91 3a 24 9b db 05 57 d7 f0 17 a1 42 66 cf 00 _.:$...W....Bf.. 00:20:33.993 000001f0 e0 07 de db 8e 7b 9a c5 9f 8b 5d d7 86 93 74 b2 .....{....]...t. 00:20:33.993 00000200 f4 ec 0e 85 c0 75 4d 7c f1 1f 25 23 13 86 a3 95 .....uM|..%#.... 00:20:33.993 00000210 b8 cb dc 96 ca 14 5e 4f 81 74 17 33 be bb e6 01 ......^O.t.3.... 00:20:33.993 00000220 48 38 44 f5 b3 1c e9 0d 4c c0 db a7 7c 8d 46 1f H8D.....L...|.F. 00:20:33.993 00000230 35 04 5b bc a4 65 cd 5d ea f1 df c5 6d d1 3e e5 5.[..e.]....m.>. 00:20:33.993 00000240 88 56 82 2a b6 ea 0f df c8 22 75 cc 3f 77 ac b5 .V.*....."u.?w.. 00:20:33.993 00000250 18 b1 46 20 c7 b3 34 68 d7 96 29 3c a7 60 9d d2 ..F ..4h..)<.`.. 00:20:33.993 00000260 5e 95 82 46 f6 5e f2 57 3e 24 17 d3 34 18 0f ca ^..F.^.W>$..4... 00:20:33.993 00000270 9b 42 cd 5e 0d b5 bd 2c 05 b4 22 b4 d4 3e 98 0e .B.^...,.."..>.. 00:20:33.993 00000280 e2 83 e3 23 6f 85 1f dd 37 2f b4 ef 72 e5 f2 a8 ...#o...7/..r... 00:20:33.993 00000290 5d a3 d3 3a 4e 1d f1 60 6f 55 ee 00 4e d9 de bc ]..:N..`oU..N... 00:20:33.993 000002a0 24 59 76 0b 02 67 26 44 c5 70 1e 53 7d 75 32 55 $Yv..g&D.p.S}u2U 00:20:33.993 000002b0 62 92 c8 85 36 ce a0 7f 2b f1 c3 59 e0 ce c8 c7 b...6...+..Y.... 00:20:33.993 000002c0 9e 10 80 ba 1f 75 0b 89 60 2d 58 65 4f 03 bb 05 .....u..`-XeO... 
00:20:33.993 000002d0 5d 3f 01 c2 14 03 f0 a4 83 6e c2 40 9e 7c 7c 7a ]?.......n.@.||z 00:20:33.993 000002e0 b8 23 28 26 ab 11 51 38 1a c0 02 5b 8a b3 fe 96 .#(&..Q8...[.... 00:20:33.993 000002f0 c2 25 7f 9d 96 ee 9f 47 50 37 51 9d 50 4a 05 b0 .%.....GP7Q.PJ.. 00:20:33.993 00000300 8d d7 24 a8 c9 84 5e 63 0a a0 32 2e 96 0b 23 7d ..$...^c..2...#} 00:20:33.993 00000310 6c 65 05 c3 2c f8 01 13 f3 ed 23 3e 64 41 c9 5b le..,.....#>dA.[ 00:20:33.993 00000320 72 5c bb d9 b7 30 70 c9 d8 f7 67 f6 62 3c 0b ad r\...0p...g.b<.. 00:20:33.993 00000330 e7 a2 89 c9 4a bf 1e 51 32 fa 53 0f 9f 21 6c 91 ....J..Q2.S..!l. 00:20:33.993 00000340 7d c5 9d 89 92 0e 0f 2d e6 69 5a 2e ad 86 ea 41 }......-.iZ....A 00:20:33.993 00000350 cd 4c 54 75 99 e8 cd 06 1c aa c0 af 57 69 29 6c .LTu........Wi)l 00:20:33.993 00000360 18 51 6c 59 8b 4d 80 4c 5e f5 02 df bd 1c 7d 23 .QlY.M.L^.....}# 00:20:33.993 00000370 62 37 3d f2 d5 3d 41 da c7 7b 7f 0b 07 7a a8 cf b7=..=A..{...z.. 00:20:33.993 00000380 66 68 2a aa 35 8b e8 ff 9a 29 9d e7 44 b6 7c 5b fh*.5....)..D.|[ 00:20:33.993 00000390 b6 fa b4 6d a9 bb c5 be c0 f1 6a 29 bd a4 5b e2 ...m......j)..[. 00:20:33.993 000003a0 83 61 67 36 b5 3e 91 22 13 d9 87 88 b4 3a a0 fb .ag6.>.".....:.. 00:20:33.993 000003b0 fc 31 05 5e 7f 6c 27 c4 00 93 eb 9a 94 f8 80 bd .1.^.l'......... 00:20:33.993 000003c0 e5 ed 89 70 ae 2c 73 16 6b e9 87 56 2d 61 9f 27 ...p.,s.k..V-a.' 00:20:33.993 000003d0 43 0f 3e a3 da c6 23 4a 19 a7 3d 52 4f 14 53 be C.>...#J..=RO.S. 00:20:33.993 000003e0 6a c2 36 b9 7d ce c6 28 93 02 94 49 6b e1 a5 85 j.6.}..(...Ik... 00:20:33.993 000003f0 1d 6d 00 70 d6 05 1a d1 6d de 0f ce 1e d6 00 00 .m.p....m....... 00:20:33.993 [2024-09-27 15:25:33.362914] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=3, dhgroup=5, seq=3428451839, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.993 [2024-09-27 15:25:33.423642] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.993 [2024-09-27 15:25:33.423693] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.993 [2024-09-27 15:25:33.423712] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.993 [2024-09-27 15:25:33.423737] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.993 [2024-09-27 15:25:33.423748] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.993 [2024-09-27 15:25:33.531255] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.993 [2024-09-27 15:25:33.531274] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.993 [2024-09-27 15:25:33.531281] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.993 [2024-09-27 15:25:33.531291] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.993 [2024-09-27 15:25:33.531348] nvme_auth.c: 
163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.993 ctrlr pubkey: 00:20:33.993 00000000 46 d3 88 1b 1f c1 3e 38 f9 6c f7 20 99 4e a3 b8 F.....>8.l. .N.. 00:20:33.993 00000010 18 95 06 f1 9f aa 86 b5 d3 1d ce 1b 1a 77 30 90 .............w0. 00:20:33.993 00000020 d0 38 aa 84 61 ce 21 36 91 de 03 57 af b3 ca 1a .8..a.!6...W.... 00:20:33.993 00000030 e0 51 af 94 4c f5 09 57 cb 49 c1 b3 34 35 e7 11 .Q..L..W.I..45.. 00:20:33.993 00000040 04 63 c5 fb 63 61 23 db 54 a4 2d f7 b9 05 8f f2 .c..ca#.T.-..... 00:20:33.993 00000050 d0 73 76 e5 84 80 77 2c 19 ac dd 62 02 a9 ca 61 .sv...w,...b...a 00:20:33.993 00000060 aa 12 46 f9 58 5b 3e 59 f0 81 a3 5a 3a e6 f7 59 ..F.X[>Y...Z:..Y 00:20:33.993 00000070 c6 48 72 dd 6a 73 23 c4 26 08 ff dc dd a4 64 92 .Hr.js#.&.....d. 00:20:33.993 00000080 4e 31 50 de 37 d8 5b 3c b3 c6 3e 86 1c dc 6a dd N1P.7.[<..>...j. 00:20:33.993 00000090 1d 2a e2 72 c4 ed c8 be b9 23 fc 40 f0 c2 0c 0a .*.r.....#.@.... 00:20:33.993 000000a0 83 9c 05 09 4b 36 3d e5 19 ad 02 ee df da 6f b9 ....K6=.......o. 00:20:33.993 000000b0 52 a4 33 87 8b 42 1f 6d cc 74 04 d2 91 6d 72 ce R.3..B.m.t...mr. 00:20:33.993 000000c0 5f 23 f3 68 fe 61 a3 49 27 c9 ba 4c ff 4f 66 e5 _#.h.a.I'..L.Of. 00:20:33.993 000000d0 b5 e3 3d 88 3d 6f d1 a2 7c 02 d0 7d ca 6f 78 12 ..=.=o..|..}.ox. 00:20:33.993 000000e0 d4 22 40 44 54 ab 6d 03 0b 02 6e 10 38 6d 03 7c ."@DT.m...n.8m.| 00:20:33.993 000000f0 d4 97 2b 6d bc d4 13 0b 92 ee 15 49 2e b2 9c 8f ..+m.......I.... 00:20:33.993 00000100 19 98 68 75 38 6f fa 4d d5 1c 80 1b 29 4a d1 f7 ..hu8o.M....)J.. 00:20:33.993 00000110 ad 75 81 d1 e1 a9 2d 73 1d 1b 6a bc 73 e6 f4 b0 .u....-s..j.s... 00:20:33.993 00000120 15 12 a4 7d a2 ac 22 7a ba f6 34 f0 be 7c 9e 57 ...}.."z..4..|.W 00:20:33.993 00000130 62 3a 74 39 cb 08 52 9c 01 d4 5b 14 7b 15 df a4 b:t9..R...[.{... 00:20:33.993 00000140 09 61 7f e2 dc 89 2a 46 9b 04 0d a5 ca 87 62 02 .a....*F......b. 00:20:33.993 00000150 1a 3a 4d 5a 6f 5f 1e 96 b9 29 6a 88 28 0a bc 5f .:MZo_...)j.(.._ 00:20:33.993 00000160 f7 fb 69 5c 0a 75 70 79 0d 1b 39 87 30 86 db ef ..i\.upy..9.0... 00:20:33.993 00000170 36 ce 2d 59 55 6b de a6 b3 c3 7d 88 29 f7 1c 0a 6.-YUk....}.)... 00:20:33.993 00000180 c9 13 a0 ca 71 7b 36 6f 01 1c 30 87 7b 86 36 ea ....q{6o..0.{.6. 00:20:33.993 00000190 a9 ed c2 dc ac ae b5 47 b4 f6 ca 85 3e 15 8d ca .......G....>... 00:20:33.993 000001a0 79 b7 ca 9d 91 8f 44 51 a4 17 df 20 ed c2 13 22 y.....DQ... ..." 00:20:33.993 000001b0 4d f0 f6 b8 1b ca 44 50 00 88 34 3e 8e d1 1c 6c M.....DP..4>...l 00:20:33.993 000001c0 dd 03 90 0a dd 2f 90 90 32 5d 36 93 d0 e5 c9 9d ...../..2]6..... 00:20:33.993 000001d0 4d 44 7a 1a 8d e2 cb 98 7c 43 8b 63 a8 a6 45 85 MDz.....|C.c..E. 00:20:33.993 000001e0 a7 77 11 40 c4 17 a6 fd ce 25 ee 46 6b fe f6 72 .w.@.....%.Fk..r 00:20:33.993 000001f0 a0 76 4a 11 2b 87 60 e0 a9 42 df 69 51 dc b8 69 .vJ.+.`..B.iQ..i 00:20:33.993 00000200 31 f8 4a f8 ee 66 d7 ae 89 51 16 48 d2 37 6d 08 1.J..f...Q.H.7m. 00:20:33.993 00000210 99 9c 27 30 70 78 24 17 96 ea ca 3e 4c 6d 2c ec ..'0px$....>Lm,. 00:20:33.993 00000220 b7 9e 39 57 5d 1f 8d bf 52 dd bd 75 34 f0 54 a6 ..9W]...R..u4.T. 00:20:33.993 00000230 5b a1 af d5 3b 8a 4d e5 b6 cb 7f f0 04 50 76 5b [...;.M......Pv[ 00:20:33.993 00000240 b6 06 94 45 40 b6 0e f6 11 a4 b4 ad 15 83 0c 8a ...E@........... 00:20:33.993 00000250 bb be cd bc 51 73 db 79 0d 8a e5 9b 5f 1b fb cc ....Qs.y...._... 
00:20:33.993 00000260 41 7e 3a 6a e2 92 c3 58 e6 6d 2c 94 1d 5e da 25 A~:j...X.m,..^.% 00:20:33.993 00000270 8c 77 80 22 7d 09 e5 24 b4 64 29 52 9b bf 2c 0d .w."}..$.d)R..,. 00:20:33.993 00000280 0c 56 82 fb bc 98 32 8c 77 f9 3b ba 81 42 3a 57 .V....2.w.;..B:W 00:20:33.993 00000290 af 14 6d 43 ce 1f 98 48 ea 2e aa 4c c8 00 15 c4 ..mC...H...L.... 00:20:33.993 000002a0 7f b1 a8 89 f6 a2 51 48 07 e6 5d a7 73 68 59 81 ......QH..].shY. 00:20:33.993 000002b0 08 87 00 8c fc 2e 31 77 ea 21 50 ca 00 8d 8c 81 ......1w.!P..... 00:20:33.993 000002c0 0e 5d fb 30 37 b7 ed cb 72 30 ae 36 d3 77 f5 a4 .].07...r0.6.w.. 00:20:33.993 000002d0 36 4c 0d bb 2c d5 6d eb 11 73 88 a6 f5 54 70 03 6L..,.m..s...Tp. 00:20:33.993 000002e0 e9 30 be 22 d3 c1 b2 62 60 e2 cd ad 6c 54 4a b1 .0."...b`...lTJ. 00:20:33.993 000002f0 30 cf fe 44 d8 c2 c4 9a 13 3a 3b 6d fd 77 4b 76 0..D.....:;m.wKv 00:20:33.993 00000300 2c ca 83 59 89 2d d4 77 f8 7b 30 40 eb 2d b2 b6 ,..Y.-.w.{0@.-.. 00:20:33.993 00000310 0f 7b 72 c5 c2 0e ce f7 93 2e fe ac b7 61 4e c5 .{r..........aN. 00:20:33.993 00000320 99 96 66 15 e7 a7 01 10 3d 22 9f 28 45 48 62 7e ..f.....=".(EHb~ 00:20:33.993 00000330 4c 5f 2d b5 7e 68 32 42 3f c4 10 74 2c 2e e7 5a L_-.~h2B?..t,..Z 00:20:33.993 00000340 1b 77 19 05 82 ec 58 29 9c 95 59 45 bb c7 cd ef .w....X)..YE.... 00:20:33.993 00000350 92 bc e9 25 69 15 02 51 b9 fe 42 06 f8 fa 57 64 ...%i..Q..B...Wd 00:20:33.993 00000360 98 f8 c0 5f 8d 2d 65 a0 ce b2 42 a8 35 cc 86 8b ..._.-e...B.5... 00:20:33.993 00000370 75 ea 39 09 55 4a 39 7f 2a e4 f8 d9 f8 5a 51 59 u.9.UJ9.*....ZQY 00:20:33.993 00000380 45 85 17 71 05 46 d4 ec 10 96 12 fd c1 c5 a3 1b E..q.F.......... 00:20:33.993 00000390 e7 81 c8 7b 54 27 7a ba f9 2b 3f ba 64 0e 39 60 ...{T'z..+?.d.9` 00:20:33.993 000003a0 dc 55 8a 79 7b e2 aa 2e 93 e2 dc 8e 00 af 49 24 .U.y{.........I$ 00:20:33.993 000003b0 d2 92 40 89 70 4c 63 91 7e f4 e9 45 75 81 a0 68 ..@.pLc.~..Eu..h 00:20:33.993 000003c0 f5 f3 be 02 0e 78 e1 de a5 76 fa aa 1d ad 49 5a .....x...v....IZ 00:20:33.993 000003d0 dd 09 f6 41 62 db c2 4a aa 85 4b 48 79 b7 00 f4 ...Ab..J..KHy... 00:20:33.993 000003e0 ab 17 b8 cf 17 77 f5 7c 75 ff 91 55 3c 3a 1e e4 .....w.|u..U<:.. 00:20:33.993 000003f0 09 ed 3c d5 88 3a 2f 2a b8 7a 91 1b 8e 9f 4a 74 ..<..:/*.z....Jt 00:20:33.993 host pubkey: 00:20:33.993 00000000 f4 48 d2 93 d3 69 1e 64 c2 c0 09 0b af 86 47 61 .H...i.d......Ga 00:20:33.993 00000010 a9 03 f4 50 98 7e 44 26 72 b9 d9 b9 86 cd d7 0b ...P.~D&r....... 00:20:33.993 00000020 33 6b b9 bb 73 9c 98 c2 1b cd ec 82 2d 13 48 08 3k..s.......-.H. 00:20:33.993 00000030 51 a1 02 81 97 79 b9 11 ea 25 a4 42 dc 2a 58 f2 Q....y...%.B.*X. 00:20:33.993 00000040 0e 67 1f 44 43 12 c1 4a d7 63 74 88 79 a6 2f 31 .g.DC..J.ct.y./1 00:20:33.993 00000050 77 21 ee 36 cd 19 16 8f 31 75 ce ef 8b 74 5e a6 w!.6....1u...t^. 00:20:33.993 00000060 3c f3 ad 58 dc 36 db b7 25 8a 3d 0a 19 aa 7d 53 <..X.6..%.=...}S 00:20:33.993 00000070 35 bf 5e f1 72 b5 08 6f 3b 82 22 7b a7 ad 36 6b 5.^.r..o;."{..6k 00:20:33.993 00000080 fd a1 15 63 49 25 f1 ee 3b 29 c3 57 57 c8 f0 8d ...cI%..;).WW... 00:20:33.993 00000090 77 74 9f 0d e8 e5 8e f4 61 ee 5f c4 dd 82 7e c8 wt......a._...~. 00:20:33.993 000000a0 39 2e f4 28 98 d2 c8 7f b2 d0 9c 2f 28 e6 5b 38 9..(......./(.[8 00:20:33.993 000000b0 85 e4 c2 de 35 f0 02 a7 bf 5f 86 7d 8b f5 fb 03 ....5...._.}.... 00:20:33.993 000000c0 92 6f 19 22 76 d6 17 e8 dc af 64 1d e4 8d 54 44 .o."v.....d...TD 00:20:33.993 000000d0 dd 71 a2 b8 b0 72 94 b6 60 2d 57 81 e5 3a eb fd .q...r..`-W..:.. 
00:20:33.993 000000e0 0a 34 03 57 36 5e ce e5 ac 40 c2 0b 5d 8d 69 01 .4.W6^...@..].i. 00:20:33.993 000000f0 eb a2 d1 9d 35 53 9a 3a 50 05 f7 44 8f fa 40 58 ....5S.:P..D..@X 00:20:33.993 00000100 df f1 4e 51 31 78 73 ca 8a 92 94 5a 25 ca 9e 72 ..NQ1xs....Z%..r 00:20:33.993 00000110 81 39 5e ba d3 b9 f2 9e ed 3c 3d 28 91 f5 01 1b .9^......<=(.... 00:20:33.993 00000120 98 52 85 be d8 df c5 68 94 74 21 c7 9e 98 ae c5 .R.....h.t!..... 00:20:33.994 00000130 55 f1 e2 97 17 7a 4a 6d 9d 1e 3a c9 97 12 c9 a1 U....zJm..:..... 00:20:33.994 00000140 f5 35 00 21 1e 1e 26 89 01 a3 71 ac f4 4c a7 c0 .5.!..&...q..L.. 00:20:33.994 00000150 a2 00 3f bb a0 1a 88 7b 0f c6 5e 25 a7 cd e4 7b ..?....{..^%...{ 00:20:33.994 00000160 38 a1 ab 9c 98 f6 d6 2d 93 72 75 ed 86 68 a3 83 8......-.ru..h.. 00:20:33.994 00000170 94 b2 f3 e9 20 59 10 61 ad 07 27 1f 8f 62 69 dd .... Y.a..'..bi. 00:20:33.994 00000180 8b 58 9b 57 a8 fc 4d 94 d7 1d e8 88 d5 50 eb c9 .X.W..M......P.. 00:20:33.994 00000190 2e 14 e2 34 b8 5f e3 54 37 72 b5 dc 7c 11 5d b2 ...4._.T7r..|.]. 00:20:33.994 000001a0 42 81 1d 23 de 9e 2e 9e b7 aa b6 19 d9 7a 1c f0 B..#.........z.. 00:20:33.994 000001b0 73 fb 0e ac b0 6a 02 7f 37 db 5d 3f c7 79 68 a0 s....j..7.]?.yh. 00:20:33.994 000001c0 30 19 9f 19 21 6a f4 ee eb 1a 5a 9c b2 2c a5 57 0...!j....Z..,.W 00:20:33.994 000001d0 13 88 ba f3 b0 92 0c bc 13 5b 06 c1 32 8f 80 f0 .........[..2... 00:20:33.994 000001e0 9c 22 57 74 9a 49 d3 4c 9e 71 fe ac ab 1d fd 7f ."Wt.I.L.q...... 00:20:33.994 000001f0 cc f4 f8 2f 41 4a 3e e0 56 9a a8 57 77 c5 23 bf .../AJ>.V..Ww.#. 00:20:33.994 00000200 a9 2a af 48 76 23 3a e4 1b aa 5d d2 8e e9 97 9b .*.Hv#:...]..... 00:20:33.994 00000210 ee 9b 1e d0 8f ed 56 cc 79 78 40 2d e5 25 b2 30 ......V.yx@-.%.0 00:20:33.994 00000220 d7 22 a0 a3 ff a5 b6 2c bc f3 9e 05 ce 35 9a 75 .".....,.....5.u 00:20:33.994 00000230 13 36 a6 a5 04 66 d2 11 12 56 bc 23 29 f2 a9 a5 .6...f...V.#)... 00:20:33.994 00000240 13 88 0e 2b 72 70 ba 85 ac 5a bc bc d4 ef e7 eb ...+rp...Z...... 00:20:33.994 00000250 4c 47 d2 03 36 b9 78 be e7 3c 5c 17 92 54 de fc LG..6.x..<\..T.. 00:20:33.994 00000260 0b 3d 9b 88 c4 96 29 79 61 82 fb 2f f0 03 3b 00 .=....)ya../..;. 00:20:33.994 00000270 82 74 7d 21 65 11 82 49 f2 bf da 3d da bd 7b ec .t}!e..I...=..{. 00:20:33.994 00000280 c7 3d 22 ef d8 7e 47 09 33 8d a9 c4 9e fd db 80 .="..~G.3....... 00:20:33.994 00000290 26 09 20 69 ed c8 cb 45 94 1d cd d8 a9 78 93 d5 &. i...E.....x.. 00:20:33.994 000002a0 4c 38 9e 3c 9d b8 86 db b9 d3 4a 75 09 19 44 5d L8.<......Ju..D] 00:20:33.994 000002b0 dc d9 45 b1 9a ea ce 89 b3 81 e2 df 38 b0 20 2e ..E.........8. . 00:20:33.994 000002c0 c0 2b a4 04 75 6c 43 ee 3d a8 f3 00 d3 16 c3 e9 .+..ulC.=....... 00:20:33.994 000002d0 55 13 9b 2e dc 14 bf a0 a4 be 71 3b 23 3f ac f4 U.........q;#?.. 00:20:33.994 000002e0 7f 9b 0c 41 92 89 5c 05 d3 74 8b c4 67 22 0f 49 ...A..\..t..g".I 00:20:33.994 000002f0 57 b6 b9 66 b6 e1 f5 e2 61 b0 0e 5d 2b 49 1c 5d W..f....a..]+I.] 00:20:33.994 00000300 66 80 86 bb 0f 8c 78 5d bd ea 25 58 64 f3 65 17 f.....x]..%Xd.e. 00:20:33.994 00000310 2b e0 b7 3f 7e 26 51 e7 5f a6 03 34 db e2 52 39 +..?~&Q._..4..R9 00:20:33.994 00000320 30 47 bc 35 26 57 86 e5 de 90 66 a5 25 87 32 88 0G.5&W....f.%.2. 00:20:33.994 00000330 ab 10 60 6b 47 e5 9f 7d bc 87 b4 5b 55 2d 17 aa ..`kG..}...[U-.. 
00:20:33.994 00000340 91 01 14 02 b6 45 cf 88 2f a1 93 ed 89 21 37 48 .....E../....!7H 00:20:33.994 00000350 5d 71 0b 5f 2a c3 de d6 76 de db eb 80 e8 14 2f ]q._*...v....../ 00:20:33.994 00000360 45 22 26 22 ce 2d 9d 6b 1e 75 10 9e a6 6e 31 05 E"&".-.k.u...n1. 00:20:33.994 00000370 89 23 7d c3 73 e7 91 5e b3 89 61 c9 a4 cd f3 01 .#}.s..^..a..... 00:20:33.994 00000380 d8 fc 34 58 f3 8b af 94 1c da 5a 85 87 8b 03 30 ..4X......Z....0 00:20:33.994 00000390 05 c2 9c 1f 67 db bc cc 13 d0 40 0e df fe 01 97 ....g.....@..... 00:20:33.994 000003a0 9a 3b bf 11 a7 41 49 b4 20 bf c0 ac 27 ae cd 01 .;...AI. ...'... 00:20:33.994 000003b0 ee 37 f6 7c 48 65 14 24 35 0c ac 29 55 6b dc 98 .7.|He.$5..)Uk.. 00:20:33.994 000003c0 29 90 f9 a5 35 5e 8a 64 e4 2d e6 56 c1 c0 c3 c6 )...5^.d.-.V.... 00:20:33.994 000003d0 11 d8 42 14 9f b2 82 e7 8c f5 74 6c 85 72 1d 37 ..B.......tl.r.7 00:20:33.994 000003e0 70 bd 22 b4 83 89 25 74 d5 df 44 d0 fc b3 7d 39 p."...%t..D...}9 00:20:33.994 000003f0 52 92 40 ef f3 f4 3e 0a b5 62 79 35 7d 0f 61 da R.@...>..by5}.a. 00:20:33.994 dh secret: 00:20:33.994 00000000 57 5d 81 27 f9 f7 80 c8 e8 3d 75 fa 79 fa 83 c0 W].'.....=u.y... 00:20:33.994 00000010 f8 6f 03 bb 95 51 aa b0 20 c1 9d 1e da dd 19 f7 .o...Q.. ....... 00:20:33.994 00000020 f9 df 80 49 b9 d0 10 d9 2d b8 cf 37 f7 67 eb e0 ...I....-..7.g.. 00:20:33.994 00000030 ca 6a b3 a6 02 f7 b7 4c 8d ce 2d 96 d2 21 98 94 .j.....L..-..!.. 00:20:33.994 00000040 f1 7c ad 11 63 09 7b 92 23 d8 b1 1e a4 b5 dc 29 .|..c.{.#......) 00:20:33.994 00000050 11 3e 03 58 25 fb 7e a3 1c 9d a2 55 4a a4 35 d9 .>.X%.~....UJ.5. 00:20:33.994 00000060 2a b4 dc fc 98 6b 49 d8 d2 a4 6a 5e 2e 61 c0 99 *....kI...j^.a.. 00:20:33.994 00000070 5e 28 4e 2e fa 90 6a 07 40 88 f2 7e c2 9f e3 f7 ^(N...j.@..~.... 00:20:33.994 00000080 7d b7 61 14 69 d4 c3 70 cf 60 c6 bf 68 76 64 25 }.a.i..p.`..hvd% 00:20:33.994 00000090 aa f3 97 98 87 4e c0 aa 86 eb f7 98 29 f3 7c c7 .....N......).|. 00:20:33.994 000000a0 86 0b e6 6e 45 c2 8b 53 34 7a 79 0b 23 5e 1e 7d ...nE..S4zy.#^.} 00:20:33.994 000000b0 f5 48 3b 9c b1 17 4b 30 62 ed 4e 87 ab d3 f1 5e .H;...K0b.N....^ 00:20:33.994 000000c0 f0 2f 5a f0 97 62 2f 32 f8 e3 13 a5 be 9f 60 16 ./Z..b/2......`. 00:20:33.994 000000d0 0b d9 f5 47 91 9c 02 92 dd 20 8c 34 fa 63 5c 42 ...G..... .4.c\B 00:20:33.994 000000e0 94 8b 44 2b 00 6a 06 68 40 e0 63 2d 1e 68 f6 65 ..D+.j.h@.c-.h.e 00:20:33.994 000000f0 51 0b 09 d7 b7 e1 ff eb e3 63 88 a8 bf b5 3b f1 Q........c....;. 00:20:33.994 00000100 ce 12 eb 2a 16 94 75 ef 70 4d 0d dd c7 3c 2c 9d ...*..u.pM...<,. 00:20:33.994 00000110 58 6a 98 3a 2d cd 37 25 d8 81 f0 6f cf 4b 4d f5 Xj.:-.7%...o.KM. 00:20:33.994 00000120 ad db b4 a0 67 2f 1b 3c 27 76 dc 5c 50 9b 6f de ....g/.<'v.\P.o. 00:20:33.994 00000130 74 f5 06 23 39 c9 67 1f d3 ea 0d 35 f8 0c 6e 6c t..#9.g....5..nl 00:20:33.994 00000140 0d 84 4f ff 43 8a d0 53 57 28 1e 94 36 0c 7e 88 ..O.C..SW(..6.~. 00:20:33.994 00000150 f2 99 a2 d0 7d bb fb 64 9b 99 ea 29 dd 45 60 3f ....}..d...).E`? 00:20:33.994 00000160 69 55 82 4e 17 0e 82 44 6c 83 07 03 43 c8 e7 75 iU.N...Dl...C..u 00:20:33.994 00000170 54 04 04 b6 7a f1 89 0d 7e 0f d9 46 43 64 c8 d8 T...z...~..FCd.. 00:20:33.994 00000180 fc 8b 93 34 fc cf ad 21 a4 d4 87 56 14 23 b5 24 ...4...!...V.#.$ 00:20:33.994 00000190 71 57 de c6 cd 88 f5 16 77 b9 89 b8 7c 95 a8 47 qW......w...|..G 00:20:33.994 000001a0 eb ae 8f 8c 4e 23 1b 13 af b4 f5 29 d0 58 79 47 ....N#.....).XyG 00:20:33.994 000001b0 61 ca a9 4f 0d 6e c0 59 ce f9 05 c6 68 9d 51 f0 a..O.n.Y....h.Q. 
00:20:33.994 000001c0 f5 5b b1 79 3a cf bd 1a 3a d0 e3 02 85 31 0e 43 .[.y:...:....1.C 00:20:33.994 000001d0 c7 c6 7f 4e f1 5a dd 36 a3 a7 41 81 80 72 f8 d2 ...N.Z.6..A..r.. 00:20:33.994 000001e0 ef 3d e1 46 2f 19 31 80 ef 31 a9 4d f7 9a cb 74 .=.F/.1..1.M...t 00:20:33.994 000001f0 36 9e ec 8c e9 59 6e e0 91 9b 39 66 ba 3b ac c8 6....Yn...9f.;.. 00:20:33.994 00000200 27 6c fe 16 34 3a b1 52 10 17 32 3e 1d 61 e8 22 'l..4:.R..2>.a." 00:20:33.994 00000210 24 14 3a 35 fb 35 aa aa 47 c1 17 f5 48 67 7e c1 $.:5.5..G...Hg~. 00:20:33.994 00000220 74 b4 58 6a 38 0e 8e 88 c7 3b 7a 1a 00 5d 78 39 t.Xj8....;z..]x9 00:20:33.994 00000230 89 01 f3 c1 8e 54 42 e6 52 df cf c0 f7 7d c3 b3 .....TB.R....}.. 00:20:33.994 00000240 1f 69 ae 3b 0d 92 41 92 b6 3d da c0 31 aa 68 e5 .i.;..A..=..1.h. 00:20:33.994 00000250 27 ca 39 2b 49 f7 39 aa 22 a2 91 c7 02 2e 06 46 '.9+I.9."......F 00:20:33.994 00000260 00 ff f7 02 4a 0e 93 2c 11 0b a7 9c 6e 20 74 b1 ....J..,....n t. 00:20:33.994 00000270 00 e4 67 7f 7e 47 d7 79 10 31 da 01 0b 01 34 1e ..g.~G.y.1....4. 00:20:33.994 00000280 8d 33 b7 f3 06 71 da c5 3b 2d 3f a5 a4 06 6a 20 .3...q..;-?...j 00:20:33.994 00000290 96 40 a3 cb f0 5e 28 55 41 96 b7 24 4a 2a 10 fa .@...^(UA..$J*.. 00:20:33.994 000002a0 ad 80 ff 05 c4 a8 72 6f 65 d6 78 ea 85 b1 97 e5 ......roe.x..... 00:20:33.994 000002b0 8c 8a b7 15 39 bd 0e 6e 4a e0 8f 5e 43 47 bc 6f ....9..nJ..^CG.o 00:20:33.994 000002c0 fd 86 3a 18 3d c3 2c d5 c2 18 10 cf 55 19 5f aa ..:.=.,.....U._. 00:20:33.994 000002d0 e2 57 c1 3f 55 14 f6 33 9c 79 ef 75 04 5b 52 1d .W.?U..3.y.u.[R. 00:20:33.994 000002e0 fd 53 bd 04 20 7d cf a9 d5 cb 2f c4 93 f4 82 83 .S.. }..../..... 00:20:33.994 000002f0 ca 27 20 34 cb d2 f7 82 07 5d 3b b2 ce df 2b b3 .' 4.....];...+. 00:20:33.994 00000300 fb e8 52 27 31 2a f5 46 09 d4 67 d8 25 44 84 03 ..R'1*.F..g.%D.. 00:20:33.994 00000310 be a3 4e 17 7a bb 23 9a b0 85 df c1 5a 68 e0 87 ..N.z.#.....Zh.. 00:20:33.994 00000320 a1 b3 ad 33 72 7f 57 cb 0f ae 69 47 43 e7 ef a2 ...3r.W...iGC... 00:20:33.994 00000330 7a f4 70 9e b9 e3 4c 29 9c 6f 64 a0 7e d6 15 bd z.p...L).od.~... 00:20:33.994 00000340 eb b7 92 89 a8 e7 e6 78 f7 00 6e af dc 5e 0f dd .......x..n..^.. 00:20:33.994 00000350 b0 30 87 b2 63 0d 02 97 cc 6c 6b ba 79 ad 16 f0 .0..c....lk.y... 00:20:33.994 00000360 f5 a4 d9 07 79 b1 4d 68 08 a4 e3 3c a6 33 ad 95 ....y.Mh...<.3.. 00:20:33.994 00000370 22 fc 9d 01 c1 86 79 f9 10 19 92 5b 51 9f d1 75 ".....y....[Q..u 00:20:33.994 00000380 a0 f9 e2 27 94 96 83 ac 7c 56 34 37 ce 47 4f 7b ...'....|V47.GO{ 00:20:33.994 00000390 7b 95 3b 77 3e 43 33 84 f7 65 fc dd ed ff 80 9e {.;w>C3..e...... 00:20:33.994 000003a0 64 c0 df 0b 25 d6 6a 58 bf 51 05 a6 ef 36 e6 cd d...%.jX.Q...6.. 00:20:33.994 000003b0 c9 cc 7c 7a 70 33 82 80 5b 56 e6 46 aa 50 0f 70 ..|zp3..[V.F.P.p 00:20:33.994 000003c0 1f b1 79 6d 4f 04 77 69 52 83 da bc 82 f4 92 ab ..ymO.wiR....... 00:20:33.994 000003d0 37 a8 9e b2 16 83 d9 ed 9f 0e 43 a5 50 38 a2 80 7.........C.P8.. 00:20:33.994 000003e0 f5 04 81 9e 31 7e 71 3d 91 66 61 13 cf c2 e8 69 ....1~q=.fa....i 00:20:33.994 000003f0 e2 c6 25 3f 21 d8 73 80 b8 b0 f3 04 75 1a 73 b5 ..%?!.s.....u.s. 
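(Editor's note: the dump above is one complete exchange record: the negotiate lines select digest 3 (sha512) and dhgroup 5 (ffdhe8192), then the log prints the controller's public value ("ctrlr pubkey"), the host's public value ("host pubkey"), and the derived shared value ("dh secret"), while the DEBUG lines step the qpair through negotiate → await-negotiate → await-challenge → await-reply → await-success1 → await-success2 → done. The sketch below is a minimal, self-contained illustration of the finite-field Diffie-Hellman math that yields such a shared secret; it uses a small toy prime, not the real 8192-bit ffdhe8192 group, and it is not SPDK's nvme_auth implementation.)

```python
# Toy illustration of the DH step behind the "ctrlr pubkey"/"host pubkey"/"dh secret" dumps.
# Assumptions: toy 61-bit Mersenne prime as the modulus (the real group is ffdhe8192),
# generator 2 as in the RFC 7919 groups. Illustrative only, not SPDK code.
import secrets

p = (1 << 61) - 1   # toy prime modulus (stand-in for the ffdhe8192 prime)
g = 2               # generator

def dh_keypair():
    priv = secrets.randbelow(p - 2) + 1   # private exponent, kept local
    pub = pow(g, priv, p)                 # value each side sends (the "pubkey" dumps)
    return priv, pub

ctrlr_priv, ctrlr_pub = dh_keypair()
host_priv, host_pub = dh_keypair()

# Both sides raise the peer's public value to their own private exponent
# and arrive at the same shared value (the "dh secret" dump above).
assert pow(host_pub, ctrlr_priv, p) == pow(ctrlr_pub, host_priv, p)
```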
00:20:33.994 [2024-09-27 15:25:33.642149] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key2, hash=3, dhgroup=5, seq=3428451840, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.994 [2024-09-27 15:25:33.642255] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.994 [2024-09-27 15:25:33.720508] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.994 [2024-09-27 15:25:33.720552] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.994 [2024-09-27 15:25:33.720562] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.994 [2024-09-27 15:25:33.720589] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.994 [2024-09-27 15:25:33.907224] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.994 [2024-09-27 15:25:33.907243] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.994 [2024-09-27 15:25:33.907251] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.994 [2024-09-27 15:25:33.907303] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.994 [2024-09-27 15:25:33.907328] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.994 ctrlr pubkey: 00:20:33.994 00000000 af 44 d4 21 3e 62 04 04 9a e0 60 34 13 19 2b 9f .D.!>b....`4..+. 00:20:33.994 00000010 1c e7 44 fd 1d 93 b0 f5 6a 26 f0 61 a9 41 df c7 ..D.....j&.a.A.. 00:20:33.994 00000020 b4 31 ad 06 e7 f4 1c fc da 47 58 13 ee 11 72 2a .1.......GX...r* 00:20:33.994 00000030 b5 e8 2d 3b 22 c4 4c 1b 50 5e 1e 3b fa 5a d6 8f ..-;".L.P^.;.Z.. 00:20:33.994 00000040 70 52 0a 21 4f bf a8 a7 e2 ed c2 55 2e ad 18 c1 pR.!O......U.... 00:20:33.994 00000050 67 17 b9 14 4b 13 e3 c6 34 02 1c d0 99 1e 32 d9 g...K...4.....2. 00:20:33.994 00000060 07 ba 2b 8c 3c 41 49 0d 2b 4f 1e 2b 45 8d d2 63 ..+.... 00:20:33.994 00000110 8b 29 9d 0a e5 87 35 70 dc f4 95 f4 f9 00 e1 7d .)....5p.......} 00:20:33.994 00000120 50 05 9f 84 20 7d 3f 46 62 5b e9 b6 f7 0a c2 d4 P... }?Fb[...... 00:20:33.994 00000130 f4 12 38 97 6c 51 d0 4a 48 fa c1 58 ea 49 56 1a ..8.lQ.JH..X.IV. 00:20:33.994 00000140 ff 82 e7 5a 3e 82 ce 70 4c e7 fc 45 4d d2 f5 dd ...Z>..pL..EM... 00:20:33.994 00000150 35 bb 7c 1d 66 5f d1 c0 f6 76 3b 28 49 99 f7 ea 5.|.f_...v;(I... 00:20:33.994 00000160 64 5a 87 e6 db fe cf d9 b8 8e 76 ca 44 b1 9e 22 dZ........v.D.." 00:20:33.995 00000170 56 c6 48 5c 13 3b ac 39 b8 c2 b8 cf c7 8e eb 19 V.H\.;.9........ 00:20:33.995 00000180 5e 3e 6a 2d ce 5c 84 b2 a4 2b 41 e3 2b af ba 85 ^>j-.\...+A.+... 00:20:33.995 00000190 a5 7d e9 74 33 63 c7 8c 3d f6 2f 6a 2b 92 d0 ba .}.t3c..=./j+... 00:20:33.995 000001a0 86 af 81 93 31 84 fa 52 40 17 6b 4b c9 ee c7 aa ....1..R@.kK.... 
00:20:33.995 000001b0 18 aa 5c 9f 1b 92 ad 01 07 2d 1a d4 44 92 b3 62 ..\......-..D..b 00:20:33.995 000001c0 c4 fe 94 ff 32 09 8b 08 d1 34 27 9b 01 c4 b0 a8 ....2....4'..... 00:20:33.995 000001d0 cb ff be 8d d6 38 28 7c fd 7a e6 c8 8d 20 7b f6 .....8(|.z... {. 00:20:33.995 000001e0 39 e0 9d 4f 85 53 4a 87 21 83 6f 94 52 35 c0 e7 9..O.SJ.!.o.R5.. 00:20:33.995 000001f0 e9 4d 56 db 8e a4 cf 2f 78 b2 2c 11 da 09 ee ba .MV..../x.,..... 00:20:33.995 00000200 ae 0c 7d 09 6d 27 15 b5 bd e6 c8 59 a7 69 51 ed ..}.m'.....Y.iQ. 00:20:33.995 00000210 86 88 a7 15 dd 75 02 04 e6 e3 ed 8b 99 aa 72 8a .....u........r. 00:20:33.995 00000220 e3 f1 68 ec 0f 6f 31 24 9e 7d 5d 8e 8f 73 36 24 ..h..o1$.}]..s6$ 00:20:33.995 00000230 d7 2f 1b e2 e5 c6 40 09 0d f3 97 51 b0 64 db 35 ./....@....Q.d.5 00:20:33.995 00000240 56 8b d8 11 3e db 59 54 41 9c 4d 72 a9 0d 49 74 V...>.YTA.Mr..It 00:20:33.995 00000250 11 bb f9 a3 d7 0f 7f c0 7c 21 f7 05 d4 8e a0 9c ........|!...... 00:20:33.995 00000260 ec 67 ea 3c 1f 82 26 44 a5 45 30 b8 6c 7e ee db .g.<..&D.E0.l~.. 00:20:33.995 00000270 f7 46 df 87 1c 1f 34 a4 2c 57 e7 c0 ab 12 2f 3f .F....4.,W..../? 00:20:33.995 00000280 4d 44 6d 19 e3 f4 03 a6 58 d0 7e 03 75 cf 06 7d MDm.....X.~.u..} 00:20:33.995 00000290 f3 b1 68 1b 36 c4 d8 7c 37 3a ca 18 49 1f ac 9d ..h.6..|7:..I... 00:20:33.995 000002a0 1f b8 26 ab ff e5 d2 af 6b 30 ba c3 ad 4b cc 91 ..&.....k0...K.. 00:20:33.995 000002b0 66 94 d1 5f 0b d5 94 61 bd 96 df e2 9b 6d 8b 3b f.._...a.....m.; 00:20:33.995 000002c0 e9 54 6d 97 b9 e7 be 1e 13 39 46 92 ae d9 fc 77 .Tm......9F....w 00:20:33.995 000002d0 ef 43 32 c9 da 02 90 a6 b2 67 d4 d3 dc d9 f7 33 .C2......g.....3 00:20:33.995 000002e0 b0 1d 5b 48 eb 3a 2d ef 01 51 77 9d 80 58 0e 05 ..[H.:-..Qw..X.. 00:20:33.995 000002f0 64 4a 9f dd d7 05 31 8d eb 8f 46 65 5a be 10 24 dJ....1...FeZ..$ 00:20:33.995 00000300 84 7a 6d 05 4d f8 52 29 31 4f a0 cb bb da e0 23 .zm.M.R)1O.....# 00:20:33.995 00000310 69 86 23 77 10 42 38 5f 50 21 95 1a 61 07 ce c1 i.#w.B8_P!..a... 00:20:33.995 00000320 71 ff 8a cb a0 a4 b9 a5 08 28 b7 8a 54 0a e9 04 q........(..T... 00:20:33.995 00000330 92 9e 01 2e 1f a3 bb b5 cb 17 e8 fc 23 bf c9 a5 ............#... 00:20:33.995 00000340 e3 a1 52 c4 ca 85 94 21 47 8f a7 ae 25 ef ac 90 ..R....!G...%... 00:20:33.995 00000350 71 75 2d c6 6f b6 b5 0a 35 56 63 0c 4c 99 04 51 qu-.o...5Vc.L..Q 00:20:33.995 00000360 9a 5a fc 5b 5b a9 73 84 4a 58 56 41 3b 33 46 8d .Z.[[.s.JXVA;3F. 00:20:33.995 00000370 ad 48 a0 f1 42 ed eb 15 40 b7 9a 83 d7 af a8 41 .H..B...@......A 00:20:33.995 00000380 48 6d 2e a6 f6 d2 7d c2 a9 f9 af 9b 5c 09 45 52 Hm....}.....\.ER 00:20:33.995 00000390 23 6e 06 46 d2 14 59 07 f3 7f d9 0d 4c 06 19 57 #n.F..Y.....L..W 00:20:33.995 000003a0 af f8 f3 e9 a5 88 22 e4 d2 e8 5b b5 17 3a 27 26 ......"...[..:'& 00:20:33.995 000003b0 a4 30 e9 0d 15 9b c6 75 48 f2 2d ad e6 2f 70 df .0.....uH.-../p. 00:20:33.995 000003c0 cc e8 f9 ea 3e eb a5 50 44 6f 54 25 28 cf 4f a1 ....>..PDoT%(.O. 00:20:33.995 000003d0 59 96 a7 dc bf 6e c7 0e 52 eb 90 a6 ab a8 6b 85 Y....n..R.....k. 00:20:33.995 000003e0 dd 8e d3 7b 44 53 9e 55 86 92 12 a0 77 2b b5 19 ...{DS.U....w+.. 00:20:33.995 000003f0 8c 6b d2 05 de 27 0c eb b6 c3 e4 2c 55 70 80 f5 .k...'.....,Up.. 00:20:33.995 host pubkey: 00:20:33.995 00000000 21 7b c9 bf 49 ba 85 2e 4f 8e 33 27 95 f2 40 03 !{..I...O.3'..@. 00:20:33.995 00000010 7c 9c 6a 24 4e f5 d0 37 8c cf 0e 83 a3 3f 8f d7 |.j$N..7.....?.. 00:20:33.995 00000020 2f 01 e8 b4 04 12 d3 d3 e3 e3 45 cc 9a 0c f1 3f /.........E....? 
00:20:33.995 00000030 b4 b9 dc 86 09 73 56 4d c5 30 9c 5d 5e e2 67 69 .....sVM.0.]^.gi 00:20:33.995 00000040 ce 91 ce 0f 4d 2f da aa b6 b7 23 97 43 a0 2c 24 ....M/....#.C.,$ 00:20:33.995 00000050 3c 4f 62 9e fd 0c 5a 67 31 28 5a 5a 72 9f a5 6f ..#.-|.. 00:20:33.995 00000230 22 ea dc e7 fb 39 97 f6 36 08 32 60 05 bd e4 f7 "....9..6.2`.... 00:20:33.995 00000240 4f 97 ee 33 2f 93 94 95 aa c7 28 a0 5f dc f9 ee O..3/.....(._... 00:20:33.995 00000250 40 39 54 6e 9b e0 60 b7 c2 d6 c5 fd 8a 29 b0 34 @9Tn..`......).4 00:20:33.995 00000260 f7 eb 0f ca f0 63 42 ea c5 31 77 72 60 58 27 fa .....cB..1wr`X'. 00:20:33.995 00000270 ce 05 78 2f c9 d5 26 0e 2d f5 b4 c6 b1 0c cc 2d ..x/..&.-......- 00:20:33.995 00000280 89 63 90 75 cd c0 43 c9 7d 5b f3 24 83 22 8d 3a .c.u..C.}[.$.".: 00:20:33.995 00000290 16 46 69 ae e8 c4 53 62 55 d5 49 dc ea f7 f1 81 .Fi...SbU.I..... 00:20:33.995 000002a0 3a 84 21 8f 43 9a 9a 84 66 6f a6 65 dd 02 6d 6f :.!.C...fo.e..mo 00:20:33.995 000002b0 90 6f 95 73 29 86 93 44 5d d9 85 b6 8d cc 1f 54 .o.s)..D]......T 00:20:33.995 000002c0 53 c2 47 76 d8 e6 65 75 04 5d ea 7c 68 52 77 45 S.Gv..eu.].|hRwE 00:20:33.995 000002d0 41 0c c9 96 7e b3 2a 3b 5c 97 95 dd 6b 16 5d b9 A...~.*;\...k.]. 00:20:33.995 000002e0 14 82 b1 0f 15 40 c8 a8 ce 35 29 4b c9 7d a5 7a .....@...5)K.}.z 00:20:33.995 000002f0 c5 f2 85 4d b6 27 d9 b3 7c 90 fa 50 67 c7 69 f7 ...M.'..|..Pg.i. 00:20:33.995 00000300 3e ce 25 58 ce 65 3a b6 27 73 1b 5d 11 b0 bc 2e >.%X.e:.'s.].... 00:20:33.995 00000310 17 a2 1b 2c 6b e9 b3 a2 24 c6 f6 b0 0e 5e d2 c1 ...,k...$....^.. 00:20:33.995 00000320 1a ef c4 06 57 c7 a2 0b dd 82 cf b5 46 56 d5 5d ....W.......FV.] 00:20:33.995 00000330 63 54 5d b8 92 c7 9f f2 3b 3f 40 d1 cc 81 11 3a cT].....;?@....: 00:20:33.995 00000340 5d 7e 64 98 93 dd ed 6a 1c 0d 28 6e 3f fa 54 f7 ]~d....j..(n?.T. 00:20:33.995 00000350 c5 fc f2 d9 a0 f5 ed 44 68 5e 6e 48 57 46 10 6d .......Dh^nHWF.m 00:20:33.995 00000360 06 0b f9 3b 85 5b 76 72 20 23 bd c5 ea 1d b6 c9 ...;.[vr #...... 00:20:33.995 00000370 49 e6 6a bd 5c 6f 53 d1 49 5d 16 af 53 50 67 9b I.j.\oS.I]..SPg. 00:20:33.995 00000380 f3 4e 48 3e e4 cc b5 46 bc 88 67 6b 31 7f d3 b6 .NH>...F..gk1... 00:20:33.995 00000390 fe a2 4b fe 6e 9a 81 a8 6f aa 0d a5 26 f2 ff 45 ..K.n...o...&..E 00:20:33.995 000003a0 bc c0 50 90 07 bf 43 86 15 98 04 91 22 ab 23 45 ..P...C.....".#E 00:20:33.995 000003b0 fc df bb 6f 5b a9 15 61 a9 6d 0e 69 50 d0 81 18 ...o[..a.m.iP... 00:20:33.995 000003c0 15 bc af d4 84 e2 fb 2f 30 0d 4c ce 60 5d 01 b9 ......./0.L.`].. 00:20:33.995 000003d0 c4 3e b8 ad bb 3e d3 87 9a 82 f1 0a 0c d0 02 2d .>...>.........- 00:20:33.995 000003e0 e8 1b 8a 71 14 95 1b 09 55 ff 1a 64 4a 87 36 70 ...q....U..dJ.6p 00:20:33.995 000003f0 8f 8d 5c 31 fc 34 88 fa 2e f6 9a 4b 3d b6 8a ee ..\1.4.....K=... 00:20:33.995 dh secret: 00:20:33.995 00000000 02 26 1f 81 f9 5d d3 38 06 69 e7 bd f7 87 90 bd .&...].8.i...... 00:20:33.995 00000010 9d a0 c3 c5 eb 9b 46 aa e0 32 72 6f 8b 75 ae 9c ......F..2ro.u.. 00:20:33.995 00000020 8b 32 ad b4 ee df 9d 19 79 95 f3 9e 28 8c 82 db .2......y...(... 00:20:33.995 00000030 38 02 5c d0 30 d1 6d 5e a1 a1 f0 d9 e7 ef 94 78 8.\.0.m^.......x 00:20:33.995 00000040 e7 82 82 96 f3 75 77 e9 a2 e9 d5 70 b9 25 97 2a .....uw....p.%.* 00:20:33.995 00000050 2e ff ae b4 47 2c e1 1e 9a 80 30 86 eb 6d a7 6a ....G,....0..m.j 00:20:33.995 00000060 e2 d5 f2 8d 26 bd 4d e1 3b 58 a4 4a 35 22 85 09 ....&.M.;X.J5".. 
00:20:33.995 00000070 40 a2 52 a9 47 12 f1 7f f9 17 43 db c5 1b a1 35 @.R.G.....C....5 00:20:33.995 00000080 a0 49 56 c6 69 ef 8f a3 a3 db 60 5b 04 4d 0b a5 .IV.i.....`[.M.. 00:20:33.995 00000090 d1 95 d2 98 d3 9d 08 92 fe 49 96 50 4a 70 d1 20 .........I.PJp. 00:20:33.995 000000a0 a9 27 b1 89 c2 f6 6f 0b 3e 39 b1 4f 0a 50 de 4f .'....o.>9.O.P.O 00:20:33.995 000000b0 f5 51 85 65 59 01 68 6f 38 8e 78 77 7c 23 08 6e .Q.eY.ho8.xw|#.n 00:20:33.995 000000c0 d9 c1 4b 74 73 d5 5a c5 43 ec d7 05 a6 d1 24 aa ..Kts.Z.C.....$. 00:20:33.995 000000d0 2e 30 dc 5e 92 12 5c e7 5f 64 83 ad fc 12 14 91 .0.^..\._d...... 00:20:33.995 000000e0 00 23 19 36 ff 4a 63 00 e5 c0 a0 fd cf ce 4e e6 .#.6.Jc.......N. 00:20:33.995 000000f0 f4 05 49 ea ae 23 cc a9 27 bb e3 0e 2d 42 be 47 ..I..#..'...-B.G 00:20:33.995 00000100 6a f8 f7 2e 47 e8 06 e3 76 80 69 56 af d4 28 78 j...G...v.iV..(x 00:20:33.995 00000110 60 31 ee b2 fc ba 27 b1 da bf 05 c9 84 14 43 e9 `1....'.......C. 00:20:33.995 00000120 37 f3 0a 48 5f 0a e6 d6 5f 83 40 ed 88 11 96 c4 7..H_..._.@..... 00:20:33.995 00000130 b4 48 5b a3 46 be 9c 30 5b 1a d5 f1 19 31 74 ab .H[.F..0[....1t. 00:20:33.995 00000140 94 e5 41 32 aa 34 3d 2f 37 04 1d af b7 c5 20 f2 ..A2.4=/7..... . 00:20:33.995 00000150 2d ed 0b e1 7d ac 89 03 76 d0 ae 2b 0d c1 e6 8a -...}...v..+.... 00:20:33.995 00000160 40 f8 a7 06 eb 71 a0 02 5c df 81 9e 22 69 b9 98 @....q..\..."i.. 00:20:33.995 00000170 fd 08 62 1b 50 e1 5c 2d 4c ee 8b 3a 68 c2 cf f9 ..b.P.\-L..:h... 00:20:33.995 00000180 8e d2 19 ca 6b 70 eb 40 ec 80 98 5a c6 47 73 31 ....kp.@...Z.Gs1 00:20:33.995 00000190 ad 0e c7 16 4f 9c d9 2f ad d6 96 f0 16 cc 69 68 ....O../......ih 00:20:33.995 000001a0 a4 02 b8 01 05 ab 99 bb 6c c9 df a7 af 78 f3 20 ........l....x. 00:20:33.995 000001b0 88 d6 0d c9 cc 56 69 1d ff f0 22 29 f6 40 c7 41 .....Vi...").@.A 00:20:33.995 000001c0 e8 9a 16 9b bd 0e 73 c6 d0 ef 78 e4 30 e2 e3 3a ......s...x.0..: 00:20:33.995 000001d0 45 a9 ed 1e fc 3d 29 04 5d 46 97 a7 13 09 81 6f E....=).]F.....o 00:20:33.995 000001e0 e5 1d bd 87 4d f3 ca 23 c0 d6 0b 5e dc 82 43 06 ....M..#...^..C. 00:20:33.995 000001f0 ad 89 77 ed 79 65 25 77 86 08 cf 07 92 15 0d 9b ..w.ye%w........ 00:20:33.995 00000200 f6 17 00 c6 42 11 19 1f c1 a2 ed 98 cf 23 cb 1e ....B........#.. 00:20:33.995 00000210 ab bb 9f 96 ba c9 1d 60 63 f1 c9 77 70 d3 9d cc .......`c..wp... 00:20:33.995 00000220 a3 e9 6d 85 bc 75 c4 85 a8 2f 78 9c 36 29 42 16 ..m..u.../x.6)B. 00:20:33.995 00000230 09 4f a7 cd 34 ea dc e8 42 84 5b 4d 07 3f 83 50 .O..4...B.[M.?.P 00:20:33.995 00000240 16 6e 28 1f 98 ff 80 f8 7c 0b 31 e5 2f ed 05 91 .n(.....|.1./... 00:20:33.995 00000250 8f b2 42 46 57 a4 eb 58 11 53 3a 56 01 d6 c9 9e ..BFW..X.S:V.... 00:20:33.995 00000260 64 c3 a7 41 39 2c c6 3d b1 5e 9d c9 e9 89 9f 2f d..A9,.=.^...../ 00:20:33.995 00000270 8b 50 85 e3 1f 03 85 87 b9 ba 24 84 e4 50 5e 3a .P........$..P^: 00:20:33.995 00000280 be 50 ed f4 79 cb 66 98 f5 5a 2a 7b 50 c7 29 f5 .P..y.f..Z*{P.). 00:20:33.995 00000290 74 74 c5 30 43 3b d8 d4 ce 3d 6e f5 dc d1 7b 03 tt.0C;...=n...{. 00:20:33.995 000002a0 fb 5f bb 3b c0 7e df 47 fc bd 7c d7 95 5c e2 cb ._.;.~.G..|..\.. 00:20:33.995 000002b0 c0 63 9a 47 a2 90 2b d2 9a 23 7f da c2 30 08 b1 .c.G..+..#...0.. 00:20:33.995 000002c0 0b 49 03 ad a7 5b 1f a6 23 f7 71 ab 7a b1 73 a9 .I...[..#.q.z.s. 00:20:33.995 000002d0 8b 4c 4e 66 b5 0e b1 db 91 25 8c 8b cd aa 5c ae .LNf.....%....\. 00:20:33.996 000002e0 3f 04 b2 22 44 f3 f0 63 30 a7 45 a4 c2 61 ea 27 ?.."D..c0.E..a.' 
00:20:33.996 000002f0 b8 71 f5 ac e4 0a 10 c5 55 64 dc ac 2f 46 6f a2 .q......Ud../Fo. 00:20:33.996 00000300 b1 eb e2 41 3f c1 e6 52 8b ad 72 4b 68 59 a8 fc ...A?..R..rKhY.. 00:20:33.996 00000310 7b 07 5d bb 9f 13 5c 93 a8 3f 04 90 c5 dc ee 3b {.]...\..?.....; 00:20:33.996 00000320 c4 6e 6f 38 45 8b 56 d5 40 49 d5 4d b1 1e 3f 17 .no8E.V.@I.M..?. 00:20:33.996 00000330 5e 11 32 fc 6c f7 c0 bf 7e 88 e2 62 ae 40 ac 75 ^.2.l...~..b.@.u 00:20:33.996 00000340 e5 2a ff 16 ce 99 2e 8f 85 fe 58 42 db a1 2a fe .*........XB..*. 00:20:33.996 00000350 b1 4f f8 08 60 81 53 af 76 74 32 f4 5d 7b 18 5a .O..`.S.vt2.]{.Z 00:20:33.996 00000360 73 51 90 25 3f 12 24 38 79 77 33 e5 aa eb 85 23 sQ.%?.$8yw3....# 00:20:33.996 00000370 89 82 67 c6 ef 6a 2a 61 97 b5 35 20 69 ad cb 80 ..g..j*a..5 i... 00:20:33.996 00000380 9e d2 d2 1c 90 bf b0 41 b8 bb 81 bd ad 51 b6 89 .......A.....Q.. 00:20:33.996 00000390 14 c0 38 3f bf c8 8b 04 10 92 c3 d6 fd ab eb 18 ..8?............ 00:20:33.996 000003a0 0f 38 93 72 11 70 64 cb 7c 65 02 e2 6f c7 3b c2 .8.r.pd.|e..o.;. 00:20:33.996 000003b0 79 47 67 1d 2d 98 0c 3f 6e 3e 51 08 16 db 50 75 yGg.-..?n>Q...Pu 00:20:33.996 000003c0 bd 80 cb 0d c3 ed 4b 41 06 9d e6 b0 bb ed d5 3c ......KA.......< 00:20:33.996 000003d0 1b c8 a9 5d 19 c1 aa 26 90 fe ac de 17 1d 40 9d ...]...&......@. 00:20:33.996 000003e0 ef ab cf 22 22 3e 85 79 8e ac 58 49 42 79 a3 6e ..."">.y..XIBy.n 00:20:33.996 000003f0 61 4f a1 d6 58 5f c5 9f 4e 3d 88 29 34 c4 ff a4 aO..X_..N=.)4... 00:20:33.996 [2024-09-27 15:25:34.019028] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key3, hash=3, dhgroup=5, seq=3428451841, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.996 [2024-09-27 15:25:34.075814] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.996 [2024-09-27 15:25:34.075858] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.996 [2024-09-27 15:25:34.075875] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.996 [2024-09-27 15:25:34.075894] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success2 00:20:33.996 [2024-09-27 15:25:34.075909] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.996 [2024-09-27 15:25:34.181763] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.996 [2024-09-27 15:25:34.181780] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.996 [2024-09-27 15:25:34.181788] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.996 [2024-09-27 15:25:34.181797] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.996 [2024-09-27 15:25:34.181851] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.996 ctrlr pubkey: 00:20:33.996 00000000 af 44 d4 21 
3e 62 04 04 9a e0 60 34 13 19 2b 9f .D.!>b....`4..+. 00:20:33.996 00000010 1c e7 44 fd 1d 93 b0 f5 6a 26 f0 61 a9 41 df c7 ..D.....j&.a.A.. 00:20:33.996 00000020 b4 31 ad 06 e7 f4 1c fc da 47 58 13 ee 11 72 2a .1.......GX...r* 00:20:33.996 00000030 b5 e8 2d 3b 22 c4 4c 1b 50 5e 1e 3b fa 5a d6 8f ..-;".L.P^.;.Z.. 00:20:33.996 00000040 70 52 0a 21 4f bf a8 a7 e2 ed c2 55 2e ad 18 c1 pR.!O......U.... 00:20:33.996 00000050 67 17 b9 14 4b 13 e3 c6 34 02 1c d0 99 1e 32 d9 g...K...4.....2. 00:20:33.996 00000060 07 ba 2b 8c 3c 41 49 0d 2b 4f 1e 2b 45 8d d2 63 ..+.... 00:20:33.996 00000110 8b 29 9d 0a e5 87 35 70 dc f4 95 f4 f9 00 e1 7d .)....5p.......} 00:20:33.996 00000120 50 05 9f 84 20 7d 3f 46 62 5b e9 b6 f7 0a c2 d4 P... }?Fb[...... 00:20:33.996 00000130 f4 12 38 97 6c 51 d0 4a 48 fa c1 58 ea 49 56 1a ..8.lQ.JH..X.IV. 00:20:33.996 00000140 ff 82 e7 5a 3e 82 ce 70 4c e7 fc 45 4d d2 f5 dd ...Z>..pL..EM... 00:20:33.996 00000150 35 bb 7c 1d 66 5f d1 c0 f6 76 3b 28 49 99 f7 ea 5.|.f_...v;(I... 00:20:33.996 00000160 64 5a 87 e6 db fe cf d9 b8 8e 76 ca 44 b1 9e 22 dZ........v.D.." 00:20:33.996 00000170 56 c6 48 5c 13 3b ac 39 b8 c2 b8 cf c7 8e eb 19 V.H\.;.9........ 00:20:33.996 00000180 5e 3e 6a 2d ce 5c 84 b2 a4 2b 41 e3 2b af ba 85 ^>j-.\...+A.+... 00:20:33.996 00000190 a5 7d e9 74 33 63 c7 8c 3d f6 2f 6a 2b 92 d0 ba .}.t3c..=./j+... 00:20:33.996 000001a0 86 af 81 93 31 84 fa 52 40 17 6b 4b c9 ee c7 aa ....1..R@.kK.... 00:20:33.996 000001b0 18 aa 5c 9f 1b 92 ad 01 07 2d 1a d4 44 92 b3 62 ..\......-..D..b 00:20:33.996 000001c0 c4 fe 94 ff 32 09 8b 08 d1 34 27 9b 01 c4 b0 a8 ....2....4'..... 00:20:33.996 000001d0 cb ff be 8d d6 38 28 7c fd 7a e6 c8 8d 20 7b f6 .....8(|.z... {. 00:20:33.996 000001e0 39 e0 9d 4f 85 53 4a 87 21 83 6f 94 52 35 c0 e7 9..O.SJ.!.o.R5.. 00:20:33.996 000001f0 e9 4d 56 db 8e a4 cf 2f 78 b2 2c 11 da 09 ee ba .MV..../x.,..... 00:20:33.996 00000200 ae 0c 7d 09 6d 27 15 b5 bd e6 c8 59 a7 69 51 ed ..}.m'.....Y.iQ. 00:20:33.996 00000210 86 88 a7 15 dd 75 02 04 e6 e3 ed 8b 99 aa 72 8a .....u........r. 00:20:33.996 00000220 e3 f1 68 ec 0f 6f 31 24 9e 7d 5d 8e 8f 73 36 24 ..h..o1$.}]..s6$ 00:20:33.996 00000230 d7 2f 1b e2 e5 c6 40 09 0d f3 97 51 b0 64 db 35 ./....@....Q.d.5 00:20:33.996 00000240 56 8b d8 11 3e db 59 54 41 9c 4d 72 a9 0d 49 74 V...>.YTA.Mr..It 00:20:33.996 00000250 11 bb f9 a3 d7 0f 7f c0 7c 21 f7 05 d4 8e a0 9c ........|!...... 00:20:33.996 00000260 ec 67 ea 3c 1f 82 26 44 a5 45 30 b8 6c 7e ee db .g.<..&D.E0.l~.. 00:20:33.996 00000270 f7 46 df 87 1c 1f 34 a4 2c 57 e7 c0 ab 12 2f 3f .F....4.,W..../? 00:20:33.996 00000280 4d 44 6d 19 e3 f4 03 a6 58 d0 7e 03 75 cf 06 7d MDm.....X.~.u..} 00:20:33.996 00000290 f3 b1 68 1b 36 c4 d8 7c 37 3a ca 18 49 1f ac 9d ..h.6..|7:..I... 00:20:33.996 000002a0 1f b8 26 ab ff e5 d2 af 6b 30 ba c3 ad 4b cc 91 ..&.....k0...K.. 00:20:33.996 000002b0 66 94 d1 5f 0b d5 94 61 bd 96 df e2 9b 6d 8b 3b f.._...a.....m.; 00:20:33.996 000002c0 e9 54 6d 97 b9 e7 be 1e 13 39 46 92 ae d9 fc 77 .Tm......9F....w 00:20:33.996 000002d0 ef 43 32 c9 da 02 90 a6 b2 67 d4 d3 dc d9 f7 33 .C2......g.....3 00:20:33.996 000002e0 b0 1d 5b 48 eb 3a 2d ef 01 51 77 9d 80 58 0e 05 ..[H.:-..Qw..X.. 00:20:33.996 000002f0 64 4a 9f dd d7 05 31 8d eb 8f 46 65 5a be 10 24 dJ....1...FeZ..$ 00:20:33.996 00000300 84 7a 6d 05 4d f8 52 29 31 4f a0 cb bb da e0 23 .zm.M.R)1O.....# 00:20:33.996 00000310 69 86 23 77 10 42 38 5f 50 21 95 1a 61 07 ce c1 i.#w.B8_P!..a... 00:20:33.996 00000320 71 ff 8a cb a0 a4 b9 a5 08 28 b7 8a 54 0a e9 04 q........(..T... 
00:20:33.996 00000330 92 9e 01 2e 1f a3 bb b5 cb 17 e8 fc 23 bf c9 a5 ............#... 00:20:33.996 00000340 e3 a1 52 c4 ca 85 94 21 47 8f a7 ae 25 ef ac 90 ..R....!G...%... 00:20:33.996 00000350 71 75 2d c6 6f b6 b5 0a 35 56 63 0c 4c 99 04 51 qu-.o...5Vc.L..Q 00:20:33.996 00000360 9a 5a fc 5b 5b a9 73 84 4a 58 56 41 3b 33 46 8d .Z.[[.s.JXVA;3F. 00:20:33.996 00000370 ad 48 a0 f1 42 ed eb 15 40 b7 9a 83 d7 af a8 41 .H..B...@......A 00:20:33.996 00000380 48 6d 2e a6 f6 d2 7d c2 a9 f9 af 9b 5c 09 45 52 Hm....}.....\.ER 00:20:33.996 00000390 23 6e 06 46 d2 14 59 07 f3 7f d9 0d 4c 06 19 57 #n.F..Y.....L..W 00:20:33.996 000003a0 af f8 f3 e9 a5 88 22 e4 d2 e8 5b b5 17 3a 27 26 ......"...[..:'& 00:20:33.996 000003b0 a4 30 e9 0d 15 9b c6 75 48 f2 2d ad e6 2f 70 df .0.....uH.-../p. 00:20:33.996 000003c0 cc e8 f9 ea 3e eb a5 50 44 6f 54 25 28 cf 4f a1 ....>..PDoT%(.O. 00:20:33.996 000003d0 59 96 a7 dc bf 6e c7 0e 52 eb 90 a6 ab a8 6b 85 Y....n..R.....k. 00:20:33.996 000003e0 dd 8e d3 7b 44 53 9e 55 86 92 12 a0 77 2b b5 19 ...{DS.U....w+.. 00:20:33.996 000003f0 8c 6b d2 05 de 27 0c eb b6 c3 e4 2c 55 70 80 f5 .k...'.....,Up.. 00:20:33.996 host pubkey: 00:20:33.996 00000000 60 3d ca f7 d3 e1 de 93 39 cb 48 ca 50 7d be f3 `=......9.H.P}.. 00:20:33.996 00000010 5a 9f 43 df 86 c9 69 b8 e5 57 2a 91 f4 98 01 5e Z.C...i..W*....^ 00:20:33.996 00000020 39 04 69 af 98 d9 a0 40 24 ef 83 14 14 25 11 a0 9.i....@$....%.. 00:20:33.996 00000030 41 f8 32 ca 30 00 63 e4 29 e2 13 0c 92 68 3b da A.2.0.c.)....h;. 00:20:33.996 00000040 e9 b8 ec 5e bc 6c 93 2f 22 ea ac 00 ff d2 af 3a ...^.l./"......: 00:20:33.996 00000050 64 15 2d ab c3 87 1c 03 5d 07 22 93 00 93 f3 f1 d.-.....]."..... 00:20:33.996 00000060 b7 f7 ff cd cd 20 44 c2 94 24 ce 6e b4 8d 19 67 ..... D..$.n...g 00:20:33.996 00000070 3b d1 d1 7d 00 3d e6 39 a0 59 d2 3b 63 6a 76 cb ;..}.=.9.Y.;cjv. 00:20:33.996 00000080 05 a9 1f ce cb ef 17 1a 9e 3f a5 e2 54 74 f1 e6 .........?..Tt.. 00:20:33.996 00000090 95 38 6d 30 ea 88 4c d7 a9 63 a0 7d 04 a5 5e 67 .8m0..L..c.}..^g 00:20:33.996 000000a0 3f 72 09 27 11 2e 33 0b 43 47 ae df 75 f7 42 5f ?r.'..3.CG..u.B_ 00:20:33.996 000000b0 48 c8 7d d4 bb 9e a3 6c 56 b2 27 a1 ab f0 7a 9b H.}....lV.'...z. 00:20:33.996 000000c0 d8 d4 52 8e 5f 15 54 02 a6 45 37 fa 00 37 a4 0c ..R._.T..E7..7.. 00:20:33.996 000000d0 39 0b 06 eb ce 2d 8f 5c fa f0 f0 64 11 7e 9c c3 9....-.\...d.~.. 00:20:33.996 000000e0 72 17 63 e7 d0 76 8c 65 b5 23 74 14 4e b8 3b bd r.c..v.e.#t.N.;. 00:20:33.996 000000f0 06 97 c2 79 6b f6 e9 00 bc 8c 99 41 09 e2 4b c6 ...yk......A..K. 00:20:33.996 00000100 d3 b9 ac c8 b2 b8 24 9f fe c2 8a d8 0c d2 a1 38 ......$........8 00:20:33.996 00000110 6c d7 a6 4d d3 7a f5 22 72 90 25 c4 74 48 b8 8b l..M.z."r.%.tH.. 00:20:33.996 00000120 c8 41 2a a3 d1 6e 74 b1 90 3d e3 af d7 2c ba 2d .A*..nt..=...,.- 00:20:33.996 00000130 ca 84 9d 73 c4 1c fc cc 2a 31 f9 16 0d 47 2d 58 ...s....*1...G-X 00:20:33.996 00000140 11 1a 65 79 25 b8 5e 53 18 a2 ec 9a c1 79 eb dc ..ey%.^S.....y.. 00:20:33.996 00000150 2f cc 0b 08 af f3 67 97 78 85 c8 3e 86 81 47 4a /.....g.x..>..GJ 00:20:33.996 00000160 41 d6 d2 08 f8 a2 77 08 87 6f 4e 11 10 9b b5 b8 A.....w..oN..... 00:20:33.996 00000170 cf 62 cd a1 af 89 98 b2 a2 5e 64 f9 81 3f 67 42 .b.......^d..?gB 00:20:33.996 00000180 71 15 6b 2c 6b 21 bd 5d 93 84 9f d1 f5 9e 67 13 q.k,k!.]......g. 00:20:33.996 00000190 0e dd 9c 94 69 3b dc 6d db bc 53 67 7a 98 5b e1 ....i;.m..Sgz.[. 00:20:33.996 000001a0 e2 1a 19 47 18 3f ad e5 36 b1 a7 e1 3f 3d 6e f0 ...G.?..6...?=n. 
00:20:33.996 000001b0 a6 0d 42 77 e5 f5 77 dd 8f 2f cb 7c 2b 46 9e 18 ..Bw..w../.|+F.. 00:20:33.996 000001c0 02 aa 41 9c f8 5d 49 e1 54 77 29 43 97 34 58 63 ..A..]I.Tw)C.4Xc 00:20:33.996 000001d0 d4 51 f8 58 00 d1 fa 50 79 1a 0d 4c da ed 03 91 .Q.X...Py..L.... 00:20:33.996 000001e0 9b 1a 13 b8 e6 ed 45 40 4e 52 d1 dd aa c7 b2 8b ......E@NR...... 00:20:33.996 000001f0 22 39 19 cb 86 2c 95 ae 8a 95 9c b6 19 21 51 34 "9...,.......!Q4 00:20:33.996 00000200 cf eb e7 23 f4 d6 e6 c1 f0 ec 40 7c 8e 14 21 9a ...#......@|..!. 00:20:33.996 00000210 a6 c1 d3 98 0c 96 3e ac 62 cb 10 bb b5 eb 1d 3f ......>.b......? 00:20:33.996 00000220 75 0c 92 c1 1a f3 4d 03 a8 a9 dd 74 f5 57 cb 3a u.....M....t.W.: 00:20:33.996 00000230 7e d3 40 9a 52 f9 2b b6 0b 66 e7 18 ec ed 10 a7 ~.@.R.+..f...... 00:20:33.997 00000240 08 e0 45 9f 62 7c eb 3c 4b 4b b3 2c 29 a0 9b 67 ..E.b|.Qx.A...... 00:20:33.997 00000020 8e a6 81 f9 8f 98 95 99 66 cd 75 98 bd 94 6a 67 ........f.u...jg 00:20:33.997 00000030 6b 95 1c 70 d9 82 7a b1 55 0b b9 5f b9 e2 1a 2f k..p..z.U.._.../ 00:20:33.997 00000040 fb 70 e2 b7 38 d2 fd 46 10 a8 91 7f de cf 0c 17 .p..8..F........ 00:20:33.997 00000050 f2 f4 d9 5f f2 dc 39 5f ef c6 bb 9d fe a4 0d 29 ..._..9_.......) 00:20:33.997 00000060 f5 c1 99 c0 e7 f3 1d 2c 5a cf da 17 af 99 07 bd .......,Z....... 00:20:33.997 00000070 6a cc 6e 05 6f c9 a1 67 2c e9 c1 f2 a4 9b 37 25 j.n.o..g,.....7% 00:20:33.997 00000080 27 3e 2d 2a ff f3 e1 6b a3 fb 80 d7 ab ac cf dc '>-*...k........ 00:20:33.997 00000090 7a a8 31 f5 9d f6 40 ad c0 25 12 89 5e 78 f1 4d z.1...@..%..^x.M 00:20:33.997 000000a0 d6 b3 4a 12 6e 3b 31 8f 06 2f 6e c1 ff ed a7 95 ..J.n;1../n..... 00:20:33.997 000000b0 86 38 22 bc 90 51 be d9 05 5e 40 00 18 e5 7e e6 .8"..Q...^@...~. 00:20:33.997 000000c0 87 dc 73 c9 17 b2 17 c0 20 04 1d e2 5e 19 de f0 ..s..... ...^... 00:20:33.997 000000d0 f0 70 17 90 94 7b ba ac 09 53 98 c2 f1 67 d0 4c .p...{...S...g.L 00:20:33.997 000000e0 20 6e bf 57 92 4e 1a 85 5f 11 43 fd 21 14 f0 ed n.W.N.._.C.!... 00:20:33.997 000000f0 9d 32 f7 9c 26 0f 63 9a ab 7e d5 cd 50 c4 37 7c .2..&.c..~..P.7| 00:20:33.997 00000100 40 15 a5 b5 ff 01 21 53 be 30 70 b3 d4 50 3f 6a @.....!S.0p..P?j 00:20:33.997 00000110 68 56 bf 23 b6 f6 43 98 0e 91 97 17 69 86 fc 8a hV.#..C.....i... 00:20:33.997 00000120 aa fb e5 79 6c 9c 60 3a e1 8b 6c a9 6c ae b6 6b ...yl.`:..l.l..k 00:20:33.997 00000130 ce 82 e7 1c fe b3 06 27 d7 a1 6c 98 30 a1 4e 4e .......'..l.0.NN 00:20:33.997 00000140 24 7c bd 5a 4f f1 6d d1 aa f0 9f 4d ce 6d a1 4f $|.ZO.m....M.m.O 00:20:33.997 00000150 16 b4 a5 4b 53 03 bb d5 61 83 6f 2c c0 ec e1 36 ...KS...a.o,...6 00:20:33.997 00000160 5f 5d 2a e9 6a b0 42 83 d1 1d 7b 49 c6 90 22 bc _]*.j.B...{I..". 00:20:33.997 00000170 e3 4a 98 bc ed a3 ed 08 71 34 29 f7 06 20 d3 70 .J......q4).. .p 00:20:33.997 00000180 b5 15 65 81 42 e9 15 7b 1a 14 e3 f6 a1 7e fa 83 ..e.B..{.....~.. 00:20:33.997 00000190 dc 0c fd f6 71 87 35 7c 71 8e 22 6e 44 cd 36 d5 ....q.5|q."nD.6. 00:20:33.997 000001a0 cc 5a 9b 81 3e 4c 48 bb 58 48 01 94 3b c6 97 31 .Z..>LH.XH..;..1 00:20:33.997 000001b0 39 4c f5 b7 a8 09 e8 da dd 17 07 1d 2a b2 82 ee 9L..........*... 00:20:33.997 000001c0 3d 06 9c db 14 74 30 7b f8 8b 74 a8 2c b7 0f b0 =....t0{..t.,... 00:20:33.997 000001d0 94 45 64 57 3a 49 bd ce c1 22 86 7b a6 55 a2 fb .EdW:I...".{.U.. 00:20:33.997 000001e0 64 07 59 a7 b2 92 b7 39 69 8d 9f 23 47 95 ff 52 d.Y....9i..#G..R 00:20:33.997 000001f0 bb 05 96 d5 a1 54 81 be a7 f1 20 ad 37 62 5b 70 .....T.... 
.7b[p 00:20:33.997 00000200 8d bf 3c 16 50 7d 4a a6 ca 33 35 15 c7 1c a3 bc ..<.P}J..35..... 00:20:33.997 00000210 71 a5 2e e5 c7 bc 52 05 d9 8d be 0d ba 67 53 07 q.....R......gS. 00:20:33.997 00000220 f4 51 27 70 a1 9f aa 37 d0 88 29 ca c9 4f 39 22 .Q'p...7..)..O9" 00:20:33.997 00000230 c5 f9 0d f4 33 64 c0 27 53 e2 ad 3b 9c d9 f0 82 ....3d.'S..;.... 00:20:33.997 00000240 cf 76 8b 92 32 ce 93 96 24 17 17 4e 21 32 9d 1c .v..2...$..N!2.. 00:20:33.997 00000250 5b dc 6f b4 0f 77 dd 4a 70 b1 28 d6 25 2e ea 91 [.o..w.Jp.(.%... 00:20:33.997 00000260 9c d9 b7 b6 0d 1a 49 15 74 5d 62 71 96 32 b6 32 ......I.t]bq.2.2 00:20:33.997 00000270 2f 18 42 e4 01 8e c6 7c 1b f6 34 be bf 2e a0 fc /.B....|..4..... 00:20:33.997 00000280 ba 4b 89 b7 40 d1 80 fb 9b c3 3d 95 47 39 29 98 .K..@.....=.G9). 00:20:33.997 00000290 d0 6a af 2b 60 c1 19 7e 9d a4 2e ec 55 1d d3 5d .j.+`..~....U..] 00:20:33.997 000002a0 2c 10 62 32 ec d5 a2 fa 31 aa 01 ab 5d 90 ea 3d ,.b2....1...]..= 00:20:33.997 000002b0 fb 66 44 1e 1a 77 c2 9f b2 e4 92 ab 8c 9f 2a 25 .fD..w........*% 00:20:33.997 000002c0 b7 43 b0 d5 ac d0 d0 d4 4c cc a0 a3 c1 8c a4 7b .C......L......{ 00:20:33.997 000002d0 54 54 87 a6 78 b4 a1 35 d0 55 ef 77 ab 71 9f 8f TT..x..5.U.w.q.. 00:20:33.997 000002e0 75 f2 ab 53 65 8b b2 6a 61 3f bf 61 ec 45 2b db u..Se..ja?.a.E+. 00:20:33.997 000002f0 3e 37 92 fe b0 e6 61 e3 c1 4c 23 c4 3e 97 79 3e >7....a..L#.>.y> 00:20:33.997 00000300 3a e8 94 4d 1c 8a 13 db c5 b6 ee f6 05 27 e3 aa :..M.........'.. 00:20:33.997 00000310 1a cb 93 d2 97 7b 6f 41 b7 0c ed b4 c5 55 ec 3e .....{oA.....U.> 00:20:33.997 00000320 05 0b af 38 e5 1e 5e 6f 20 53 df 23 5b a2 8a b5 ...8..^o S.#[... 00:20:33.997 00000330 70 9b 15 9e 3c 5e 6b fd d4 58 fb 26 fa 8a 54 00 p...<^k..X.&..T. 00:20:33.997 00000340 62 0a f0 26 d4 3e 97 fd 56 26 4e be 85 d9 f0 f1 b..&.>..V&N..... 00:20:33.997 00000350 0d de 8c 51 e9 5b e9 e9 53 1a 8e 71 53 6a e8 77 ...Q.[..S..qSj.w 00:20:33.997 00000360 9d b1 74 cf ed 5b 9f b0 34 f7 9f 53 8a 51 4a 34 ..t..[..4..S.QJ4 00:20:33.997 00000370 92 b9 c5 e3 1e ec 80 b1 35 40 11 fa ef b5 40 11 ........5@....@. 00:20:33.997 00000380 95 a1 6f f3 a5 7a da 33 d3 b5 d4 f9 34 d1 64 59 ..o..z.3....4.dY 00:20:33.997 00000390 0d eb d0 d5 cf c7 95 85 be b6 3a 21 76 f2 19 83 ..........:!v... 00:20:33.997 000003a0 53 3b f6 16 cd b3 3e b8 18 19 99 e3 94 09 cd 99 S;....>......... 00:20:33.997 000003b0 40 de ad 6c 3f c1 63 7d 15 6b ec 26 2c cb ef d7 @..l?.c}.k.&,... 00:20:33.997 000003c0 0a 9f 45 d2 ff eb f5 cf bb e4 87 80 e8 41 e3 ae ..E..........A.. 00:20:33.997 000003d0 ae 51 ed 07 b1 d3 94 b2 27 d7 0d 09 37 95 11 1f .Q......'...7... 00:20:33.997 000003e0 38 b6 c4 f8 99 52 7f a0 67 de b9 67 b9 46 f9 af 8....R..g..g.F.. 00:20:33.997 000003f0 62 57 e4 8f d1 03 5f 6f d4 04 d2 ec 93 f5 f6 ee bW...._o........ 
00:20:33.997 [2024-09-27 15:25:34.298970] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key3, hash=3, dhgroup=5, seq=3428451842, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.997 [2024-09-27 15:25:34.299095] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:33.997 [2024-09-27 15:25:34.380720] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:33.997 [2024-09-27 15:25:34.380763] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:33.997 [2024-09-27 15:25:34.380774] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success2 00:20:33.997 [2024-09-27 15:25:34.380800] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:33.997 [2024-09-27 15:25:34.575019] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:33.997 [2024-09-27 15:25:34.575043] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 3 (sha512) 00:20:33.997 [2024-09-27 15:25:34.575050] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 5 (ffdhe8192) 00:20:33.997 [2024-09-27 15:25:34.575099] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:33.997 [2024-09-27 15:25:34.575123] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:33.997 ctrlr pubkey: 00:20:33.997 00000000 7d 95 c5 4a 97 fa 35 71 7f 03 99 68 31 a6 76 aa }..J..5q...h1.v. 00:20:33.997 00000010 fd fb 29 4d 8e 67 41 77 d3 d1 d1 18 cd f5 00 ed ..)M.gAw........ 00:20:33.997 00000020 98 f6 81 5e 79 84 8a eb d4 11 3a ce e6 cf 7b c7 ...^y.....:...{. 00:20:33.997 00000030 a9 7b e4 31 45 59 a9 f2 d5 9a df 7a c8 49 2b 8e .{.1EY.....z.I+. 00:20:33.997 00000040 22 77 91 a3 07 3c 87 ce 45 7d 68 eb a4 15 63 03 "w...<..E}h...c. 00:20:33.997 00000050 6c 5f ee 21 fe aa 21 ff ac 6d a1 2c 14 5a 0e dc l_.!..!..m.,.Z.. 00:20:33.997 00000060 f4 a0 dc cb 8e d1 77 d4 b0 01 7b 06 49 11 13 2b ......w...{.I..+ 00:20:33.997 00000070 e6 20 6d f2 93 a0 ed 7b ed af 2a ef 90 74 f8 e5 . m....{..*..t.. 00:20:33.997 00000080 4f 8a aa 22 ad dd ff 06 bd c3 42 9f 56 81 73 2a O.."......B.V.s* 00:20:33.997 00000090 f4 6e 78 1a 7e 64 ca 49 a2 58 d5 34 be b7 60 dc .nx.~d.I.X.4..`. 00:20:33.997 000000a0 3e d0 cb c2 12 eb ca 7d 7c 85 2c 03 25 97 33 c1 >......}|.,.%.3. 00:20:33.997 000000b0 64 15 f8 04 c3 77 1b 55 d2 39 13 a6 c1 27 75 5b d....w.U.9...'u[ 00:20:33.997 000000c0 08 9c 59 4a 86 d8 e5 9e 37 9c 37 ed f0 06 ad 18 ..YJ....7.7..... 00:20:33.997 000000d0 34 99 c6 52 77 16 85 8f 73 4a 0c c0 49 39 4f 9b 4..Rw...sJ..I9O. 00:20:33.997 000000e0 75 6e bf 68 ae 7e bc cb ad 1f d7 3d c7 50 00 9f un.h.~.....=.P.. 00:20:33.997 000000f0 6e 12 b3 f9 20 54 a5 50 c0 a7 a7 b0 86 09 d0 5b n... T.P.......[ 00:20:33.997 00000100 66 fd 1c a0 0a 0e ba cd 3d b0 ab 83 34 6c 51 d5 f.......=...4lQ. 
00:20:33.997 00000110 84 fc da b3 f5 ff da e6 77 85 71 5e d0 da 9e de ........w.q^.... 00:20:33.997 00000120 45 93 80 ad 12 91 8e e9 09 92 02 77 5e 27 38 bb E..........w^'8. 00:20:33.997 00000130 27 76 eb 33 d5 fb a7 2b 04 70 99 13 32 18 bf ce 'v.3...+.p..2... 00:20:33.997 00000140 d9 6f 24 ba 32 3e f6 da 6c 7a 18 e4 44 7f b2 39 .o$.2>..lz..D..9 00:20:33.997 00000150 a8 b8 1c be cc ed 1d b6 52 b7 6f f4 f5 7c ec a6 ........R.o..|.. 00:20:33.997 00000160 55 8e a6 0d 83 4f 5d 52 8f 40 44 6a 97 a1 36 1c U....O]R.@Dj..6. 00:20:33.997 00000170 4a f1 d8 6a 5d 1e 27 51 4e 2a 7b 14 31 09 77 58 J..j].'QN*{.1.wX 00:20:33.997 00000180 f9 03 12 42 46 3d cc 63 86 07 30 c1 33 f7 0b 76 ...BF=.c..0.3..v 00:20:33.997 00000190 eb d2 d1 67 27 ec 27 05 39 17 70 c3 6a d4 ef 8f ...g'.'.9.p.j... 00:20:33.997 000001a0 16 4b dd ec 75 ed 53 78 4c 46 fc 3e 2b 85 57 10 .K..u.SxLF.>+.W. 00:20:33.997 000001b0 51 c5 c3 38 bc 7e eb ca ae 6d 8a 8b af 73 1e 09 Q..8.~...m...s.. 00:20:33.997 000001c0 05 1e 26 03 80 db 30 3f cb e6 4f 68 f7 bd 22 91 ..&...0?..Oh..". 00:20:33.997 000001d0 d8 a1 9c ef dc 52 5f 01 1a 76 7d 35 14 83 b8 3e .....R_..v}5...> 00:20:33.997 000001e0 3b 0e 9d 99 fc 4a 0c 04 9c 01 8b 06 e3 29 5c 96 ;....J.......)\. 00:20:33.997 000001f0 21 63 d4 a1 fd 4f cf b8 af ef 6e d2 b0 f0 08 d4 !c...O....n..... 00:20:33.997 00000200 f4 ea 94 2e 03 f2 1c b7 9c 09 71 20 3d dc 00 0c ..........q =... 00:20:33.997 00000210 4a fc a3 c5 2f 9a 5b 95 8f f4 db 8f 4c 27 90 ea J.../.[.....L'.. 00:20:33.997 00000220 fa 38 d9 36 b2 85 b0 da 4d 19 54 15 fc 81 d2 69 .8.6....M.T....i 00:20:33.997 00000230 15 93 67 f2 06 9a 9d c4 41 05 63 45 d4 0f df 6a ..g.....A.cE...j 00:20:33.997 00000240 57 51 6c be 72 7c 60 46 01 6a 55 22 4d 3d 00 a9 WQl.r|`F.jU"M=.. 00:20:33.997 00000250 9f db 2f 41 3f 57 3e e4 96 48 94 b0 e0 51 63 a9 ../A?W>..H...Qc. 00:20:33.997 00000260 57 86 fc ea 3c b0 d1 ce 20 d1 37 72 57 b4 0e ea W...<... .7rW... 00:20:33.997 00000270 3a 78 8f d7 f4 f9 12 c6 16 c8 ca c9 81 5b f3 ac :x...........[.. 00:20:33.997 00000280 2f c3 35 b6 38 05 c3 3e 74 85 a5 3e f7 c6 71 9e /.5.8..>t..>..q. 00:20:33.997 00000290 84 56 b0 94 12 be e4 b3 55 25 ed 1a ed a1 14 34 .V......U%.....4 00:20:33.997 000002a0 4b e2 b4 c5 82 d6 6c 37 ff ca 6b 79 dc db be f5 K.....l7..ky.... 00:20:33.997 000002b0 31 78 64 68 f8 d6 09 ef 24 a5 a3 d5 69 73 bd 10 1xdh....$...is.. 00:20:33.997 000002c0 f3 88 d4 37 82 ae af 25 c7 f9 58 7f 46 1e 00 b6 ...7...%..X.F... 00:20:33.997 000002d0 3b 1f f3 b7 c2 f8 05 9c d0 3e 1c e5 4a ab 84 62 ;........>..J..b 00:20:33.998 000002e0 c5 0d 7b 0e 06 7e 79 14 75 94 25 34 45 01 66 e3 ..{..~y.u.%4E.f. 00:20:33.998 000002f0 73 cb a7 c8 b5 c8 f3 65 7b 63 d1 8a a1 89 d6 21 s......e{c.....! 00:20:33.998 00000300 24 ef 49 08 78 f2 d4 c7 68 db a3 e6 13 09 3c be $.I.x...h.....<. 00:20:33.998 00000310 2e 13 64 bc 30 57 21 12 58 d9 f6 e9 23 79 29 3c ..d.0W!.X...#y)< 00:20:33.998 00000320 a7 d9 ce f1 ee 93 d0 31 5a 68 dc 7c aa 86 25 9d .......1Zh.|..%. 00:20:33.998 00000330 17 f2 de 14 06 d9 bf 04 65 c3 05 82 e7 92 20 b9 ........e..... . 00:20:33.998 00000340 bf d7 5f 6b 79 20 54 86 b0 33 2b f9 cb 2f eb 25 .._ky T..3+../.% 00:20:33.998 00000350 b0 96 21 fb be 22 61 f5 e3 2a 4f a3 e7 c2 3f 32 ..!.."a..*O...?2 00:20:33.998 00000360 8b 91 c0 23 f9 df 6d c8 33 39 96 a6 de 55 68 d9 ...#..m.39...Uh. 
00:20:33.998 00000370 b0 14 01 ab 3f 8f 16 38 b8 0f ac 7a 61 11 7d 6b ....?..8...za.}k 00:20:33.998 00000380 5d 08 08 37 46 13 76 45 c5 f6 d2 fe 0f a9 b3 25 ]..7F.vE.......% 00:20:33.998 00000390 9d c7 4d 6c 2d 2a 1e 13 90 e1 23 5a 75 17 f1 d2 ..Ml-*....#Zu... 00:20:33.998 000003a0 09 59 fc a7 2e 3e f8 de eb 3f 36 0d 00 aa 08 ca .Y...>...?6..... 00:20:33.998 000003b0 5a 45 51 61 0e 88 0a 69 d9 59 04 f4 eb 15 9b b3 ZEQa...i.Y...... 00:20:33.998 000003c0 e6 1a 16 88 53 02 20 50 56 6a fd 47 fd e8 99 bb ....S. PVj.G.... 00:20:33.998 000003d0 ff b7 5c 71 a5 18 47 6a 5b a4 00 ad 08 b7 c4 6e ..\q..Gj[......n 00:20:33.998 000003e0 b0 36 2d 03 24 3d 83 a3 ee c8 88 d7 56 89 04 20 .6-.$=......V.. 00:20:33.998 000003f0 e6 32 b5 e6 c4 8d 2d f2 45 3e f3 a0 d2 db 67 c2 .2....-.E>....g. 00:20:33.998 host pubkey: 00:20:33.998 00000000 2d d4 9c d4 0c d9 b5 27 fc 3d f6 5f d3 d9 54 6b -......'.=._..Tk 00:20:33.998 00000010 c6 d2 c2 2d 69 a7 ef 72 fe e4 e8 9c b3 4a 5e d8 ...-i..r.....J^. 00:20:33.998 00000020 c5 0b 94 82 81 4a 7b 6f 81 fe 0a 40 9c c5 d9 10 .....J{o...@.... 00:20:33.998 00000030 27 20 38 81 e2 e1 20 9c fb 8a 25 76 d7 ec 91 08 ' 8... ...%v.... 00:20:33.998 00000040 21 da 21 d9 08 55 de 43 e2 24 ac 8b e7 dd 05 15 !.!..U.C.$...... 00:20:33.998 00000050 24 64 f6 b9 80 60 e5 b9 4c a6 e6 39 94 4f 14 20 $d...`..L..9.O. 00:20:33.998 00000060 6a 48 21 51 4c 7c 66 85 74 ae 17 47 f5 ff fd 48 jH!QL|f.t..G...H 00:20:33.998 00000070 9c 62 4b 08 4a 49 28 3d 0e 2b 44 f2 38 fe 06 fe .bK.JI(=.+D.8... 00:20:33.998 00000080 96 b6 eb 27 68 65 65 c8 eb c5 9d d5 54 b3 12 9f ...'hee.....T... 00:20:33.998 00000090 f2 29 cf c8 cb 84 b9 4e 2e 7e f7 f6 55 5b 88 71 .).....N.~..U[.q 00:20:33.998 000000a0 f4 eb 4a 19 26 1f 30 c3 b8 4c 14 ef a3 45 cb d3 ..J.&.0..L...E.. 00:20:33.998 000000b0 34 a4 5c c7 c7 a1 e6 dd 64 c0 53 58 a3 5a 3f 2c 4.\.....d.SX.Z?, 00:20:33.998 000000c0 ab 7a 9b 3f 4d 4a 61 e4 3c 04 77 39 ae b6 8e 0c .z.?MJa.<.w9.... 00:20:33.998 000000d0 4e 4b 4d 6d 2f e9 31 c0 9b 55 8f 63 b0 fe ab 26 NKMm/.1..U.c...& 00:20:33.998 000000e0 44 04 6e 65 a8 00 f2 3b bb 77 98 53 f4 56 35 62 D.ne...;.w.S.V5b 00:20:33.998 000000f0 44 d8 75 14 fd 78 7a ae 1b b2 07 c0 bf f8 9e c8 D.u..xz......... 00:20:33.998 00000100 80 03 2e c6 08 f3 de eb b2 13 38 75 3d 99 7e 73 ..........8u=.~s 00:20:33.998 00000110 e6 23 b5 0c cc c3 7e 54 05 43 02 61 75 a6 79 bb .#....~T.C.au.y. 00:20:33.998 00000120 ef 5b 06 d7 07 93 b8 44 36 d5 06 62 cd 7d 98 35 .[.....D6..b.}.5 00:20:33.998 00000130 96 47 8b e0 a8 a6 7f 9a 34 bf 8e cc 5b 8d 2b c2 .G......4...[.+. 00:20:33.998 00000140 cd 7c ed ee e0 e4 7c 93 76 f6 6e 99 64 8e ef 24 .|....|.v.n.d..$ 00:20:33.998 00000150 03 1f 2f 62 97 05 b0 4a 09 41 30 97 e2 83 9f eb ../b...J.A0..... 00:20:33.998 00000160 92 cc 18 9f e5 2d 7a e0 15 91 4c 48 2e 32 bc a6 .....-z...LH.2.. 00:20:33.998 00000170 c7 e3 87 f0 88 86 d5 9f 19 ec 72 35 29 0d 6c be ..........r5).l. 00:20:33.998 00000180 47 a6 64 5a 98 e7 13 d9 ef 62 1c 95 b7 da a7 9a G.dZ.....b...... 00:20:33.998 00000190 27 84 f7 25 3d d0 56 c6 5b 4c 9a 54 43 60 1a dc '..%=.V.[L.TC`.. 00:20:33.998 000001a0 7f c8 86 34 c7 c6 56 b2 f4 de b9 ce ef 84 dd 35 ...4..V........5 00:20:33.998 000001b0 a2 bf 24 fa 25 9b 33 39 41 95 0d bd de 4b 45 47 ..$.%.39A....KEG 00:20:33.998 000001c0 e2 33 59 07 a9 e8 27 36 1e 4b 11 ea ab 7a d0 83 .3Y...'6.K...z.. 00:20:33.998 000001d0 d2 f4 cf d0 16 f5 7a c8 d2 b0 1a 9c 4d 1c 0b 63 ......z.....M..c 00:20:33.998 000001e0 8f 69 ab f4 c2 49 73 2c 98 6c 54 1d b8 8f 04 b6 .i...Is,.lT..... 
00:20:33.998 000001f0 e0 aa 54 af 83 76 03 45 f0 7e e1 3a 0d 56 d9 78 ..T..v.E.~.:.V.x 00:20:33.998 00000200 7c ad e0 22 bf 00 9f e9 60 c0 9a 4e 69 20 a6 4d |.."....`..Ni .M 00:20:33.998 00000210 71 0c 1e 62 cf 5c 46 ea 9c 40 fd 11 be 8e 65 14 q..b.\F..@....e. 00:20:33.998 00000220 43 97 67 ae 03 5e 90 b4 63 16 0c 8b 35 14 fd ab C.g..^..c...5... 00:20:33.998 00000230 63 03 c5 a0 a6 27 6f 79 f2 fd 70 98 fa 74 0a 2a c....'oy..p..t.* 00:20:33.998 00000240 08 29 7a 59 6f f8 1e 08 f0 ae 3e e2 f1 80 f3 88 .)zYo.....>..... 00:20:33.998 00000250 df 61 54 3b ce 1b 07 03 85 8d c8 cf 63 21 1e 4b .aT;........c!.K 00:20:33.998 00000260 b0 c1 7d 91 e4 6f 8d e0 e7 a4 63 46 90 06 ed dc ..}..o....cF.... 00:20:33.998 00000270 85 20 33 9e 66 28 3e 91 33 a7 be 32 31 7e 0f 43 . 3.f(>.3..21~.C 00:20:33.998 00000280 4e 3d 83 bf 60 9c 9a d4 61 cf 71 0d 22 32 e3 ed N=..`...a.q."2.. 00:20:33.998 00000290 6c 07 45 2f 7a a0 52 f7 dd 7c 9d c5 5e 75 fa 6d l.E/z.R..|..^u.m 00:20:33.998 000002a0 08 cf aa c9 1f 2a 91 f3 b6 b5 cd 92 00 c0 6f 18 .....*........o. 00:20:33.998 000002b0 4d 00 53 60 54 f3 a0 78 78 ae 20 0e c8 da 2f ef M.S`T..xx. .../. 00:20:33.998 000002c0 81 ab 35 a5 b3 b6 36 2c 37 a1 a4 c5 6d 5e b0 37 ..5...6,7...m^.7 00:20:33.998 000002d0 0c 46 59 45 6f 49 5c 70 9c fe b3 4a dd 1e a9 a1 .FYEoI\p...J.... 00:20:33.998 000002e0 c9 55 3f 75 fe 85 79 0e 7f 7a 4c f6 60 70 41 b5 .U?u..y..zL.`pA. 00:20:33.998 000002f0 d5 37 e9 d5 d1 ed 21 7a 9b 8d 28 f5 31 5c 59 c2 .7....!z..(.1\Y. 00:20:33.998 00000300 9b 04 32 66 7d 47 a2 3a 20 7b 1b 26 26 b3 d9 2a ..2f}G.: {.&&..* 00:20:33.998 00000310 83 98 4d 97 43 86 34 1e ff 19 2b 54 59 3f a1 78 ..M.C.4...+TY?.x 00:20:33.998 00000320 2c 24 e0 4e e0 c3 6e 4a 92 e4 f1 a9 66 fb de 83 ,$.N..nJ....f... 00:20:33.998 00000330 c9 42 b2 95 e8 2d 62 8c 60 6d 8d 4c fc a8 34 69 .B...-b.`m.L..4i 00:20:33.998 00000340 87 a2 01 d7 51 e4 07 37 49 4b 6e f2 cc 9a 90 91 ....Q..7IKn..... 00:20:33.998 00000350 5e 47 b4 2b 3c 5a ce a3 5d 27 9a b0 a7 80 46 71 ^G.+.y..).E+.. 00:20:33.998 00000070 41 b4 5f 5e 78 6c d9 82 23 fb 9e 13 21 13 c4 d8 A._^xl..#...!... 00:20:33.998 00000080 ef 01 c2 e5 8c 6d 25 41 5c 7a bd d9 0c 4e 9b 90 .....m%A\z...N.. 00:20:33.998 00000090 4d d5 6f df b2 57 a8 34 fb 5f be 1b 87 03 2c 44 M.o..W.4._....,D 00:20:33.998 000000a0 56 1f 6b 1e 42 38 55 0b 39 a0 40 88 e1 00 b1 1e V.k.B8U.9.@..... 00:20:33.998 000000b0 a8 73 14 11 46 e5 7b 8c 7f 72 09 12 78 95 19 bf .s..F.{..r..x... 00:20:33.998 000000c0 6e d0 15 cc 90 5d 19 bd f7 10 99 c8 d4 e1 e8 ee n....].......... 00:20:33.998 000000d0 f4 f2 ee 5f 25 64 5e ed aa d7 f4 3b 6c d8 26 b5 ..._%d^....;l.&. 00:20:33.998 000000e0 58 b4 95 d9 cc f9 0e 0d 28 e0 1c b1 7f ae f1 b2 X.......(....... 00:20:33.998 000000f0 3a 17 f3 f3 8f ad 03 d7 c9 93 59 59 33 49 49 70 :.........YY3IIp 00:20:33.998 00000100 73 68 81 4c 6d bf 78 21 64 72 81 f8 ef 6d 29 88 sh.Lm.x!dr...m). 00:20:33.998 00000110 9d 1c c9 ce 2a 77 84 be 38 fa 8a 3b b5 87 77 be ....*w..8..;..w. 00:20:33.998 00000120 66 d5 d9 5f 54 cb c3 9e e9 13 9c c2 3c f5 6e 22 f.._T.......<.n" 00:20:33.998 00000130 95 db 47 0c 56 74 25 e2 4b 5c 1c 44 4b 61 17 50 ..G.Vt%.K\.DKa.P 00:20:33.998 00000140 37 2e a8 6f 0e 66 d5 52 2a a1 b1 d2 9d b6 29 83 7..o.f.R*.....). 00:20:33.998 00000150 8a b3 de dd b1 c1 80 0c 19 95 b5 b6 05 76 06 d4 .............v.. 00:20:33.998 00000160 c4 2d 1f ee 75 ee 04 cb 86 18 e5 85 c5 d8 45 23 .-..u.........E# 00:20:33.998 00000170 31 d1 ae 6a 33 46 c0 62 78 38 e0 4d 01 23 77 d0 1..j3F.bx8.M.#w. 
00:20:33.998 00000180 4d ca 1d 7a bb 0b 17 c0 c2 c0 58 2f 02 91 98 0c M..z......X/.... 00:20:33.998 00000190 e9 36 bc 16 95 6f 04 be 0b 82 f9 8b ad 1e e7 b9 .6...o.......... 00:20:33.998 000001a0 65 d0 25 90 3b 37 d5 0d 11 0c fc d7 c7 da 87 22 e.%.;7........." 00:20:33.998 000001b0 d9 36 c4 f7 75 e6 7c 71 b9 f0 8d 5a 4b 2e ad 7c .6..u.|q...ZK..| 00:20:33.998 000001c0 e5 81 82 9c a9 60 26 e6 d3 15 58 ea 34 ce aa 69 .....`&...X.4..i 00:20:33.998 000001d0 93 ec 4a cc 1d 56 6a ac 92 ff b3 04 3c db 39 3e ..J..Vj.....<.9> 00:20:33.998 000001e0 00 1a 52 14 a3 2b c0 1e 22 3e cd 15 4d c7 0e 74 ..R..+..">..M..t 00:20:33.998 000001f0 b0 8f da 74 15 4c b5 7e ec 8e a2 d0 b8 2b ef 9b ...t.L.~.....+.. 00:20:33.998 00000200 57 9d af ef 0a 7e f1 07 65 25 bd 9d 19 0d a3 41 W....~..e%.....A 00:20:33.998 00000210 92 05 51 8e 5e 91 a1 1c 3d 3f af e2 17 49 22 1f ..Q.^...=?...I". 00:20:33.998 00000220 e1 a2 a4 ac 1a 2c 5b 73 66 16 ea c5 90 2d 99 e6 .....,[sf....-.. 00:20:33.998 00000230 6b 8e ce fc e1 cf 58 bf 96 80 47 11 f3 ef 9d 8c k.....X...G..... 00:20:33.998 00000240 f2 23 9b 96 64 e5 cf 07 64 db 3b ff 90 c5 42 3b .#..d...d.;...B; 00:20:33.998 00000250 2a fb 0d 8d d5 af de 85 1d 13 8e 3f 23 69 37 d4 *..........?#i7. 00:20:33.998 00000260 93 dc 4c 92 f6 c6 e0 f6 78 44 52 0b ca 29 07 e8 ..L.....xDR..).. 00:20:33.998 00000270 54 c6 0b 1f ef 29 3c df ed 62 d5 2e c3 e9 bf 03 T....)<..b...... 00:20:33.998 00000280 d5 8b eb 8e ba 48 7d f6 a1 9a 88 85 6f b4 c3 18 .....H}.....o... 00:20:33.998 00000290 30 91 ee 73 e8 b2 27 09 6a 05 cb 70 43 ce 31 b9 0..s..'.j..pC.1. 00:20:33.998 000002a0 12 7a 1a 5e c9 ad 95 a0 3d ab ca 7f 4a 23 43 98 .z.^....=...J#C. 00:20:33.998 000002b0 8a b5 e6 7e eb e0 fa 43 66 7c 1b 01 19 0e ff f3 ...~...Cf|...... 00:20:33.998 000002c0 ac 48 68 08 b0 3e 56 26 41 7e 10 ad 09 b6 3b c3 .Hh..>V&A~....;. 00:20:33.998 000002d0 1a b6 4d 11 e8 22 7c d0 19 c6 40 05 b6 ce 80 5e ..M.."|...@....^ 00:20:33.998 000002e0 a8 48 37 00 bb 7b ec b0 cb ba 52 3d 87 a5 fc 24 .H7..{....R=...$ 00:20:33.998 000002f0 38 4a 4a 87 c4 87 b9 15 00 e8 f4 34 c7 8f d2 fb 8JJ........4.... 00:20:33.998 00000300 bb 65 8e a4 f8 ba a4 fc b4 d1 e4 08 3e f1 2d 17 .e..........>.-. 00:20:33.998 00000310 0a 80 ef b1 f4 8d 47 55 51 c3 04 b6 e2 dd 46 42 ......GUQ.....FB 00:20:33.998 00000320 19 1b 83 d5 2b f7 8c 80 a0 8d f4 d6 1c a7 69 61 ....+.........ia 00:20:33.998 00000330 3b 52 55 36 f4 7a f6 60 ca c0 d3 64 e3 af 03 02 ;RU6.z.`...d.... 00:20:33.998 00000340 89 46 a9 63 f6 20 08 13 51 45 b5 36 12 43 ec 66 .F.c. ..QE.6.C.f 00:20:33.998 00000350 82 ba 08 23 d2 df 4c ec b8 cc 21 40 f4 d1 ec f6 ...#..L...!@.... 00:20:33.998 00000360 86 67 cf c5 42 e8 cd 58 2d 8d 0a cb 05 ed 60 7b .g..B..X-.....`{ 00:20:33.998 00000370 bc 31 90 d4 83 df 0e 38 3b fb 8f 27 cf 67 33 e8 .1.....8;..'.g3. 00:20:33.998 00000380 27 dc b7 8f f2 20 14 d9 ac 38 38 44 ee d7 ed ba '.... ...88D.... 00:20:33.998 00000390 a2 8c 85 a8 59 8e 4b 6a 75 66 e5 2b 58 37 69 4a ....Y.Kjuf.+X7iJ 00:20:33.998 000003a0 56 d3 d6 ff 40 2e 93 8b 9e e8 c9 fe 61 68 30 fa V...@.......ah0. 00:20:33.998 000003b0 ed fd fb 81 33 8e 6f da 1f be ae 93 85 47 51 4d ....3.o......GQM 00:20:33.998 000003c0 43 a5 44 6d e1 10 c3 55 50 8c 66 bf ee ab 1d 3e C.Dm...UP.f....> 00:20:33.998 000003d0 75 56 06 99 7e 65 6f 99 2d 14 5a 65 65 1b db c2 uV..~eo.-.Zee... 00:20:33.998 000003e0 77 69 ec f4 8b 95 15 9a 42 71 d6 dd dc 57 11 2b wi......Bq...W.+ 00:20:33.998 000003f0 81 50 21 35 5c 94 a7 d1 ec 1d ac 51 d4 43 a9 21 .P!5\......Q.C.! 
00:20:33.998 [2024-09-27 15:25:34.691903] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key4, hash=3, dhgroup=5, seq=3428451843, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:33.998 [2024-09-27 15:25:34.754983] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:33.998 [2024-09-27 15:25:34.755017] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:33.998 [2024-09-27 15:25:34.755034] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] authentication completed successfully 00:20:33.998 [2024-09-27 15:25:34.755041] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:33.998 [2024-09-27 15:25:34.861386] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: negotiate 00:20:33.998 [2024-09-27 15:25:34.861407] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] digest: 3 (sha512) 00:20:33.998 [2024-09-27 15:25:34.861414] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] dhgroup: 5 (ffdhe8192) 00:20:33.998 [2024-09-27 15:25:34.861424] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-negotiate 00:20:33.999 [2024-09-27 15:25:34.861482] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-challenge 00:20:33.999 ctrlr pubkey: 00:20:33.999 00000000 7d 95 c5 4a 97 fa 35 71 7f 03 99 68 31 a6 76 aa }..J..5q...h1.v. 00:20:33.999 00000010 fd fb 29 4d 8e 67 41 77 d3 d1 d1 18 cd f5 00 ed ..)M.gAw........ 00:20:33.999 00000020 98 f6 81 5e 79 84 8a eb d4 11 3a ce e6 cf 7b c7 ...^y.....:...{. 00:20:33.999 00000030 a9 7b e4 31 45 59 a9 f2 d5 9a df 7a c8 49 2b 8e .{.1EY.....z.I+. 00:20:33.999 00000040 22 77 91 a3 07 3c 87 ce 45 7d 68 eb a4 15 63 03 "w...<..E}h...c. 00:20:33.999 00000050 6c 5f ee 21 fe aa 21 ff ac 6d a1 2c 14 5a 0e dc l_.!..!..m.,.Z.. 00:20:33.999 00000060 f4 a0 dc cb 8e d1 77 d4 b0 01 7b 06 49 11 13 2b ......w...{.I..+ 00:20:33.999 00000070 e6 20 6d f2 93 a0 ed 7b ed af 2a ef 90 74 f8 e5 . m....{..*..t.. 00:20:33.999 00000080 4f 8a aa 22 ad dd ff 06 bd c3 42 9f 56 81 73 2a O.."......B.V.s* 00:20:33.999 00000090 f4 6e 78 1a 7e 64 ca 49 a2 58 d5 34 be b7 60 dc .nx.~d.I.X.4..`. 00:20:33.999 000000a0 3e d0 cb c2 12 eb ca 7d 7c 85 2c 03 25 97 33 c1 >......}|.,.%.3. 00:20:33.999 000000b0 64 15 f8 04 c3 77 1b 55 d2 39 13 a6 c1 27 75 5b d....w.U.9...'u[ 00:20:33.999 000000c0 08 9c 59 4a 86 d8 e5 9e 37 9c 37 ed f0 06 ad 18 ..YJ....7.7..... 00:20:33.999 000000d0 34 99 c6 52 77 16 85 8f 73 4a 0c c0 49 39 4f 9b 4..Rw...sJ..I9O. 00:20:33.999 000000e0 75 6e bf 68 ae 7e bc cb ad 1f d7 3d c7 50 00 9f un.h.~.....=.P.. 00:20:33.999 000000f0 6e 12 b3 f9 20 54 a5 50 c0 a7 a7 b0 86 09 d0 5b n... T.P.......[ 00:20:33.999 00000100 66 fd 1c a0 0a 0e ba cd 3d b0 ab 83 34 6c 51 d5 f.......=...4lQ. 00:20:33.999 00000110 84 fc da b3 f5 ff da e6 77 85 71 5e d0 da 9e de ........w.q^.... 00:20:33.999 00000120 45 93 80 ad 12 91 8e e9 09 92 02 77 5e 27 38 bb E..........w^'8. 
00:20:33.999 00000130 27 76 eb 33 d5 fb a7 2b 04 70 99 13 32 18 bf ce 'v.3...+.p..2... 00:20:33.999 00000140 d9 6f 24 ba 32 3e f6 da 6c 7a 18 e4 44 7f b2 39 .o$.2>..lz..D..9 00:20:33.999 00000150 a8 b8 1c be cc ed 1d b6 52 b7 6f f4 f5 7c ec a6 ........R.o..|.. 00:20:33.999 00000160 55 8e a6 0d 83 4f 5d 52 8f 40 44 6a 97 a1 36 1c U....O]R.@Dj..6. 00:20:33.999 00000170 4a f1 d8 6a 5d 1e 27 51 4e 2a 7b 14 31 09 77 58 J..j].'QN*{.1.wX 00:20:33.999 00000180 f9 03 12 42 46 3d cc 63 86 07 30 c1 33 f7 0b 76 ...BF=.c..0.3..v 00:20:33.999 00000190 eb d2 d1 67 27 ec 27 05 39 17 70 c3 6a d4 ef 8f ...g'.'.9.p.j... 00:20:33.999 000001a0 16 4b dd ec 75 ed 53 78 4c 46 fc 3e 2b 85 57 10 .K..u.SxLF.>+.W. 00:20:33.999 000001b0 51 c5 c3 38 bc 7e eb ca ae 6d 8a 8b af 73 1e 09 Q..8.~...m...s.. 00:20:33.999 000001c0 05 1e 26 03 80 db 30 3f cb e6 4f 68 f7 bd 22 91 ..&...0?..Oh..". 00:20:33.999 000001d0 d8 a1 9c ef dc 52 5f 01 1a 76 7d 35 14 83 b8 3e .....R_..v}5...> 00:20:33.999 000001e0 3b 0e 9d 99 fc 4a 0c 04 9c 01 8b 06 e3 29 5c 96 ;....J.......)\. 00:20:33.999 000001f0 21 63 d4 a1 fd 4f cf b8 af ef 6e d2 b0 f0 08 d4 !c...O....n..... 00:20:33.999 00000200 f4 ea 94 2e 03 f2 1c b7 9c 09 71 20 3d dc 00 0c ..........q =... 00:20:33.999 00000210 4a fc a3 c5 2f 9a 5b 95 8f f4 db 8f 4c 27 90 ea J.../.[.....L'.. 00:20:33.999 00000220 fa 38 d9 36 b2 85 b0 da 4d 19 54 15 fc 81 d2 69 .8.6....M.T....i 00:20:33.999 00000230 15 93 67 f2 06 9a 9d c4 41 05 63 45 d4 0f df 6a ..g.....A.cE...j 00:20:33.999 00000240 57 51 6c be 72 7c 60 46 01 6a 55 22 4d 3d 00 a9 WQl.r|`F.jU"M=.. 00:20:33.999 00000250 9f db 2f 41 3f 57 3e e4 96 48 94 b0 e0 51 63 a9 ../A?W>..H...Qc. 00:20:33.999 00000260 57 86 fc ea 3c b0 d1 ce 20 d1 37 72 57 b4 0e ea W...<... .7rW... 00:20:33.999 00000270 3a 78 8f d7 f4 f9 12 c6 16 c8 ca c9 81 5b f3 ac :x...........[.. 00:20:33.999 00000280 2f c3 35 b6 38 05 c3 3e 74 85 a5 3e f7 c6 71 9e /.5.8..>t..>..q. 00:20:33.999 00000290 84 56 b0 94 12 be e4 b3 55 25 ed 1a ed a1 14 34 .V......U%.....4 00:20:33.999 000002a0 4b e2 b4 c5 82 d6 6c 37 ff ca 6b 79 dc db be f5 K.....l7..ky.... 00:20:33.999 000002b0 31 78 64 68 f8 d6 09 ef 24 a5 a3 d5 69 73 bd 10 1xdh....$...is.. 00:20:33.999 000002c0 f3 88 d4 37 82 ae af 25 c7 f9 58 7f 46 1e 00 b6 ...7...%..X.F... 00:20:33.999 000002d0 3b 1f f3 b7 c2 f8 05 9c d0 3e 1c e5 4a ab 84 62 ;........>..J..b 00:20:33.999 000002e0 c5 0d 7b 0e 06 7e 79 14 75 94 25 34 45 01 66 e3 ..{..~y.u.%4E.f. 00:20:33.999 000002f0 73 cb a7 c8 b5 c8 f3 65 7b 63 d1 8a a1 89 d6 21 s......e{c.....! 00:20:33.999 00000300 24 ef 49 08 78 f2 d4 c7 68 db a3 e6 13 09 3c be $.I.x...h.....<. 00:20:33.999 00000310 2e 13 64 bc 30 57 21 12 58 d9 f6 e9 23 79 29 3c ..d.0W!.X...#y)< 00:20:33.999 00000320 a7 d9 ce f1 ee 93 d0 31 5a 68 dc 7c aa 86 25 9d .......1Zh.|..%. 00:20:33.999 00000330 17 f2 de 14 06 d9 bf 04 65 c3 05 82 e7 92 20 b9 ........e..... . 00:20:33.999 00000340 bf d7 5f 6b 79 20 54 86 b0 33 2b f9 cb 2f eb 25 .._ky T..3+../.% 00:20:33.999 00000350 b0 96 21 fb be 22 61 f5 e3 2a 4f a3 e7 c2 3f 32 ..!.."a..*O...?2 00:20:33.999 00000360 8b 91 c0 23 f9 df 6d c8 33 39 96 a6 de 55 68 d9 ...#..m.39...Uh. 00:20:33.999 00000370 b0 14 01 ab 3f 8f 16 38 b8 0f ac 7a 61 11 7d 6b ....?..8...za.}k 00:20:33.999 00000380 5d 08 08 37 46 13 76 45 c5 f6 d2 fe 0f a9 b3 25 ]..7F.vE.......% 00:20:33.999 00000390 9d c7 4d 6c 2d 2a 1e 13 90 e1 23 5a 75 17 f1 d2 ..Ml-*....#Zu... 00:20:33.999 000003a0 09 59 fc a7 2e 3e f8 de eb 3f 36 0d 00 aa 08 ca .Y...>...?6..... 
00:20:33.999 000003b0 5a 45 51 61 0e 88 0a 69 d9 59 04 f4 eb 15 9b b3 ZEQa...i.Y...... 00:20:33.999 000003c0 e6 1a 16 88 53 02 20 50 56 6a fd 47 fd e8 99 bb ....S. PVj.G.... 00:20:33.999 000003d0 ff b7 5c 71 a5 18 47 6a 5b a4 00 ad 08 b7 c4 6e ..\q..Gj[......n 00:20:33.999 000003e0 b0 36 2d 03 24 3d 83 a3 ee c8 88 d7 56 89 04 20 .6-.$=......V.. 00:20:33.999 000003f0 e6 32 b5 e6 c4 8d 2d f2 45 3e f3 a0 d2 db 67 c2 .2....-.E>....g. 00:20:33.999 host pubkey: 00:20:33.999 00000000 32 e9 96 32 8c 3b 9b 13 b9 df ee d7 e5 cf cc cc 2..2.;.......... 00:20:33.999 00000010 a2 af 60 0c 6c fe 46 2b 08 65 50 0d 93 a0 e8 8f ..`.l.F+.eP..... 00:20:33.999 00000020 25 f2 56 50 ed 2a a2 56 10 ff a3 1a 2f b2 63 c6 %.VP.*.V..../.c. 00:20:33.999 00000030 af 62 d4 17 2f 0a 40 5a 62 3b 97 5d ee 0c 89 7f .b../.@Zb;.].... 00:20:33.999 00000040 30 35 b1 f6 67 6a 61 35 ea 3b 65 1d cf 09 b9 5e 05..gja5.;e....^ 00:20:33.999 00000050 08 d0 66 20 e2 51 ba 6b 68 c0 df 76 35 a2 5e 74 ..f .Q.kh..v5.^t 00:20:33.999 00000060 41 53 35 4b 86 6f b4 f6 dd 74 68 12 b2 83 e9 21 AS5K.o...th....! 00:20:33.999 00000070 fd bf b9 ed aa db 74 9c 81 a7 2f c3 73 d9 04 89 ......t.../.s... 00:20:33.999 00000080 06 c1 02 df d4 c2 5f dd 0c f9 0a 09 4c 56 75 ae ......_.....LVu. 00:20:33.999 00000090 d1 3d 3b 3e e7 10 2b 1b 71 43 eb 66 63 99 c3 25 .=;>..+.qC.fc..% 00:20:33.999 000000a0 7b fd 6e 43 4d be 13 75 3f 69 3e db cf af bb 9d {.nCM..u?i>..... 00:20:33.999 000000b0 1a d1 fa b4 3a 43 b1 b2 9b a7 24 9b a9 f6 61 f6 ....:C....$...a. 00:20:33.999 000000c0 8f 7b 4a b4 e8 54 ab 6b cd 33 69 5b 4c 7c a1 27 .{J..T.k.3i[L|.' 00:20:33.999 000000d0 35 b7 a4 a1 72 61 14 08 12 f7 f7 a3 d5 35 71 0b 5...ra.......5q. 00:20:33.999 000000e0 06 fc ad dc 03 e9 46 eb f0 2a 4c 56 1c 3f d6 42 ......F..*LV.?.B 00:20:33.999 000000f0 07 02 e3 b0 2d 0c 9a f7 e4 f4 79 a9 ad 32 18 0a ....-.....y..2.. 00:20:33.999 00000100 11 16 ce e1 ce 4c 83 2d 76 9a c7 31 b3 ab 69 46 .....L.-v..1..iF 00:20:33.999 00000110 bb 0d 78 97 14 da a8 43 2b 41 4f c0 d2 15 99 fa ..x....C+AO..... 00:20:33.999 00000120 bd 7a ac be 03 f4 3a 6e 7f 02 6b ff b8 84 6e 00 .z....:n..k...n. 00:20:33.999 00000130 4e 77 57 4c c0 28 2d 78 97 a1 2f 91 99 c7 bb 38 NwWL.(-x../....8 00:20:33.999 00000140 a2 17 30 94 6e 17 e2 f5 b1 00 92 06 12 33 21 41 ..0.n........3!A 00:20:33.999 00000150 b8 8e 58 82 81 9c e7 be 1e 1d 3f 1d dc 15 b1 7b ..X.......?....{ 00:20:33.999 00000160 95 e5 b9 8d 84 77 01 72 61 73 ba 35 02 87 1f db .....w.ras.5.... 00:20:33.999 00000170 1d 97 61 a7 96 09 b8 52 82 a7 43 3d 65 18 a3 96 ..a....R..C=e... 00:20:33.999 00000180 9e d1 96 7b 04 e1 f1 9b 1e 8f 89 98 4b fc 31 08 ...{........K.1. 00:20:33.999 00000190 1d ac ea bd 4f 82 14 fc 24 fd 99 39 0b eb d0 2f ....O...$..9.../ 00:20:33.999 000001a0 ae 06 a8 6d 7a ba 9f 55 c2 b8 60 55 94 a0 ec 83 ...mz..U..`U.... 00:20:33.999 000001b0 30 bc 4b 59 31 88 ca ef 57 92 f7 a5 8e 18 39 93 0.KY1...W.....9. 00:20:33.999 000001c0 76 bb 8e ba 6f 47 d0 86 3e f4 ab 1d 20 d8 68 86 v...oG..>... .h. 00:20:33.999 000001d0 5c d2 98 11 3a 6d a5 04 56 a3 68 a5 36 b4 fe bd \...:m..V.h.6... 00:20:33.999 000001e0 c6 39 22 f8 fc 7a 64 c7 7f 08 f5 4c e9 01 f7 ea .9"..zd....L.... 00:20:33.999 000001f0 e5 1f ee d0 8d bd 33 fe 75 fe fe 2b c6 fe e4 06 ......3.u..+.... 00:20:33.999 00000200 42 b8 cd 43 ea 95 6a f2 4d e3 4b e5 f6 be 2e b0 B..C..j.M.K..... 
00:20:33.999 00000210 19 5f 17 28 57 f2 37 51 ec bd 3b 17 d2 9a 13 72 ._.(W.7Q..;....r 00:20:33.999 00000220 2b 5b c3 63 28 b6 45 85 d3 4e de 67 57 31 18 50 +[.c(.E..N.gW1.P 00:20:33.999 00000230 48 98 26 64 4a fe da b4 e2 c4 be 41 50 0c 13 a6 H.&dJ......AP... 00:20:33.999 00000240 70 87 25 50 55 ff 09 fa 1b 20 ae 9c f8 17 fc b6 p.%PU.... ...... 00:20:33.999 00000250 81 72 38 08 3e 75 d4 8d 84 22 e1 b3 b9 05 7d fd .r8.>u..."....}. 00:20:33.999 00000260 02 d8 0e 7f 93 14 82 3b 45 03 36 c3 78 5c 41 ce .......;E.6.x\A. 00:20:33.999 00000270 82 e0 66 1d 5c 59 db 19 63 5c 4e 4c 2f c0 94 51 ..f.\Y..c\NL/..Q 00:20:33.999 00000280 e9 dc 9c 1d 4f 7d 39 e0 8a ed 62 cf 9a c1 2e 6c ....O}9...b....l 00:20:33.999 00000290 b9 1c 2e 97 86 3d 37 d7 90 b7 48 c6 81 ca 2c af .....=7...H...,. 00:20:33.999 000002a0 51 8e b3 99 5b d3 d0 a2 2b 79 4a cc 0c e3 5d 29 Q...[...+yJ...]) 00:20:33.999 000002b0 73 4f 85 62 8a ff ae 46 50 a8 cc 28 72 d6 49 81 sO.b...FP..(r.I. 00:20:33.999 000002c0 1e d8 2e ee 73 4c 97 25 b0 c1 d5 dd 14 80 95 eb ....sL.%........ 00:20:33.999 000002d0 d5 15 cc 79 57 89 b8 eb c2 ee bd 8e bb c1 3e 7d ...yW.........>} 00:20:33.999 000002e0 d0 86 68 75 4d 5f ef 52 34 b1 bf 38 c2 28 c7 85 ..huM_.R4..8.(.. 00:20:33.999 000002f0 42 b7 a7 cf 18 66 a7 1d 04 57 86 e7 1d 06 9f d4 B....f...W...... 00:20:33.999 00000300 ce 18 86 c1 87 56 81 35 77 72 a3 5c 77 3e 98 13 .....V.5wr.\w>.. 00:20:33.999 00000310 c9 d5 e6 12 d0 0a 56 ce 02 8c 79 ee 42 34 7c 0b ......V...y.B4|. 00:20:33.999 00000320 23 7e f0 c2 9c 9f 9c fe 47 6c 7f 46 80 4b 0c 6a #~......Gl.F.K.j 00:20:33.999 00000330 9a 36 9e e6 de 87 e6 3f 55 53 74 9b f9 d4 9c fe .6.....?USt..... 00:20:33.999 00000340 4b b2 72 af 1b bf 64 00 6b c6 98 7d 9e 29 97 be K.r...d.k..}.).. 00:20:33.999 00000350 4b bf ff 62 72 ca f0 64 32 ee 03 ca aa 85 cf a4 K..br..d2....... 00:20:33.999 00000360 cd a8 41 f0 14 98 65 76 77 f7 78 ab 1c 9e 3f 26 ..A...evw.x...?& 00:20:33.999 00000370 cb f2 58 42 06 9a 18 cd cb 49 eb c5 de 59 14 25 ..XB.....I...Y.% 00:20:33.999 00000380 c3 33 39 9e 70 b9 7a df 40 c9 2f f1 28 69 67 5e .39.p.z.@./.(ig^ 00:20:33.999 00000390 0e 1a ee a2 28 84 61 5b ad 8f 7d 6b 43 38 bb 47 ....(.a[..}kC8.G 00:20:33.999 000003a0 0d 56 20 39 de d1 e5 42 06 37 44 d4 e6 38 db e6 .V 9...B.7D..8.. 00:20:33.999 000003b0 41 92 72 a6 a1 ec 14 08 43 d0 f2 a6 db e2 51 c7 A.r.....C.....Q. 00:20:33.999 000003c0 89 64 e6 15 d7 31 37 5d 90 3b 63 4d fb 39 55 0f .d...17].;cM.9U. 00:20:33.999 000003d0 f0 ad d9 7e c4 37 09 51 65 22 84 53 e6 cf 39 31 ...~.7.Qe".S..91 00:20:34.000 000003e0 74 5f f6 7e b4 cf 81 7f 0a a6 89 36 e4 3a 0e d2 t_.~.......6.:.. 00:20:34.000 000003f0 92 89 c2 86 21 d7 de 4c af 75 f0 48 29 82 21 06 ....!..L.u.H).!. 00:20:34.000 dh secret: 00:20:34.000 00000000 6e c6 58 44 a8 16 e9 a7 af 2e 31 8f 36 69 81 70 n.XD......1.6i.p 00:20:34.000 00000010 a5 b9 18 18 58 97 49 f9 61 dd 1d fe de 1b fc 15 ....X.I.a....... 00:20:34.000 00000020 9a e1 52 91 2e 4b 40 ee cc 19 8f 4a 6e ca dc 31 ..R..K@....Jn..1 00:20:34.000 00000030 26 0d 4b e1 da de ce fa 7c 5c 35 3b 08 b5 5e 0b &.K.....|\5;..^. 00:20:34.000 00000040 1f 7a 26 6c 40 d3 7e a3 b2 98 be 6f 67 5e eb fe .z&l@.~....og^.. 00:20:34.000 00000050 78 dc 4e 7c de d2 da 78 cf 4e 35 6d 56 98 29 60 x.N|...x.N5mV.)` 00:20:34.000 00000060 4e f9 ec 1d 25 07 f3 f5 9e c2 97 fc 36 e3 63 f4 N...%.......6.c. 00:20:34.000 00000070 25 97 f9 8a d9 e1 77 6b 30 e0 36 82 86 e0 ba 6b %.....wk0.6....k 00:20:34.000 00000080 7d e5 4a a2 85 ed 4e 5d 13 07 6d ae 3e ef c4 f3 }.J...N]..m.>... 
00:20:34.000 00000090 8b a1 ea d2 bb 15 10 7d be 9e 9e ea 66 ed 46 8d .......}....f.F. 00:20:34.000 000000a0 19 7e 3a 69 61 d2 0e 97 b4 f6 90 9a 70 b1 69 62 .~:ia.......p.ib 00:20:34.000 000000b0 4c a4 a1 59 83 fe 83 08 87 4d c2 d8 76 35 be e9 L..Y.....M..v5.. 00:20:34.000 000000c0 94 d9 59 2e 59 88 a4 6e ab 64 ea ea 19 6c 24 8b ..Y.Y..n.d...l$. 00:20:34.000 000000d0 47 50 15 30 d0 b6 a8 67 e1 4e e5 f0 19 dd 2d 9c GP.0...g.N....-. 00:20:34.000 000000e0 c7 a4 2e 32 ab 31 7b c2 47 3f 0b e1 46 a3 06 c4 ...2.1{.G?..F... 00:20:34.000 000000f0 33 1d 62 65 36 04 86 8f e4 41 d5 77 cd 44 46 a4 3.be6....A.w.DF. 00:20:34.000 00000100 8f f7 0c f4 3c a8 59 5a d3 89 f7 be 34 a0 e7 aa ....<.YZ....4... 00:20:34.000 00000110 e7 62 f7 ba 53 ff 52 f4 48 86 eb 29 71 1d a9 77 .b..S.R.H..)q..w 00:20:34.000 00000120 85 ea 4a e8 c5 44 fa a1 83 99 d7 09 13 35 db 30 ..J..D.......5.0 00:20:34.000 00000130 6f 5e b7 cd 27 b0 e1 61 ee db 72 3b cb ab fb d9 o^..'..a..r;.... 00:20:34.000 00000140 1a d7 c2 3e e1 56 a5 7c 02 08 a6 17 c0 0d 98 1e ...>.V.|........ 00:20:34.000 00000150 b2 54 b9 6e 04 63 07 22 83 66 36 53 59 0d 32 4b .T.n.c.".f6SY.2K 00:20:34.000 00000160 bc dc 93 d2 13 f7 ed 2e 77 18 11 b2 b7 58 5a 89 ........w....XZ. 00:20:34.000 00000170 e0 bd 5f ef 23 aa 03 ea 2f 0a c8 81 ba 52 92 33 .._.#.../....R.3 00:20:34.000 00000180 bc 9b 9b b2 99 37 b3 3f 62 36 dd de d9 05 71 0d .....7.?b6....q. 00:20:34.000 00000190 d6 b4 f2 01 c1 02 ab ad cd 1f 17 2d 1b 0d 48 d3 ...........-..H. 00:20:34.000 000001a0 ee 1c 77 b4 0e e0 83 2c c5 25 c3 4e 5f 36 c1 c0 ..w....,.%.N_6.. 00:20:34.000 000001b0 ca b4 99 21 70 ed ea b5 51 04 0c 91 06 cc d1 b7 ...!p...Q....... 00:20:34.000 000001c0 5c 6c 8e 1a 3f 7c cf e5 b9 e5 d1 f3 f2 95 ef b2 \l..?|.......... 00:20:34.000 000001d0 be 4e 3b 07 29 71 f7 3a 07 6f 65 d5 31 44 ad df .N;.)q.:.oe.1D.. 00:20:34.000 000001e0 a8 26 75 dd d0 30 6d 9f 67 30 aa 81 38 6c c0 ee .&u..0m.g0..8l.. 00:20:34.000 000001f0 18 a5 f2 c7 60 1f df b1 86 48 7d c2 7b 1d dc 7b ....`....H}.{..{ 00:20:34.000 00000200 a6 19 a8 e5 27 66 fd 46 e6 ac a5 e2 dc 4a f5 1a ....'f.F.....J.. 00:20:34.000 00000210 58 16 f6 77 ec 83 e9 45 9c f7 4e f6 60 b3 55 9a X..w...E..N.`.U. 00:20:34.000 00000220 4d 23 4d 1e 53 89 72 f9 5d ed 85 fc 18 d1 f4 dc M#M.S.r.]....... 00:20:34.000 00000230 6d 54 05 1b 60 c9 24 51 1d 85 ab 00 fb e7 c4 23 mT..`.$Q.......# 00:20:34.000 00000240 f0 2a f6 79 ad 22 e2 3b d0 98 a3 f9 4f cd 33 0c .*.y.".;....O.3. 00:20:34.000 00000250 0a 2b 02 7d 69 ab f8 1d 4a 6f 60 b8 d8 1d a2 2d .+.}i...Jo`....- 00:20:34.000 00000260 f7 47 aa c8 2c 18 95 e2 8e 88 12 5e 0c ba e3 80 .G..,......^.... 00:20:34.000 00000270 fc 0a 50 62 82 39 07 c2 92 31 24 05 10 2e 9f 88 ..Pb.9...1$..... 00:20:34.000 00000280 0a 60 0a bf 11 a5 c3 29 2a f3 69 2f 5a 94 89 91 .`.....)*.i/Z... 00:20:34.000 00000290 5a 76 ec 1d 7a 3c 3c d4 ad eb 29 87 80 48 df 43 Zv..z<<...)..H.C 00:20:34.000 000002a0 4c 0b e1 ce bf 48 7e 7f da f9 1a 49 81 b4 dc 8f L....H~....I.... 00:20:34.000 000002b0 1d 67 fb 37 b8 b0 74 a9 be 71 4d 85 0c 0e 40 ad .g.7..t..qM...@. 00:20:34.000 000002c0 53 c1 92 6a b2 cf c1 1e 5b 24 25 24 33 3a 27 ea S..j....[$%$3:'. 00:20:34.000 000002d0 d6 77 f6 bb 02 ca b7 80 19 a6 f7 72 4e 03 da 53 .w.........rN..S 00:20:34.000 000002e0 5a 9f 27 5e 03 85 2f db 48 50 fd f5 ec af 9d 16 Z.'^../.HP...... 
00:20:34.000 000002f0 85 ad 58 25 c4 6e c2 d0 c0 5f 24 7f 31 81 01 59 ..X%.n..._$.1..Y 00:20:34.000 00000300 fd 9a 1d a1 aa bf ce f6 4a b9 47 9f 8c 23 2b 22 ........J.G..#+" 00:20:34.000 00000310 71 17 a6 79 2f 7d a9 35 0f 03 e0 b2 5e 0f 68 b3 q..y/}.5....^.h. 00:20:34.000 00000320 cf d6 37 99 30 d7 ab 64 9c 6e 55 ff da a4 22 ee ..7.0..d.nU...". 00:20:34.000 00000330 5b f5 85 59 5f a5 8a 5e 8a ba bc da a6 8a 27 bb [..Y_..^......'. 00:20:34.000 00000340 c6 e2 bc 7d 5b dc 8d 5a 95 92 e0 43 fe 50 88 22 ...}[..Z...C.P." 00:20:34.000 00000350 95 88 47 95 14 64 c3 13 68 94 eb 71 22 20 2a 95 ..G..d..h..q" *. 00:20:34.000 00000360 a0 71 6a 13 d2 8d 22 15 00 02 fc f1 a7 5d 61 2f .qj..."......]a/ 00:20:34.000 00000370 c5 53 09 ff ce 6e ed e9 15 49 9f 11 0b 01 6c 08 .S...n...I....l. 00:20:34.000 00000380 7b dd 69 37 98 31 7d 06 c5 07 f7 26 23 c5 59 73 {.i7.1}....&#.Ys 00:20:34.000 00000390 ee c9 a7 b3 0e 15 f8 e1 bb 3d 21 11 ee 14 e1 b4 .........=!..... 00:20:34.000 000003a0 de 3d cc 56 b8 85 30 9b ff ce e9 14 97 62 a9 bd .=.V..0......b.. 00:20:34.000 000003b0 ac 20 47 dc 18 b5 26 53 b2 55 f1 41 2b 84 9b 8f . G...&S.U.A+... 00:20:34.000 000003c0 5c 89 05 15 f0 bd 29 a7 a6 40 bf 33 77 ae 2f 8d \.....)..@.3w./. 00:20:34.000 000003d0 55 d9 92 56 ef 75 98 70 4e a9 8c e8 86 6c 6e 96 U..V.u.pN....ln. 00:20:34.000 000003e0 c8 4e 3a 32 68 81 c5 02 27 7a 45 30 02 ab c1 96 .N:2h...'zE0.... 00:20:34.000 000003f0 d9 a6 61 77 e3 c4 8b 8a 1f 36 24 ea 63 8b a9 40 ..aw.....6$.c..@ 00:20:34.000 [2024-09-27 15:25:34.978534] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] key=key4, hash=3, dhgroup=5, seq=3428451844, tid=1, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=64 00:20:34.000 [2024-09-27 15:25:34.978627] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-reply 00:20:34.000 [2024-09-27 15:25:35.061077] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: await-success1 00:20:34.000 [2024-09-27 15:25:35.061106] nvme_auth.c:1179:nvme_fabric_qpair_authenticate_poll: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] authentication completed successfully 00:20:34.000 [2024-09-27 15:25:35.061113] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:1] auth state: done 00:20:34.000 [2024-09-27 15:25:35.219274] nvme_auth.c:1230:nvme_fabric_qpair_authenticate_async: *ERROR*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] missing DH-HMAC-CHAP key 00:20:34.000 [2024-09-27 15:25:35.219301] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x2000138f1680 00:20:34.000 [2024-09-27 15:25:35.237388] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:20:34.000 [2024-09-27 15:25:35.237411] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2024-02.io.spdk:cnode0] in failed state. 
00:20:34.000 [2024-09-27 15:25:35.237432] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2024-02.io.spdk:cnode0] Ctrlr is in error state 00:20:34.000 [2024-09-27 15:25:35.237443] nvme.c: 708:nvme_ctrlr_poll_internal: *ERROR*: Failed to initialize SSD: 10.0.0.2 00:20:34.000 [2024-09-27 15:25:35.237452] nvme.c: 884:nvme_dummy_attach_fail_cb: *ERROR*: Failed to attach nvme ctrlr: trtype=RDMA adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 subnqn=nqn.2024-02.io.spdk:cnode0, Operation not permitted 00:20:34.000 [2024-09-27 15:25:35.237462] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2024-02.io.spdk:cnode0] already in failed state 00:20:34.000 [2024-09-27 15:25:35.362123] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:34.000 [2024-09-27 15:25:35.362146] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:34.000 [2024-09-27 15:25:35.362154] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:34.000 [2024-09-27 15:25:35.362199] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:34.000 [2024-09-27 15:25:35.362229] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:34.000 ctrlr pubkey: 00:20:34.000 00000000 ac 1b e3 4e 64 37 e4 96 34 51 da 90 e3 5f 6e cf ...Nd7..4Q..._n. 00:20:34.000 00000010 ec a2 a3 3f 41 ca 6b 9e 2c af a0 d0 04 97 1f cc ...?A.k.,....... 00:20:34.000 00000020 84 69 24 c7 45 5a a8 c9 ee dc 7f c5 ee a8 b2 61 .i$.EZ.........a 00:20:34.000 00000030 f4 88 dd 1b c3 73 26 01 2a 43 bb 19 ce 4c 3e a0 .....s&.*C...L>. 00:20:34.000 00000040 86 3f f7 2a 02 94 4f af e2 e9 e4 28 5a 8a 40 ec .?.*..O....(Z.@. 00:20:34.000 00000050 79 fd ea f8 d2 87 35 3b 98 f9 48 1c 96 48 9e 25 y.....5;..H..H.% 00:20:34.000 00000060 76 36 8e 75 ec fc 0a e3 05 3b df 39 8b 0c 68 b0 v6.u.....;.9..h. 00:20:34.000 00000070 88 75 97 3f 25 a4 4f c5 e8 d4 7f c8 e0 a5 46 b2 .u.?%.O.......F. 00:20:34.000 00000080 b7 93 65 75 bd a2 5f 50 fa d3 c3 5e fe c6 f1 9f ..eu.._P...^.... 00:20:34.000 00000090 86 72 a9 41 cf f1 66 d3 09 09 e1 f8 1b 80 60 31 .r.A..f.......`1 00:20:34.000 000000a0 bd 4e 23 83 87 29 26 c6 e9 e8 99 ed c7 28 6f 29 .N#..)&......(o) 00:20:34.000 000000b0 66 c5 64 0c 5a 0c c5 5e f6 0b 55 86 f5 22 30 db f.d.Z..^..U.."0. 00:20:34.000 000000c0 e8 40 c2 7d cd 8c e9 65 d6 63 a5 61 39 97 e6 aa .@.}...e.c.a9... 00:20:34.000 000000d0 03 35 7c 97 46 46 6b 05 c5 46 93 d6 5b b6 3e 4d .5|.FFk..F..[.>M 00:20:34.000 000000e0 b8 5e 0a c2 27 0a 69 19 c9 2d b0 bc 40 4b 48 eb .^..'.i..-..@KH. 00:20:34.000 000000f0 bc bf 43 8c cc 14 39 c3 84 d2 7d f3 43 d4 0a 0f ..C...9...}.C... 00:20:34.000 host pubkey: 00:20:34.001 00000000 53 9b b9 ab 15 b9 d4 cd 5c e2 30 08 f3 c1 0a b6 S.......\.0..... 00:20:34.001 00000010 3a 64 e1 1f 6d 58 46 16 50 da 00 63 47 8a 7f cc :d..mXF.P..cG... 00:20:34.001 00000020 9f 9f 60 af 0c 4a 0f a8 37 d6 61 37 2f ae d4 7f ..`..J..7.a7/... 00:20:34.001 00000030 bf 05 12 b2 80 98 57 13 23 38 fd be 4d 46 5b f5 ......W.#8..MF[. 00:20:34.001 00000040 79 f1 75 7f ce 65 a0 f9 dd 2b b1 15 4a b8 57 8e y.u..e...+..J.W. 
00:20:34.001 00000050 15 42 2d 2f 37 69 29 3b cb 00 a7 ef 94 b2 78 40 .B-/7i);......x@ 00:20:34.001 00000060 29 28 e1 b0 44 a8 db 56 8f 28 e5 72 14 6f a7 71 )(..D..V.(.r.o.q 00:20:34.001 00000070 9b e1 17 1a 7c 11 e4 9a b9 a9 be 73 05 0e 76 25 ....|......s..v% 00:20:34.001 00000080 b7 b1 6e e5 72 25 dd 73 96 09 3c 60 10 52 1f 20 ..n.r%.s..<`.R. 00:20:34.001 00000090 fa 22 98 e2 e9 13 13 49 60 09 66 3a 7c 5e ff 3f .".....I`.f:|^.? 00:20:34.001 000000a0 d2 3c a3 92 8a 61 a9 3f 4b ec 19 87 25 cc f2 a1 .<...a.?K...%... 00:20:34.001 000000b0 22 7f 0d de 97 a1 98 1d ff 33 8f 88 21 c6 d9 18 "........3..!... 00:20:34.001 000000c0 4d d5 22 74 37 85 7b b0 91 1d 9c e6 94 a3 78 bb M."t7.{.......x. 00:20:34.001 000000d0 27 73 3a 6b cd 16 cb b2 a2 19 22 69 5a 78 c1 6b 's:k......"iZx.k 00:20:34.001 000000e0 32 b6 f7 ce aa 7e be c4 8e 19 cd d6 9f 9f ed 3a 2....~.........: 00:20:34.001 000000f0 36 9b 38 7b 4e bd aa 75 c7 1a ea f5 7e 56 43 8b 6.8{N..u....~VC. 00:20:34.001 dh secret: 00:20:34.001 00000000 7f 3e 50 a5 fa 9f 12 ca 3f 66 40 dc a8 01 41 97 .>P.....?f@...A. 00:20:34.001 00000010 54 ef a7 3f 20 79 d2 90 51 c8 67 62 19 5c 8a 7f T..? y..Q.gb.\.. 00:20:34.001 00000020 28 06 9c 4c 5d e3 bf 90 e4 37 4e 6c 70 02 49 d5 (..L]....7Nlp.I. 00:20:34.001 00000030 7e 0d 45 88 66 13 49 ab fc 67 2c 12 14 ac cf d1 ~.E.f.I..g,..... 00:20:34.001 00000040 60 47 1f 45 6b 63 b8 e4 8d 82 9a 36 b8 54 f4 c8 `G.Ekc.....6.T.. 00:20:34.001 00000050 d2 23 14 be 53 cd ba 67 df b8 20 cf 53 13 6c db .#..S..g.. .S.l. 00:20:34.001 00000060 10 22 a9 62 36 e6 ee 87 47 fb 6b 9e d9 8c b6 a8 .".b6...G.k..... 00:20:34.001 00000070 95 6f 95 1c 5c 62 20 09 d0 8e 37 6f fe 74 97 c5 .o..\b ...7o.t.. 00:20:34.001 00000080 71 31 45 d5 fe d9 b9 6e d2 0a 74 e9 62 c7 15 ad q1E....n..t.b... 00:20:34.001 00000090 ae f1 ce 32 db c9 75 65 fb d5 65 82 70 81 ad 72 ...2..ue..e.p..r 00:20:34.001 000000a0 02 68 f4 41 75 49 d5 e1 ba 03 6a 95 96 a5 e2 08 .h.AuI....j..... 00:20:34.001 000000b0 8b 55 0e 8d f0 b5 fd fb d2 c0 4e 90 d6 3e c5 57 .U........N..>.W 00:20:34.001 000000c0 75 59 c4 0a cf 43 55 89 35 41 62 1a ea 50 64 f9 uY...CU.5Ab..Pd. 00:20:34.001 000000d0 65 65 7e 85 64 43 3e 8a af 06 94 05 ff ef 1a 82 ee~.dC>......... 00:20:34.001 000000e0 75 bb 02 a6 89 03 aa 1e 16 11 c7 cf f5 35 93 47 u............5.G 00:20:34.001 000000f0 71 8a 29 e0 3e 53 46 65 72 c2 de 42 ff 76 c1 07 q.).>SFer..B.v.. 
00:20:34.001 [2024-09-27 15:25:35.364780] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key2, hash=1, dhgroup=1, seq=3428451845, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:34.001 [2024-09-27 15:25:35.381806] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:34.001 [2024-09-27 15:25:35.400411] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:34.001 [2024-09-27 15:25:35.400437] nvme_auth.c: 764:nvme_auth_check_message: *ERROR*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] received AUTH_failure1: rc=1, rce=1 (authentication failed) 00:20:34.001 [2024-09-27 15:25:35.400453] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:34.001 [2024-09-27 15:25:35.400461] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x2000138f1680 00:20:34.001 [2024-09-27 15:25:35.428999] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:20:34.001 [2024-09-27 15:25:35.429056] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2024-02.io.spdk:cnode0] in failed state. 00:20:34.001 [2024-09-27 15:25:35.429100] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2024-02.io.spdk:cnode0] Ctrlr is in error state 00:20:34.001 [2024-09-27 15:25:35.429131] nvme.c: 708:nvme_ctrlr_poll_internal: *ERROR*: Failed to initialize SSD: 10.0.0.2 00:20:34.001 [2024-09-27 15:25:35.429159] nvme.c: 884:nvme_dummy_attach_fail_cb: *ERROR*: Failed to attach nvme ctrlr: trtype=RDMA adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 subnqn=nqn.2024-02.io.spdk:cnode0, Operation not permitted 00:20:34.001 [2024-09-27 15:25:35.429188] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2024-02.io.spdk:cnode0] already in failed state 00:20:34.001 [2024-09-27 15:25:35.549639] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: negotiate 00:20:34.001 [2024-09-27 15:25:35.549658] nvme_auth.c: 796:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] digest: 1 (sha256) 00:20:34.001 [2024-09-27 15:25:35.549666] nvme_auth.c: 804:nvme_auth_send_negotiate: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] dhgroup: 1 (ffdhe2048) 00:20:34.001 [2024-09-27 15:25:35.549711] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-negotiate 00:20:34.001 [2024-09-27 15:25:35.549734] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-challenge 00:20:34.001 ctrlr pubkey: 00:20:34.001 00000000 60 0d bb 15 25 b9 91 8e ff 06 69 b5 39 5a 2a fa `...%.....i.9Z*. 00:20:34.001 00000010 37 c5 db 64 03 21 d5 e6 36 c2 47 58 4a 1c 6f e1 7..d.!..6.GXJ.o. 00:20:34.001 00000020 41 10 19 a5 bf ec b8 f4 0d 59 fa 32 57 45 3e 31 A........Y.2WE>1 00:20:34.001 00000030 27 f0 72 0d 28 5b 7b cb 02 7d e0 83 4a 59 23 d8 '.r.([{..}..JY#. 00:20:34.001 00000040 c4 eb 03 f9 aa 12 9e d0 e7 aa 30 cb 57 c0 98 d6 ..........0.W... 00:20:34.001 00000050 07 c4 4e b6 45 79 01 df 4d fb 57 e0 df f5 96 e2 ..N.Ey..M.W..... 
00:20:34.001 00000060 b7 84 0d 7d b4 18 2f 23 c5 fb d3 b9 4d 51 a5 42 ...}../#....MQ.B 00:20:34.001 00000070 f6 6a 30 d7 5f ed 11 f8 e7 d8 2a 6f e7 97 84 33 .j0._.....*o...3 00:20:34.001 00000080 f1 86 8b e6 b5 79 b6 59 71 eb 4b eb 3d 7c 1b cc .....y.Yq.K.=|.. 00:20:34.001 00000090 96 d9 4d 62 7e d8 f4 af 7e 9a 6d 50 64 c8 1a 12 ..Mb~...~.mPd... 00:20:34.001 000000a0 cf e8 36 c2 fb 11 2a 95 35 66 c7 9b e0 1d eb da ..6...*.5f...... 00:20:34.001 000000b0 9a 30 4f 0c b2 71 84 13 e7 61 1c be ce 8e 86 c6 .0O..q...a...... 00:20:34.001 000000c0 6e b3 69 bc 82 03 e2 4e 76 23 68 2a 86 78 e2 6b n.i....Nv#h*.x.k 00:20:34.001 000000d0 f6 30 cc b0 26 f8 4c e5 ce c2 5e 55 82 29 ea de .0..&.L...^U.).. 00:20:34.001 000000e0 42 e1 96 ed b9 aa 67 1d d2 65 4d 4e 01 1b c8 2c B.....g..eMN..., 00:20:34.001 000000f0 fc 00 53 21 c0 6d 8b 92 e0 36 9a f7 1d f9 61 25 ..S!.m...6....a% 00:20:34.001 host pubkey: 00:20:34.001 00000000 9a b9 33 6d b5 99 56 c5 36 d7 89 98 9b b2 99 fb ..3m..V.6....... 00:20:34.001 00000010 c2 03 b1 b7 48 ba 7a 4a b5 6b 94 af b7 09 50 1b ....H.zJ.k....P. 00:20:34.001 00000020 53 b8 d4 55 c1 c8 65 63 08 c5 fa ed e9 8c 72 f1 S..U..ec......r. 00:20:34.001 00000030 b4 5c 43 bf ec 61 e0 46 f7 2a 2c c9 c0 1e 6e 3b .\C..a.F.*,...n; 00:20:34.001 00000040 11 eb 55 7f 9f ef 39 7f 8a 43 7f 24 11 9a 8d 54 ..U...9..C.$...T 00:20:34.001 00000050 1b 8c 00 bf 93 ee b6 7f 50 38 0e 05 27 cc 59 58 ........P8..'.YX 00:20:34.001 00000060 6b 3b ea 86 61 c7 ad f6 d4 92 95 84 d7 e5 73 af k;..a.........s. 00:20:34.001 00000070 f4 b1 cb 5c 55 0a 7b fa 35 0d 00 b9 86 0c 35 fe ...\U.{.5.....5. 00:20:34.001 00000080 da c5 10 ab c2 21 49 61 2a 1d 72 70 ac 61 2c 6f .....!Ia*.rp.a,o 00:20:34.001 00000090 fc 67 ba b4 6d 6e 07 72 25 8f e5 9c dd a0 41 45 .g..mn.r%.....AE 00:20:34.001 000000a0 f8 f5 4c c3 c5 27 9a 55 db 1f 09 36 8b c6 46 d1 ..L..'.U...6..F. 00:20:34.001 000000b0 9f 18 77 6e d8 f5 b9 05 c3 07 13 39 83 84 9a a0 ..wn.......9.... 00:20:34.001 000000c0 c3 7f 07 c6 cc 3c 90 05 e4 e4 52 c3 cc dd 9d 75 .....<....R....u 00:20:34.001 000000d0 c7 bf e7 31 14 f7 9a a8 e9 cb 6a 37 2b 24 ce 8e ...1......j7+$.. 00:20:34.001 000000e0 12 bf 7d 0f 3b e5 2d e4 00 a7 75 ee ef 92 2f 9c ..}.;.-...u.../. 00:20:34.001 000000f0 55 ed a2 bf 4f e9 54 69 9f ac 08 a1 13 c2 43 30 U...O.Ti......C0 00:20:34.001 dh secret: 00:20:34.001 00000000 c3 24 e2 c1 36 59 f2 50 2e 18 9b d0 1c ad 1f 7a .$..6Y.P.......z 00:20:34.001 00000010 38 97 e0 cc 70 5a 5e 18 1a bb 02 06 a4 8b 64 3e 8...pZ^.......d> 00:20:34.001 00000020 a4 dd 08 27 2a a1 52 4c cd 9b 72 13 3b e6 ab ec ...'*.RL..r.;... 00:20:34.001 00000030 b8 3a 30 2f b2 27 ac 72 66 29 53 95 68 7e 7d f1 .:0/.'.rf)S.h~}. 00:20:34.001 00000040 22 4e fa 3a 71 bb 8a 4b f8 46 8e a5 4e e0 2e 9e "N.:q..K.F..N... 00:20:34.001 00000050 b8 f9 dc b2 71 7c 2f eb e9 08 7b 1b 53 03 99 f6 ....q|/...{.S... 00:20:34.001 00000060 f8 e0 1f 5e f3 7a a3 79 b2 62 ff 06 26 d8 0e a5 ...^.z.y.b..&... 00:20:34.001 00000070 ad 76 fc 03 d5 58 6b 9d 3a a7 0a 58 19 63 eb c5 .v...Xk.:..X.c.. 00:20:34.001 00000080 bc 87 87 46 03 89 d6 15 60 b1 ed 8a 42 e9 51 42 ...F....`...B.QB 00:20:34.001 00000090 4c 19 61 1e 4b 3a 89 0e 65 f4 98 20 3f 4a 3a 86 L.a.K:..e.. ?J:. 00:20:34.001 000000a0 f3 e4 98 6c 26 8d de 57 3a aa f1 34 39 9f 30 c4 ...l&..W:..49.0. 
00:20:34.001 000000b0 99 a9 59 0b d2 ce a6 0a bf 64 00 1c f8 6a 47 54 ..Y......d...jGT 00:20:34.001 000000c0 b8 ee a0 6a a8 1b 1f c6 9a 8a 84 62 1a 95 c6 60 ...j.......b...` 00:20:34.001 000000d0 bc 6a 2f e8 bc 47 e5 10 e8 14 d9 1a b3 26 d5 71 .j/..G.......&.q 00:20:34.001 000000e0 74 d0 ee ee bb 57 ea c3 44 b2 b4 fe ea 2c e6 72 t....W..D....,.r 00:20:34.001 000000f0 de 53 8f b9 a7 c8 dc 60 59 cb d4 14 20 fc 64 a7 .S.....`Y... .d. 00:20:34.001 [2024-09-27 15:25:35.552476] nvme_auth.c: 950:nvme_auth_send_reply: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] key=key1, hash=1, dhgroup=1, seq=3428451846, tid=0, subnqn=nqn.2024-02.io.spdk:cnode0, hostnqn=nqn.2024-02.io.spdk:host0, len=32 00:20:34.001 [2024-09-27 15:25:35.555199] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-reply 00:20:34.001 [2024-09-27 15:25:35.555244] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-success1 00:20:34.001 [2024-09-27 15:25:35.555261] nvme_auth.c:1053:nvme_auth_check_success1: *ERROR*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] controller challenge mismatch 00:20:34.001 received: 00:20:34.001 00000000 66 93 6f 3e b0 cb 84 bf 37 0d 56 9e db dc 44 9f f.o>....7.V...D. 00:20:34.001 00000010 6b ec 54 0d 26 66 59 1e 72 95 d3 77 87 3c 24 d4 k.T.&fY.r..w.<$. 00:20:34.001 expected: 00:20:34.001 00000000 00 ce 7f e8 77 08 c4 31 94 8b 36 ed f1 94 b3 ed ....w..1..6..... 00:20:34.001 00000010 fa c6 56 20 cc 80 4d b5 43 87 7b 5a 48 d1 c9 cc ..V ..M.C.{ZH... 00:20:34.001 [2024-09-27 15:25:35.569064] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: await-failure2 00:20:34.001 [2024-09-27 15:25:35.569106] nvme_auth.c: 163:nvme_auth_set_state: *DEBUG*: [nqn.2024-02.io.spdk:cnode0:nqn.2024-02.io.spdk:host0:0] auth state: done 00:20:34.001 [2024-09-27 15:25:35.569118] nvme_rdma.c:2696:nvme_rdma_qpair_process_completions: *ERROR*: Failed to connect rqpair=0x2000138f1680 00:20:34.001 [2024-09-27 15:25:35.585510] nvme_qpair.c: 804:spdk_nvme_qpair_process_completions: *ERROR*: CQ transport error -6 (No such device or address) on qpair id 0 00:20:34.002 [2024-09-27 15:25:35.585530] nvme_ctrlr.c:1106:nvme_ctrlr_fail: *ERROR*: [nqn.2024-02.io.spdk:cnode0] in failed state. 
00:20:34.002 [2024-09-27 15:25:35.585545] nvme_ctrlr.c:4193:nvme_ctrlr_process_init: *ERROR*: [nqn.2024-02.io.spdk:cnode0] Ctrlr is in error state 00:20:34.002 [2024-09-27 15:25:35.585570] nvme.c: 708:nvme_ctrlr_poll_internal: *ERROR*: Failed to initialize SSD: 10.0.0.2 00:20:34.002 [2024-09-27 15:25:35.585578] nvme.c: 884:nvme_dummy_attach_fail_cb: *ERROR*: Failed to attach nvme ctrlr: trtype=RDMA adrfam=IPv4 traddr=10.0.0.2 trsvcid=4420 subnqn=nqn.2024-02.io.spdk:cnode0, Operation not permitted 00:20:34.002 [2024-09-27 15:25:35.585588] nvme_ctrlr.c:1094:nvme_ctrlr_fail: *NOTICE*: [nqn.2024-02.io.spdk:cnode0] already in failed state 00:20:34.002 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1 -- # cleanup 00:20:34.306 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@24 -- # nvmftestfini 00:20:34.306 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@331 -- # nvmfcleanup 00:20:34.306 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@99 -- # sync 00:20:34.306 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@101 -- # '[' rdma == tcp ']' 00:20:34.306 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@101 -- # '[' rdma == rdma ']' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@102 -- # set +e 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@103 -- # for i in {1..20} 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@104 -- # modprobe -v -r nvme-rdma 00:20:34.307 rmmod nvme_rdma 00:20:34.307 rmmod nvme_fabrics 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@105 -- # modprobe -v -r nvme-fabrics 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@106 -- # set -e 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@107 -- # return 0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@332 -- # '[' -n 1857835 ']' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@333 -- # killprocess 1857835 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@950 -- # '[' -z 1857835 ']' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@954 -- # kill -0 1857835 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@955 -- # uname 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1857835 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1857835' 00:20:34.307 killing process with pid 1857835 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@969 -- # kill 1857835 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@974 -- # wait 1857835 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@335 -- # '[' '' == iso ']' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@338 -- # nvmf_fini 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host 
-- nvmf/setup.sh@264 -- # local dev 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@267 -- # remove_target_ns 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@323 -- # xtrace_disable_per_cmd _remove_target_ns 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@22 -- # eval '_remove_target_ns 15> /dev/null' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@22 -- # _remove_target_ns 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@268 -- # delete_main_bridge 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@130 -- # [[ -e /sys/class/net/nvmf_br/address ]] 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@130 -- # return 0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_1/address ]] 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@279 -- # flush_ip mlx_0_1 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@221 -- # local dev=mlx_0_1 in_ns= 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_1' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_1 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@270 -- # for dev in "${dev_map[@]}" 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@271 -- # [[ -e /sys/class/net/mlx_0_0/address ]] 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@275 -- # (( 4 == 3 )) 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@279 -- # flush_ip mlx_0_0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@221 -- # local dev=mlx_0_0 in_ns= 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@222 -- # [[ -n '' ]] 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@224 -- # eval ' ip addr flush dev mlx_0_0' 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@224 -- # ip addr flush dev mlx_0_0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@283 -- # reset_setup_interfaces 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@41 -- # _dev=0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@41 -- # dev_map=() 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/setup.sh@284 -- # iptr 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@538 -- # iptables-save 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@538 -- # grep -v SPDK_NVMF 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@538 -- # iptables-restore 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@25 -- # rm /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0/allowed_hosts/nqn.2024-02.io.spdk:host0 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@26 -- # rmdir /sys/kernel/config/nvmet/hosts/nqn.2024-02.io.spdk:host0 
00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@27 -- # clean_kernel_target 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@482 -- # [[ -e /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0 ]] 00:20:34.307 15:25:35 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@484 -- # echo 0 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@486 -- # rm -f /sys/kernel/config/nvmet/ports/1/subsystems/nqn.2024-02.io.spdk:cnode0 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@487 -- # rmdir /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0/namespaces/1 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@488 -- # rmdir /sys/kernel/config/nvmet/ports/1 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@489 -- # rmdir /sys/kernel/config/nvmet/subsystems/nqn.2024-02.io.spdk:cnode0 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@491 -- # modules=(/sys/module/nvmet/holders/*) 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@493 -- # modprobe -r nvmet_rdma nvmet 00:20:34.307 15:25:36 nvmf_rdma.nvmf_host.nvmf_auth_host -- nvmf/common.sh@496 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh 00:20:37.621 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:5e:00.0 (144d a80a): nvme -> vfio-pci 00:20:37.621 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:20:37.621 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:20:37.881 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:20:37.881 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:20:37.881 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:20:37.881 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:20:37.881 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:20:37.881 15:25:39 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@28 -- # rm -f /tmp/spdk.key-null.MgL /tmp/spdk.key-null.w0T /tmp/spdk.key-sha256.urL /tmp/spdk.key-sha384.GPR /tmp/spdk.key-sha512.GPy /var/jenkins/workspace/nvmf-phy-autotest/spdk/../output/nvme-auth.log 00:20:37.881 15:25:39 nvmf_rdma.nvmf_host.nvmf_auth_host -- host/auth.sh@31 -- # /var/jenkins/workspace/nvmf-phy-autotest/spdk/scripts/setup.sh 00:20:41.170 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:5e:00.0 (144d a80a): Already using the vfio-pci driver 00:20:41.170 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.6 (8086 2021): Already using the vfio-pci 
driver 00:20:41.170 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:20:41.170 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1125 -- # trap - ERR 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1125 -- # print_backtrace 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1153 -- # [[ ehxBET =~ e ]] 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1155 -- # args=('--transport=rdma' '/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh' 'nvmf_auth_host' '--transport=rdma') 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1155 -- # local args 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1157 -- # xtrace_disable 00:20:41.170 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@10 -- # set +x 00:20:41.170 ========== Backtrace start: ========== 00:20:41.170 00:20:41.430 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh:1125 -> run_test(["nvmf_auth_host"],["/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/host/auth.sh"],["--transport=rdma"]) 00:20:41.430 ... 00:20:41.430 1120 timing_enter $test_name 00:20:41.430 1121 echo "************************************" 00:20:41.430 1122 echo "START TEST $test_name" 00:20:41.430 1123 echo "************************************" 00:20:41.430 1124 xtrace_restore 00:20:41.430 1125 time "$@" 00:20:41.430 1126 xtrace_disable 00:20:41.430 1127 echo "************************************" 00:20:41.430 1128 echo "END TEST $test_name" 00:20:41.430 1129 echo "************************************" 00:20:41.430 1130 timing_exit $test_name 00:20:41.430 ... 00:20:41.430 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_host.sh:27 -> main(["--transport=rdma"]) 00:20:41.430 ... 00:20:41.430 22 run_test "nvmf_fio_host" $rootdir/test/nvmf/host/fio.sh "${TEST_ARGS[@]}" 00:20:41.430 23 run_test "nvmf_failover" $rootdir/test/nvmf/host/failover.sh "${TEST_ARGS[@]}" 00:20:41.430 24 run_test "nvmf_host_multipath_status" $rootdir/test/nvmf/host/multipath_status.sh "${TEST_ARGS[@]}" 00:20:41.430 25 run_test "nvmf_discovery_remove_ifc" $rootdir/test/nvmf/host/discovery_remove_ifc.sh "${TEST_ARGS[@]}" 00:20:41.430 26 run_test "nvmf_identify_kernel_target" "$rootdir/test/nvmf/host/identify_kernel_nvmf.sh" "${TEST_ARGS[@]}" 00:20:41.430 => 27 run_test "nvmf_auth_host" "$rootdir/test/nvmf/host/auth.sh" "${TEST_ARGS[@]}" 00:20:41.430 28 run_test "nvmf_bdevperf" "$rootdir/test/nvmf/host/bdevperf.sh" "${TEST_ARGS[@]}" 00:20:41.430 29 run_test "nvmf_target_disconnect" "$rootdir/test/nvmf/host/target_disconnect.sh" "${TEST_ARGS[@]}" 00:20:41.430 30 00:20:41.430 31 if [[ "$SPDK_TEST_NVMF_TRANSPORT" == "tcp" ]]; then 00:20:41.430 32 run_test "nvmf_digest" "$rootdir/test/nvmf/host/digest.sh" "${TEST_ARGS[@]}" 00:20:41.430 ... 
00:20:41.430 00:20:41.430 ========== Backtrace end ========== 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1194 -- # return 0 00:20:41.430 00:20:41.430 real 0m54.009s 00:20:41.430 user 0m44.150s 00:20:41.430 sys 0m15.793s 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host.nvmf_auth_host -- common/autotest_common.sh@1 -- # exit 1 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1125 -- # trap - ERR 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1125 -- # print_backtrace 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1153 -- # [[ ehxBET =~ e ]] 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1155 -- # args=('--transport=rdma' '/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_host.sh' 'nvmf_host' '--transport=rdma') 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1155 -- # local args 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1157 -- # xtrace_disable 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@10 -- # set +x 00:20:41.430 ========== Backtrace start: ========== 00:20:41.430 00:20:41.430 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh:1125 -> run_test(["nvmf_host"],["/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf_host.sh"],["--transport=rdma"]) 00:20:41.430 ... 00:20:41.430 1120 timing_enter $test_name 00:20:41.430 1121 echo "************************************" 00:20:41.430 1122 echo "START TEST $test_name" 00:20:41.430 1123 echo "************************************" 00:20:41.430 1124 xtrace_restore 00:20:41.430 1125 time "$@" 00:20:41.430 1126 xtrace_disable 00:20:41.430 1127 echo "************************************" 00:20:41.430 1128 echo "END TEST $test_name" 00:20:41.430 1129 echo "************************************" 00:20:41.430 1130 timing_exit $test_name 00:20:41.430 ... 00:20:41.430 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf.sh:12 -> main(["--transport=rdma"]) 00:20:41.430 ... 00:20:41.430 7 rootdir=$(readlink -f $testdir/../..) 00:20:41.430 8 source $rootdir/test/common/autotest_common.sh 00:20:41.430 9 00:20:41.430 10 run_test "nvmf_target_core" $rootdir/test/nvmf/nvmf_target_core.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.430 11 run_test "nvmf_target_extra" $rootdir/test/nvmf/nvmf_target_extra.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.430 => 12 run_test "nvmf_host" $rootdir/test/nvmf/nvmf_host.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.430 13 00:20:41.430 14 # Interrupt mode for now is supported only on the target, with the TCP transport and posix or ssl socket implementations. 00:20:41.430 15 if [[ "$SPDK_TEST_NVMF_TRANSPORT" = "tcp" && $SPDK_TEST_URING -eq 0 ]]; then 00:20:41.430 16 run_test "nvmf_target_core_interrupt_mode" $rootdir/test/nvmf/nvmf_target_core.sh --transport=$SPDK_TEST_NVMF_TRANSPORT --interrupt-mode 00:20:41.430 17 run_test "nvmf_interrupt" $rootdir/test/nvmf/target/interrupt.sh --transport=$SPDK_TEST_NVMF_TRANSPORT --interrupt-mode 00:20:41.430 ... 
00:20:41.430 00:20:41.430 ========== Backtrace end ========== 00:20:41.430 15:25:43 nvmf_rdma.nvmf_host -- common/autotest_common.sh@1194 -- # return 0 00:20:41.430 00:20:41.430 real 4m8.021s 00:20:41.431 user 8m6.906s 00:20:41.431 sys 1m22.628s 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1125 -- # trap - ERR 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1125 -- # print_backtrace 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1153 -- # [[ ehxBET =~ e ]] 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1155 -- # args=('--transport=rdma' '/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf.sh' 'nvmf_rdma' '/var/jenkins/workspace/nvmf-phy-autotest/autorun-spdk.conf') 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1155 -- # local args 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1157 -- # xtrace_disable 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@10 -- # set +x 00:20:41.431 ========== Backtrace start: ========== 00:20:41.431 00:20:41.431 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/test/common/autotest_common.sh:1125 -> run_test(["nvmf_rdma"],["/var/jenkins/workspace/nvmf-phy-autotest/spdk/test/nvmf/nvmf.sh"],["--transport=rdma"]) 00:20:41.431 ... 00:20:41.431 1120 timing_enter $test_name 00:20:41.431 1121 echo "************************************" 00:20:41.431 1122 echo "START TEST $test_name" 00:20:41.431 1123 echo "************************************" 00:20:41.431 1124 xtrace_restore 00:20:41.431 1125 time "$@" 00:20:41.431 1126 xtrace_disable 00:20:41.431 1127 echo "************************************" 00:20:41.431 1128 echo "END TEST $test_name" 00:20:41.431 1129 echo "************************************" 00:20:41.431 1130 timing_exit $test_name 00:20:41.431 ... 00:20:41.431 in /var/jenkins/workspace/nvmf-phy-autotest/spdk/autotest.sh:277 -> main(["/var/jenkins/workspace/nvmf-phy-autotest/autorun-spdk.conf"]) 00:20:41.431 ... 00:20:41.431 272 if [ $SPDK_TEST_NVMF -eq 1 ]; then 00:20:41.431 273 export NET_TYPE 00:20:41.431 274 # The NVMe-oF run test cases are split out like this so that the parser that compiles the 00:20:41.431 275 # list of all tests can properly differentiate them. Please do not merge them into one line. 00:20:41.431 276 if [ "$SPDK_TEST_NVMF_TRANSPORT" = "rdma" ]; then 00:20:41.431 => 277 run_test "nvmf_rdma" $rootdir/test/nvmf/nvmf.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.431 278 run_test "spdkcli_nvmf_rdma" $rootdir/test/spdkcli/nvmf.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.431 279 elif [ "$SPDK_TEST_NVMF_TRANSPORT" = "tcp" ]; then 00:20:41.431 280 run_test "nvmf_tcp" $rootdir/test/nvmf/nvmf.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.431 281 if [[ $SPDK_TEST_URING -eq 0 ]]; then 00:20:41.431 282 run_test "spdkcli_nvmf_tcp" $rootdir/test/spdkcli/nvmf.sh --transport=$SPDK_TEST_NVMF_TRANSPORT 00:20:41.431 ... 
00:20:41.431 00:20:41.431 ========== Backtrace end ========== 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1194 -- # return 0 00:20:41.431 00:20:41.431 real 15m58.537s 00:20:41.431 user 37m10.061s 00:20:41.431 sys 5m6.229s 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1 -- # autotest_cleanup 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1392 -- # local autotest_es=1 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@1393 -- # xtrace_disable 00:20:41.431 15:25:43 nvmf_rdma -- common/autotest_common.sh@10 -- # set +x 00:20:56.322 INFO: APP EXITING 00:20:56.322 INFO: killing all VMs 00:20:56.323 INFO: killing vhost app 00:20:56.323 WARN: no vhost pid file found 00:20:56.323 INFO: EXIT DONE 00:20:59.617 Waiting for block devices as requested 00:20:59.617 0000:5e:00.0 (144d a80a): vfio-pci -> nvme 00:20:59.617 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:20:59.617 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:20:59.876 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:20:59.876 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:20:59.876 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:21:00.135 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:21:00.135 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:21:00.135 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:21:00.394 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:21:00.394 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:21:00.394 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:21:00.654 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:21:00.654 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:21:00.654 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:21:00.914 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:21:00.914 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:21:05.112 Cleaning 00:21:05.112 Removing: /var/run/dpdk/spdk0/config 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:21:05.112 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:21:05.112 Removing: /var/run/dpdk/spdk0/hugepage_info 00:21:05.112 Removing: /var/run/dpdk/spdk1/config 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-0-0 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-0-1 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-0-2 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-0-3 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-1-0 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-1-1 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-1-2 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memseg-2048k-1-3 00:21:05.112 Removing: /var/run/dpdk/spdk1/fbarray_memzone 00:21:05.112 Removing: /var/run/dpdk/spdk1/hugepage_info 00:21:05.112 Removing: /var/run/dpdk/spdk1/mp_socket 00:21:05.112 Removing: /var/run/dpdk/spdk2/config 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-0-0 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-0-1 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-0-2 
00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-0-3 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-1-0 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-1-1 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-1-2 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memseg-2048k-1-3 00:21:05.112 Removing: /var/run/dpdk/spdk2/fbarray_memzone 00:21:05.112 Removing: /var/run/dpdk/spdk2/hugepage_info 00:21:05.112 Removing: /var/run/dpdk/spdk3/config 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-0-0 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-0-1 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-0-2 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-0-3 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-1-0 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-1-1 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-1-2 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memseg-2048k-1-3 00:21:05.112 Removing: /var/run/dpdk/spdk3/fbarray_memzone 00:21:05.112 Removing: /var/run/dpdk/spdk3/hugepage_info 00:21:05.112 Removing: /var/run/dpdk/spdk4/config 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-0-0 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-0-1 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-0-2 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-0-3 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-1-0 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-1-1 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-1-2 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memseg-2048k-1-3 00:21:05.112 Removing: /var/run/dpdk/spdk4/fbarray_memzone 00:21:05.113 Removing: /var/run/dpdk/spdk4/hugepage_info 00:21:05.113 Removing: /dev/shm/bdevperf_trace.pid1677428 00:21:05.113 Removing: /dev/shm/bdev_svc_trace.1 00:21:05.113 Removing: /dev/shm/nvmf_trace.0 00:21:05.113 Removing: /dev/shm/spdk_tgt_trace.pid1640248 00:21:05.113 Removing: /var/run/dpdk/spdk0 00:21:05.113 Removing: /var/run/dpdk/spdk1 00:21:05.113 Removing: /var/run/dpdk/spdk2 00:21:05.113 Removing: /var/run/dpdk/spdk3 00:21:05.113 Removing: /var/run/dpdk/spdk4 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1639735 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1640248 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1640804 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1641558 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1641751 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1642536 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1642721 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1643029 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1646740 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1647238 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1647490 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1647904 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1648163 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1648420 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1648630 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1648831 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1649096 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1649899 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1652970 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1653188 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1653569 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1653625 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1654159 00:21:05.113 Removing: 
/var/run/dpdk/spdk_pid1654293 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1654757 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1654927 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1655153 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1655334 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1655477 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1655565 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1656037 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1656235 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1656496 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1660137 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1663899 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1672489 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1673210 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1677428 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1677665 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1681355 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1686514 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1689229 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1698226 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1713504 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1716875 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1754402 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1758827 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1763814 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1771309 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1805028 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1805884 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1806870 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1807874 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1812120 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1818574 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1818589 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1822106 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1822609 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1823011 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1823558 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1823614 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1827795 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1828244 00:21:05.113 Removing: /var/run/dpdk/spdk_pid1832044 00:21:05.372 Removing: /var/run/dpdk/spdk_pid1834284 00:21:05.372 Removing: /var/run/dpdk/spdk_pid1839128 00:21:05.372 Removing: /var/run/dpdk/spdk_pid1847135 00:21:05.372 Removing: /var/run/dpdk/spdk_pid1853636 00:21:05.372 Removing: /var/run/dpdk/spdk_pid1853676 00:21:05.372 Clean 00:23:26.914 15:28:16 nvmf_rdma -- common/autotest_common.sh@1451 -- # return 1 00:23:26.914 15:28:16 nvmf_rdma -- common/autotest_common.sh@1 -- # : 00:23:26.914 15:28:16 nvmf_rdma -- common/autotest_common.sh@1 -- # exit 1 00:23:26.927 [Pipeline] } 00:23:26.946 [Pipeline] // stage 00:23:26.954 [Pipeline] } 00:23:26.974 [Pipeline] // timeout 00:23:26.981 [Pipeline] } 00:23:26.985 ERROR: script returned exit code 1 00:23:26.986 Setting overall build result to FAILURE 00:23:27.001 [Pipeline] // catchError 00:23:27.006 [Pipeline] } 00:23:27.022 [Pipeline] // wrap 00:23:27.028 [Pipeline] } 00:23:27.042 [Pipeline] // catchError 00:23:27.051 [Pipeline] stage 00:23:27.054 [Pipeline] { (Epilogue) 00:23:27.067 [Pipeline] catchError 00:23:27.069 [Pipeline] { 00:23:27.082 [Pipeline] echo 00:23:27.083 Cleanup processes 00:23:27.089 [Pipeline] sh 00:23:27.376 + sudo pgrep -af /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:23:27.376 1891891 sudo pgrep -af /var/jenkins/workspace/nvmf-phy-autotest/spdk 00:23:27.389 [Pipeline] sh 00:23:27.674 ++ sudo pgrep -af /var/jenkins/workspace/nvmf-phy-autotest/spdk 
00:23:27.674 ++ grep -v 'sudo pgrep' 00:23:27.674 ++ awk '{print $1}' 00:23:27.674 + sudo kill -9 00:23:27.674 + true 00:23:27.686 [Pipeline] sh 00:23:27.971 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:23:32.296 [Pipeline] sh 00:23:32.579 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:23:32.579 Artifacts sizes are good 00:23:32.592 [Pipeline] archiveArtifacts 00:23:32.599 Archiving artifacts 00:23:32.760 [Pipeline] sh 00:23:33.046 + sudo chown -R sys_sgci: /var/jenkins/workspace/nvmf-phy-autotest 00:23:33.062 [Pipeline] cleanWs 00:23:33.072 [WS-CLEANUP] Deleting project workspace... 00:23:33.072 [WS-CLEANUP] Deferred wipeout is used... 00:23:33.080 [WS-CLEANUP] done 00:23:33.082 [Pipeline] } 00:23:33.099 [Pipeline] // catchError 00:23:33.110 [Pipeline] echo 00:23:33.112 Tests finished with errors. Please check the logs for more info. 00:23:33.116 [Pipeline] echo 00:23:33.118 Execution node will be rebooted. 00:23:33.134 [Pipeline] build 00:23:33.137 Scheduling project: reset-job 00:23:33.150 [Pipeline] sh 00:23:33.436 + logger -p user.info -t JENKINS-CI 00:23:33.446 [Pipeline] } 00:23:33.459 [Pipeline] // stage 00:23:33.465 [Pipeline] } 00:23:33.479 [Pipeline] // node 00:23:33.485 [Pipeline] End of Pipeline 00:23:33.528 Finished: FAILURE
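Post-mortem note on the nested backtraces and timing summaries printed earlier: each level (nvmf_auth_host, nvmf_host, nvmf_rdma) runs through the run_test wrapper whose body is quoted in the backtrace excerpts (test/common/autotest_common.sh, lines 1120-1130). The condensed sketch below is assembled from those quoted lines for orientation only; the argument handling at the top is an assumption, since only lines 1120-1130 appear in the log, and the authoritative definition is in the SPDK repository.

  run_test() {
      # Argument handling assumed; only the body below is visible in the backtraces.
      local test_name=$1; shift
      timing_enter $test_name
      echo "************************************"
      echo "START TEST $test_name"
      echo "************************************"
      xtrace_restore
      time "$@"        # on failure, the ERR trap prints the backtrace seen in the log
      xtrace_disable
      echo "************************************"
      echo "END TEST $test_name"
      echo "************************************"
      timing_exit $test_name
  }

The "real/user/sys" figures emitted after each "return 0" above are the output of this time "$@" call at the corresponding nesting level, which is why the totals grow from the single auth test up to the whole nvmf_rdma suite before autotest_cleanup runs and the build is marked FAILURE.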